Comment author: Vaniver 26 November 2014 03:14:17PM 4 points [-]

Welcome!

I like the moderated format and technical leaning of this site, though to be honest my reading over the last few days indicates the discussions are more like a debate room than a crowd-sourced problem-solving machine. I'm not saying that is bad, but I can't help but wonder where the "action verbs" will come into the game.

Function follows form; a forum website mostly leads to forum-style discussions. But other things are going on in formats that are more conducive to action verbs, like physical meetups or workshops run by CFAR. (And the changes mostly happen in the lives of people reading the site, as you can see in the bragging threads or rationality diaries.)

Comment author: SystemsGuy 27 November 2014 03:23:27AM 1 point [-]

Thank you for the welcome!

I will review CFAR, as at a glance it has some significant clients and at least some successes.

There are no meetups near me, it seems.

I appreciate the feedback.

Comment author: ChristianKl 26 November 2014 04:29:03PM 2 points [-]

I like the moderated format and technical leaning of this site, though to be honest my reading over the last few days indicates the discussions are more like a debate room than a crowd-sourced problem-solving machine.

If you have a specific problem that you want to get solved that you think fits the website, feel free to open a thread in discussion.

But I don't think there's no problem-solving happening. On the front page at the moment there are:

1) Request for suggestions: ageing and data-mining (The thread is about choosing how the OP focuses his scientific research, which is a practical problem)

2) Breaking the vicious cycle (Solving a community problem that is important for some members)

3) The Centre for Effective Altruism is hiring to fill five roles in research, operations and outreach (Recruitment is a clear practical problem)

4) I just increased my Altruistic Effectiveness and you should too (Shares a practical technique about increasing the size of donations)

5) Shop for Charity: how to earn proven charities 5% of your Amazon spending in commission (Practical technique for increasing money going to charity)

6) Memory Improvement: Mnemonics, Tools, or Books on the Topic? (Sharing of practical techniques)

7) I Want To Believe: Rational Edition (Sharing of practical techniques)

8) Financial Effectiveness Repository (Sharing of practical techniques)

9) How to build the skill and the habit of experimentation? (Sharing of practical techniques)

I don't consider that a bad output.

Comment author: SystemsGuy 27 November 2014 03:18:25AM 0 points [-]

Thank you for the welcome!

This site is unusually populated with internal links -- that must take some discipline for the posters (and either good search tools or good memories, or both!).

I will review your links, and I much appreciate your sharing.

Comment author: SystemsGuy 26 November 2014 02:50:45AM 2 points [-]

Hi all. I'm a seasoned engineer, BSEE plus MS in Systems Engineering, with a couple of decades in electronics systems architecture, team management, and now organization management. I'm a big picture guy who can still somewhat do the math, but not really much anymore (ahhh, back in the day.......). Myers-Briggs says I'm an INTJ.

I've had some classes and additional practical experience in decision theory, statistics, communications theory, motivation, common biases and fallacies, utility, and such basics. I am beset with an interest in almost everything technical (I'm a T-shaped person, with the depth in electronics systems and the breadth in general engineering and technical topics), but heavily skewed to applied technology, not research. The observable world to me seems to be horridly sub-optimized, largely due to human short-sightedness and an apparent inability to plan ahead or to see the bigger picture of one's actions. I much like games and what-ifs. Favorite quotes include Einstein's "you can't solve problems with the same level of thinking that created them", an unattributed "people are not rational creatures, but rationalizing ones", and one I use to limit analysis-paralysis: "I can afford to be wrong, but not indecisive".

I am individualistic and introverted by nature, but I've become more socially conscious and communicative as I've progressed in my career and in life with a wife and kids. I'm here because I'd like the world to be a more rational place, especially for my children, but honestly my expectations for success are low. I like the moderated format and technical leaning of this site, though to be honest my reading over the last few days indicates the discussions are more like a debate room than a crowd-sourced problem-solving machine. I'm not saying that is bad, but I can't help but wonder where the "action verbs" will come into the game.

Comment author: fhe 10 September 2010 11:08:37AM *  30 points [-]

I can think of at least 3 ways that people fail to make strategic, effective decisions.

  1. (as the above post pointed out) It's difficult to analyze options (or even to come up with some of them), for any number of reasons: too many of them (and too little time), lack of information, unforeseeable secondary consequences, etc. One can do one's best in the most rational fashion and still come out with a wrong choice. That's unfortunate, but if this is the only kind of mistake I am making, I am not too worried. It's a matter of learning better heuristics, building better models, gathering more data... or, in the limit, admitting that there's a limit to how far human intelligence and finite time/resources can go, even when correctly applied to problems.

  2. A second, more worrisome, mistake is not even realizing that one can step out of one's immediate reactions, stop whatever one is doing, and think about the rationality of it and about alternatives. This mistake differs from (1). As a hypothetical example, suppose the wannabe comedian generated a list of things he could do and decided to watch the Garfield cartoon. His choice might be wrong, but it's a conscious, deliberate choice that he made. That is a mistake of type (1).

Suppose, however, that the Garfield idea was the first thing that came to his mind, and after 3 months he was still at it, never stopping to question his own logic. That is a mistake of type (2).

Type (2) is more worrisome, because there doesn't seem to be a reliable way that, left alone, one can break out of it. Douglas Hofstadter (of GEB fame) invented the word "sphexishness", which I think describes this vividly. It's a wonderful label, and I use it to catch myself in the act. Hofstadter coined the word from the sphex's (digger wasp's) inability to break out of its fixed egg-laying routine when disturbed by a human. He gave a spectrum of sphexish behaviors, from a stuck music record to teenagers addicted to video games to mathematicians applying the same trick to new discoveries. (Hofstadter, Metamagical Themas, "On the seeming paradox of mechanizing creativity".)

A lot of the 'unstrategic' decisions people make smell of sphexishness. ("Decision" here is a misnomer, as it's the lack of a conscious decision that leads them to take ineffective actions.)

How do you correct mistakes of this type? It requires self-awareness: some kind of interrupt to break one out of a loop, and the ability to spot patterns in unexpected places. Ways to help yourself: hang out with intelligent, observant people (who would do you the favor of pointing it out for you; return the favor when you see others trapped in such behavior). Try to develop a mental habit of self-watching.

  3. There is yet a third way that people don't do what's best for them: unlike in (1) and (2), they know what they should do but just can't bring themselves to do it. Take the aspiring comedian example again. Does he really think watching Garfield is the best thing to do? I doubt it. He might know that going to an open-mike event is better learning, but it's so painful (the anxiety of first-time performers, fear of failure) that he procrastinates -- and in the worst way too, by doing something that seems like progress (so he doesn't feel guilty about it) but is actually very ineffective. (The irony is that the mind is actually doing the rational thing, but on a small scale: pain avoidance. On the larger scale, of course, this is detrimental to individual survival, hence irrational.)

This is a situation where the best choice is not hard to figure out, but is so difficult (often the difficulty is psychological, but it is difficulty nonetheless) that the mind avoids it. The solution seems to be to trick the mind into undertaking it. E.g., some people avoid thinking about taking on a large project (because it would be overwhelming) but work on small pieces of it until they build up momentum (in the form of confidence, or having made too much investment to turn back, or having expectations placed on them...).

I suspect type (3) exists because rationality is a recently evolved phenomenon. Our psychology is still by and large that of an unconscious, reactive animal. Rationality and consciousness have to fight every step of the way against hundreds of thousands of years (much longer if you count the time when we were fish, and even before that) of evolved behaviors that were once useful and are hard-wired.

Yet therein lies hope too. If we can find the right tricks and push the primitive buttons, we can get amazing, barbaric, uncontrollable motivation and energy out of ourselves. The buttons might be designed for something else, but our intelligence can use them to achieve what we know is good for us. Imagine using sex to encourage people to learn and act rationally (I have no idea how that might work). The hope is that consciousness triumphs over the lizard brain in us.

Comment author: SystemsGuy 25 November 2014 07:18:43PM *  1 point [-]

Once I held a passing interest in Mensa, thinking that an org of super-smart people would surely self-organize to impact the world (positively perhaps, but taking it over as a gameboard for the new uberkind would work too). I was disappointed to learn that Mensa mostly does little, and when its members get together in meatspace it is for social mixers and such. I also looked at Technocracy, which seemed like a reasonable idea; it was different but no better.

Now I'm a few decades on in my tech career, and I have learned that most technical problems are really people problems in disguise, and that solving the organizational and motivational aspects is critical to every endeavor -- it is essentially my full-time job. What smoker or obese person or spendthrift isn't a Type 3, above? Who doesn't get absorbed in their own life with some tunnel vision and make type 2 mistakes? Who, as a manager, hasn't had to knowingly make a decision without sufficient information? I know I have audibly said, "We can't afford to be indecisive, but we can afford to be wrong," after making such decisions, and I mean it.

Reading some of these key posts, though, points out part of the problem faced in this thread: we're trying to operate at higher levels of action without clear connections and action at lower levels. http://lesswrong.com/lw/58g/levels_of_action/

We have a forum for level-3+ thinking, without clear connections to level 1-3 action. The most natural, if not easiest, step would be to align as a group so as to impact other policy-making organizations. To me, we are perfecting a box of tools that few are using; we should find ways to try them out and hone the cutting edges, and then go perform. A dojo approach helps with this by making it personal, but I'm not sure it is sufficient or necessary, and it is small-scale and, from my newbie perspective, lacking a shared direction.

Take dieting, for a counter-example: I can apply rationality and Bayesian thinking to my dietary choices. I recall listening to 4-4-3-2 on Saturday morning cartoons, and I believed every word. I read about the perils of meats and fat, and the benefits of vegetable oils and margarine. I heard from the American Heart Association to consume much less fat and trade it out for carbs. I learned from the Diabetes Association to avoid simple carbs and use artificial sweeteners. Now I've learned not to blindly trust governments and industries, and have combined personal experience, reading, and internet searching to gain a broader viewpoint that does not agree with any of the above! Much of that research is a sifting and sorting exercise at levels 2-4, but with readily available empirical Level 1 options, as I can try out promising hypotheses on myself. As I see what works and what doesn't, I can adapt my thinking and research. Anybody else can too.

Would a self-help group assist my progress? Well, an accountability group helps, but it isn't necessary. Does it help to "work harder" at level 1 alone? No -- key improvements for me have come from improving my habits and managing desire, and then improving how I go about improving those. Does it help to have others assisting at level 3 and up? To an extent; it is good to share personal experiences, books, and thoughts via e-mail and anecdote.

The easy part is the vision, though -- I want to be healthier, lighter, stronger, and live longer. Seems pretty clear and measurable -- weight, blood pressure, cholesterol, 1-mile run time, bench-press pounds.

So what is the vision here? What are our relevant and empirically measurable goals?

Comment author: Zenkat2 02 August 2007 02:14:23PM 7 points [-]

I too am both a pagan and a scientist, and I will happily switch between tales of the Green Mother's handfasting to the Dying King and Gould's theory of Punctuated Equilibrium. I find it no more ridiculous than Francis Collins, the leader of the Human Genome Project and a man I respect greatly, publicly embracing evangelical Christianity.

Our brains are complex creations, with many levels and conflicting functions. The scientific method, with its falsifiable hypotheses and reductive materialism, is a stellar belief system for those systems responsible for predicting and understanding how the physical world works. Unfortunately, it provides little if any support for those pre-rational, emotional, and social systems all our brains share. Your amygdala needs something a bit different than physics.

Many of my fellow biologists share your confusion when confronted with the common person's dislike of Darwinian theory. What they fail to understand is that creation myths serve a critical function in people's lives that has *nothing* to do with what "really happened". Think about the function of belief from an evolutionary perspective for a moment. What survival benefit is there in understanding what "really happened" when the universe was formed billions of years ago -- especially to our ancestors on the savanna? Yet *all* cultures place great importance on their creation myths, despite the fact that most can be easily disproved. It is a universal in human experience.

You may want to ask yourself what the evolutionary function of a creation myth is, and why they are a universal human conceit. With that knowledge in hand, you may have a better understanding of how a creation myth should be judged, and you may finally understand what your pagan panelist was trying to tell you.

Comment author: SystemsGuy 24 November 2014 02:52:42AM *  0 points [-]

Some individuals (and I presume more here than most venues) struggle with any internal inconsistency, while others readily compartmentalize and move on. I am an engineer by training and of course most of my workmates are engineers, yet they represent a variety of religions as well. Most have some questions and doubts about their own, and plenty more about others, and yet that doesn't make a huge difference for day-to-day life.

Some would quickly conclude that such an engineer's judgement is questionable, and discount their work, but most seem to be adequately logical in other spheres.

Perhaps the better question is one of utility -- what value does the individual get from their beliefs? I graduated with many electrical engineers; let's presume one went to work on microprocessor design (driven by quantum theory) and another does correction math for GPS satellites (driven by relativity). It is well understood that the two theories have been objectively demonstrated to work well in their respective domains, yet they are mathematically incompatible (at best, each may be a simplification of a more universal rule). Both cannot be 'true', and while both could be false and likely are to some degree, they are both incredibly useful.
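(An aside of mine, not part of the original exchange: the GPS engineer's "correction math" makes the utility point concrete. A rough sketch, with rounded constants, of the two competing relativistic effects on a GPS satellite clock:)

```python
# Back-of-the-envelope estimate of relativistic drift for a GPS satellite
# clock, relative to a clock on the ground. Constants are approximate.

C = 2.998e8                    # speed of light, m/s
GM = 3.986e14                  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6              # Earth's mean radius, m
R_SAT = 2.656e7                # GPS orbital radius, m
V_SAT = (GM / R_SAT) ** 0.5    # circular orbital speed, ~3.87 km/s
SECONDS_PER_DAY = 86400

# Special relativity: orbital speed makes the satellite clock run slow.
sr_shift = -(V_SAT ** 2) / (2 * C ** 2) * SECONDS_PER_DAY    # about -7 us/day

# General relativity: weaker gravity at altitude makes it run fast.
gr_shift = (GM / C ** 2) * (1 / R_EARTH - 1 / R_SAT) * SECONDS_PER_DAY  # about +46 us/day

net = sr_shift + gr_shift      # net drift, roughly +38 microseconds/day
print(f"net clock drift: {net * 1e6:.1f} microseconds/day")
```

Neither theory is the final word, yet left uncorrected that drift of tens of microseconds per day would translate into kilometers of position error -- both models are "wrong", and both are indispensable.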

From a systems perspective I tend to fall back on the Systems rules-of-thumb, like "all models are wrong; some are useful", and "draw a box around what is working together to do what you're interested in, and analyze within". Compartmentalization allows one to get down to the work at hand, in support of a utilitarian view.

I am here to learn, though. Must inconsistency be driven out, or simply embraced as part of the imperfect human machine?

Comment author: SoullessAutomaton 08 April 2009 10:28:04PM 3 points [-]

Is that desirable? (Not saying you're implying it is.) The community could probably benefit from some smart humanities types.

I was actually trying to imply that it isn't desirable, so yes, I agree fully.

Comment author: SystemsGuy 23 November 2014 06:32:49PM 0 points [-]

First post, so I'll be brief with my opinion. I would say "it depends". To communicate between people, and even to clarify one's own thoughts, a formal language with an appropriate lexicon and symbols is a key facilitator.

As for the desirability of a broad audience, the About page says "Less Wrong is an online community for discussion of rationality", with nothing about exclusivity. I would suggest that if a topic is of the sort that newbies and laypeople would read, then English is better; if it is more for the theorists, then math is fine.