I'm currently working with Lukeprog on a crash course in rationality. It's essentially a streamlined version of the Sequences, but one area we want to beef up is the answer to the question, "Why learn about rationality?"

I've gone through all of the previous threads I can find on this topic -- Reflections on rationality a year out, Personal benefits from rationality, What has rationality done for you?, and The benefits of rationality -- but most of the examples people give of rationality helping them are a little too general. People cite things like "I hold off on proposing solutions," or "I ask myself if there's a better way to be doing this."

To someone who's not already sold on this whole rationality thing, general statements like that won't mean very much. What I think we really need is a list of concrete examples of how the tools of epistemic rationality, as they're taught in the Sequences, can improve your health, your career, your love life, the causes you care about, your psychological well-being, and so on.

Below, my first attempt at doing just that. (I explain what rationality is, and how to practice it, elsewhere in the guide -- this section is just about benefits.) I'd appreciate feedback: Is it clear? Can you think of any other good examples in this vein? Would it be convincing to someone who isn't intrinsically interested in epistemic rationality for its own sake?

 

...

For some people, rationality is an end in itself – they value having true beliefs. But rationality’s also a powerful tool for achieving pretty much anything else you care about. Below, a survey of some of the ways that rationality can make your life more awesome:

Rationality alerts you when you have a false belief that’s making you worse off.

You’ve undoubtedly got beliefs about yourself – about what kind of job would be fulfilling for you, for example, or about what kind of person would be a good match for you. You’ve also got beliefs about the world – say, about what it’s like to be rich, or about “what men want” or “what women want.” And you’ve probably internalized some fundamental maxims, such as: When it’s true love, you’ll know. You should always follow your dreams. Natural things are better. Promiscuity reduces your worth as a person.

Those beliefs shape your decisions about your career, about what to do when you’re sick, about what kind of people you pursue romantically and how you pursue them, and about how much effort to put into making yourself richer, or more attractive, or more skilled (and skilled in what?), or more accommodating, or more aggressive, and so on.

But where did these beliefs come from? The startling truth is that many of our beliefs became lodged in our psyches rather haphazardly: we heard them somewhere, or picked them up from books or TV or movies, or perhaps we generalized from one or two real-life examples.

Rationality trains you to notice your beliefs, many of which you may not even be consciously aware of, and ask yourself: Where did these beliefs come from, and do I have good reason to believe they’re accurate? How would I know if they’re false? Have I considered any alternative hypotheses?

Rationality helps you get the information you need.

Sometimes you need to figure out the answer to a question in order to make an important decision about, say, your health, or your career, or the causes that matter to you. Studying rationality reveals that some ways of investigating those questions are much more likely to yield the truth than others. Just a few examples:

“How should I run my business?” If you’re looking to launch or manage a company, you’ll have a huge leg up over your competition if you’re able to rationally determine how well your product works, or whether it meets a need, or what marketing strategies are effective.

“What career should I go into?” Before committing yourself to a career path, you’ll probably want to learn about the experiences of people working in that field. But a rationalist also knows to ask herself, “Is my sample biased?” If you’re focused on a few famous success stories from the field, that doesn’t tell you very much about what a typical job is like, or what your odds are of making it in that field.

It’s also an unfortunate truth that not every field uses reliable methods, and so not every field produces true or useful work. If that matters to you, you’ll need the tools of rationality to evaluate the fields you’re considering working in. Fields whose methods are controversial include psychotherapy, nutrition science, economics, sociology, management consulting, string theory, and alternative medicine.

“How can I help the world?” Many people invest huge amounts of money, time, and effort in causes they care about. But if you want to ensure that your investment makes a difference, you need to be able to evaluate the relevant evidence. How serious of a problem is, say, climate change, or animal welfare, or globalization? How effective is lobbying, or marching, or boycotting? How far do your contributions go at charity X versus charity Y?

Rationality teaches you how to evaluate advice.

Learning about rationality, and how widespread irrationality is, sparks an important realization: You can’t assume other people have good reasons for the things they believe. And that means you need to know how to evaluate other people’s opinions, not just based on how plausible their opinions seem, but based on the reliability of the methods they used to form those opinions.

So when you get business advice, you need to ask yourself: What evidence does she have for that advice, and are her circumstances relevant enough to mine? The same is true when a friend swears by some particular remedy for acne, or migraines, or cancer. Is he repeating a recommendation made by multiple doctors? Or did he try it once and get better? What kind of evidence is reliable?

In many cases, people can’t articulate exactly how they’ve arrived at a particular belief; it’s just the product of various experiences they’ve had and things they’ve heard or read. But once you’ve studied rationality, you’ll recognize the signs of people who are more likely to have accurate beliefs: People who adjust their level of confidence to the evidence for a claim; people who actually change their minds when presented with new evidence; people who seem interested in getting the right answer rather than in defending their own egos.

Rationality saves you from bad decisions. 

Knowing about the heuristics your brain uses and how they can go wrong means you can escape some very common, and often very serious, decision-making traps.

For example, people often stick with their original career path or business plan for years after the evidence has made clear that it was a mistake, because they don’t want their previous investment to be wasted. That’s thanks to the sunk cost fallacy. Relatedly, people often allow cognitive dissonance to convince them that things aren’t so bad, because the prospect of changing course is too upsetting.

And in many major life decisions, such as choosing a career, people envision one way things could play out (“I’m going to run my own lab, and live in a big city…”) – but they don’t spend much time thinking about how probable that outcome is, or what the other probable outcomes are. The narrative fallacy is that situations imagined in high detail seem more plausible, regardless of how probable they actually are.   

Rationality trains you to step back from your emotions so that they don’t cloud your judgment.

Depression, anxiety, rage, envy, and other unpleasant and self-destructive emotions tend to be fueled by what cognitive therapy calls “cognitive distortions,” irrationalities in your thinking such as jumping to conclusions based on limited evidence; focusing selectively on negatives; all-or-nothing thinking; and blaming yourself, or someone else, without reason.

Rationality breaks your habit of automatically trusting your instinctive, emotional judgments, encouraging you instead to notice the beliefs underlying your emotions and ask yourself whether those beliefs are justified.

It also trains you to notice when your beliefs about the world are being colored by what you want, or don’t want, to be true. Beliefs about your own abilities, about the motives of other people, about the likely consequences of your behavior, about what happens after you die, can all be emotionally fraught. But a solid background in rationality keeps you from flinching away from the truth – about your situation, or yourself – when learning the truth can help you change it.

 


If you want impact, use the narrative fallacy. What I mean is, use all of the other biases and fallacies you listed - tell a story about John, the guy who met a cool scientist guy when he was in primary school and now his life goal is to be a scientist. He decides to do work on global warming because 'what could be more important than this issue?' He expects to live in the city, be the head of a big lab... But he's not very good at global warming science (maybe he's not very good at research?), and he doesn't seem to notice that the advice his colleagues give him isn't helping. So he sticks to his guns because he's already got a degree in global warming, but he's always stressing about not having a job...

And so on.

And then rewind. John discovers rationality when he's a young adult, and becomes John-prime. Compare John to John-prime, whose rationality training allows him to recognise the availability bias at work on his dream of being a scientist, and since scholarship is a virtue, he researches, interviews... discovers that politics is a much better fit! His rationality informs him that the most important thing is improving quality of life, not global warming or power, so he donates to third-world charities and ensures when he runs for political positions he does so on a platform of improving social welfare and medical access. His rationality lets him evaluate advice-givers, and he manages to see through most of the self-serving advice - and when he finds a mentor who seems genuine, he sticks to that mentor, improving his success in politics...

And so on.

(And then the punchline: explain why this story makes the audience feel like rationality is important with a description of the narrative fallacy!)

I like this approach.

A related idea, when at some point in the future someone runs a rationality seminar that costs money (a reasonable service to offer) the marketing pitch ends with:

"Now, if this were a regular sales pitch, we'd end by saying 'normally we charge $500 for this workshop, but this month we're actually having a discount, so you can get it for $400"

Beat.

"What I just did was called anchoring. Saying a number causes your brain to use that number as a reference point, whether you want it to or not. $400 sounds like a good deal compared to $500. You're probably going to have difficulty putting a value on the workshop now that ISN'T based off of those numbers. This technique is used by marketing all time, from coupons to car salesmen. Us? We just straight up charge $300."

Beat.

If someone chimes in with "Hey, you just used anchoring AGAIN!" they get a $50 discount.


So basically, use dark side mind control tricks to convince them of something we couldn't otherwise, but then claim it's OK because we reveal the tricks at the end?

It doesn't need to be "OK".

something we couldn't otherwise [convince them]

I'm not sure this is true - I think we could convince them using other methods - but in either case, why tie our hands behind our back if we're trying to win?


why tie our hands behind our back if we're trying to win?

  1. Because it's unethical. I don't think it's so important to convince uninterested people that we should resort to unethical methods.

  2. If we use unethical mind control tricks whose success is not correlated with the strength of our arguments, we lose an opportunity to discover that maybe we aren't ready to be convincing people. What if we are wrong? What if rationality is not developed enough to have the results speak for themselves? How would we know?

    The fact that dark side mind control tricks look attractive is evidence that the art is not developed enough that we should even be trying to convince people of its effectiveness. When the art is ready, we will not have to convince people; they will be asking how we do it.

I don't think it's so important to convince uninterested people that we should resort to unethical methods.

If behaving ethically is more important in your ethics than helping people avoid huge mistakes that hurt them - like, say, choosing alternative therapies instead of something that actually cures a disease and dying because the side effects of the treatment are more available to your brain than the concept of dying - then I don't think much of your ethics.

If there was a pill that would make people more rational, I'd be slipping it in their food without telling them. I'd be injecting it into the water supply. I'd be taking a huge dose and donating blood. Because there are people out there that refuse vaccinations, there are people out there that take alcohol and painkillers together, there are people out there that make simple silly mistakes and die. And that's wrong.


First of all, what do you think of "Protected From Myself"?

We are not talking about slipping people some miracle cure that they are just being stupid about not taking. If that were the case, you would be right. At this point we don't actually know that it is a miracle cure and we are just slipping them some dubious substance that shows promise but may or may not help. We need more interested people to develop the art, but not the kind of people who will only be convinced by dark side mind control tricks.

Maybe when LW rationality is at a point where reasonable people could be convinced with empirical evidence, then it will be a good idea to trick the rest.

Ethics isn't just about right and wrong; it's also about not doing stupid shit that's going to bite you in the ass.

Very Ericksonian. I like it!

the tools of epistemic rationality, as they're taught in the Sequences, can improve your health, your career, your love life, the causes you care about, your psychological well-being, and so on.

I'm skeptical. The Less Wrong canon is great for training a particular set of widely-applicable abstract thinking skills, but that's not the same thing as domain-general awesomeness. See Yvain's 2009 post "Extreme Rationality: It's Not That Great." The sort of people who are receptive to this material aren't primarily being held back by insufficient rationality: the problem is akrasia, the lack of motivation to carry out our gloriously rational plans.

One might argue that it is by means of rationality that we will discover and implement effective anti-akrasia techniques. Yes, I hope so, too. But I haven't gotten it to work yet.

Lukeprog and Julia are pretty good examples of how rationality awesomely affects someone who's not afflicted by akrasia as strongly as many of us. Finding a general remedy for akrasia is still a major unsolved problem in the rationalist community, of course.


Anecdotal. Showing people how rationality could improve their lives if only they were some way that they are not is not productive. Stinks as much as "your prayer didn't work because your faith wasn't strong enough".

Analogy:

Person 1: "Penicillin isn't that great- it hasn't helped my flu at all."

Person 2: "It's had awesome results for people with bacterial infections, but it doesn't seem to help with viral ones."

Person 3: "How dare you blame Person 1 for having the wrong kind of infection!"

Person 2: "What the hell?"


Well analogized.

You still shouldn't be peddling penicillin as a miracle cure. Likewise with LW rationality.

Except that there are no qualities a person can have that will get prayers to work.


Good point. Do you think non-rationalist people will be able to make that distinction?

I expect everyone who doesn't believe in god would be able to, not all of whom are "rationalist".

That aside, why do you ask? I'm a bit confused by your question.

Never mind; I was doing it wrong.

So, akrasia is no longer a significant problem or obstacle in your life?

No, sorry, that's not what I meant. It's more like this: previously, I must have been implicitly thinking of "rationality" as being about verbal intellectual discourse, like the sort of thing we do here. Whereas now it's as if I'm finally starting to glimpse this idea of probability and decision theory as constraints on coherent behavior, with speaking and writing merely being particular types of human behavior that happen to be particularly salient to us, even though the real world is made out of simpler parts that we don't usually think about.

This is a good summary, but a post like this is greatly strengthened by links to external resources to justify or expand upon the claims it makes. If I didn't know anything about the topic, some of the text would be unclear to me, and I would want the ability to click around and learn more. For example:

  • What is the sunk cost fallacy? (Link to wikipedia/LWwiki)
  • There is some recent evidence about rationality as a treatment for depression

Also, I think one of the first reactions a typical person will have is, "Rationality? Of course I'm rational." To start from square one on this topic, you have to explain to people that, surprisingly enough, they aren't. Politely, of course. Then you can start talking about why it's important to work on.

All that said, I think the examples given are great; they're salient problems for most people, and you can make a good case that rationality will improve one's outcomes for those problems.

Rationality opens the door to self-improvement:

All of us here know Tsuyoku Naritai. Some people, though, maybe most, take a certain perverted delight in admitting their flaws without any intent to fix them. Both of the types of rationality we talk about are deeply involved here: you need to know your own flaws (and not just profess that they exist), and you need to know what to do about them, in order to become better. The Way will also teach you that merely trying is not enough; you must make a convulsive effort to accomplish even the smallest amount. Spencer Greenberg's blog is dedicated to this art. Take a look at it! Clear, practical advice for problems that lots of people have. Rationality opens the door and helps you walk through it. Even so, it isn't easy. If you try very hard and fail, you really have done worse than not trying at all. But the alternative is merely owning your flaws while thinking yourself better than you are.


Are we sure that at the current level of development, rationality even does positively impact life? Besides a few huge anecdotal gains, where are the statistics?

I think the LW approach to rationality is currently missing a huge piece of the chain that should connect epistemology to real-world effectiveness.

These discussions of how rationality should make you more awesome are reminiscent of discussions about how Jesus should make your life better. Keyword: should. Don't try to argue that rationality should make your life better until we've figured it out to the point where we have evidence that it does. The outside view is that rationality is just another weird belief system claiming huge theoretical gains with no evidence.

I won't pretend that rationality makes me more awesome until people are asking me why I'm so awesome.

Until then, let's capture the people who might be interested and develop the art.

Clearly written. You might also want to borrow concrete examples from the Luminosity sequence. The principle of reductionism applied to states of mind suggests that we start with reasonable priors about our personalities and then empirically determine how our emotions respond to circumstances. You might be surprised to learn that vacations make you anxious, not relaxed; that certain diets make you sad or happy; that visiting your aunt makes you energized, not bored. With this self-knowledge (and some instrumental rationality), you might be able to become happier, like people on purpose, overcome difficulties in adopting a new lifestyle, and become more who you want to be in diverse ways.

Fighting indecisiveness:

If you are unsure about what action is best, there is still a best action that will very often resemble strenuous effort towards a single goal, just as if you knew with certainty what action was best. Two related but distinct concepts leading to this conclusion are the expected Value of Information and the principle that "... the optimal strategy is to behave lawfully, even in an environment that has random elements."

So we see that even at less-than-infinite levels of certainty that something ought to be done, it may still be best to do that thing with full focus.
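
To make the value-of-information idea concrete, here is a minimal sketch in Python. The two-action scenario, the payoff numbers, and all of the names are invented for illustration (a toy "which product do I launch?" decision), and it computes the value of "perfect" information, the simplest case:

```python
# Toy value-of-(perfect-)information calculation.
# All numbers and the two-product scenario are hypothetical illustrations.

p_state_a = 0.6  # prior probability that the market favors product A

# Payoff of each action in each possible state of the world.
payoffs = {
    "launch_A": {"state_a": 100, "state_b": -20},
    "launch_B": {"state_a": -10, "state_b": 80},
}

def expected_value(action, p_a):
    """Expected payoff of an action under the current belief p_a."""
    return p_a * payoffs[action]["state_a"] + (1 - p_a) * payoffs[action]["state_b"]

# Best we can do acting now, under uncertainty:
ev_act_now = max(expected_value(a, p_state_a) for a in payoffs)

# Best we could do if we first learned the true state (perfect information):
ev_with_info = (
    p_state_a * max(payoffs[a]["state_a"] for a in payoffs)
    + (1 - p_state_a) * max(payoffs[a]["state_b"] for a in payoffs)
)

voi = ev_with_info - ev_act_now
print(f"EV acting now:        {ev_act_now:.1f}")    # 52.0
print(f"EV with perfect info: {ev_with_info:.1f}")  # 92.0
print(f"Value of information: {voi:.1f}")           # 40.0
```

If the computed value of information is small compared to the cost of getting it, then acting now, with full focus, is the lawful move.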

Did I understand correctly that you want us both to review your text and to add specific examples that we can think of?
I will do both.

On the text:

I liked it very much, but I don't think the text works very well for people who do not see rationality as a virtue.
Some problems I see when I try to put on the glasses of my anti-rationality friends:

  • The use of many in-crowd words and assumed meanings: hypothesis, fallacy, cognitive (people who aren't rationality fans do not use these words in daily life). What could be done is to provide links to your definitions. I believe we should keep definitions for these words on the lesswrong wiki, because not all dictionaries agree on what all of them mean exactly, or on what we mean by them.
  • I know many people who will deny any claim that they are in some way faulty or that emotions are a bad thing; unfortunately I do not know of a good way to get around this.
  • I get the feeling that the whole body of text is somewhat on the negative side: "Rationality will protect you from the cold harsh world" is the feeling I get.

On personal experience with applied rationality:
Example 1:
I learned on lesswrong how a hypothesis should be used and how to use experimentation to collect evidence for or against it. Using the scientific method, I formed the hypothesis that something in my food was making me have to go to the bathroom all day long (as it had for the past 15 years). So I started keeping a food diary, noting what I ate at what time, at what time I had to visit the bathroom, and whether the visit was normal or not.
Eventually a pattern began to form, and after about a month of taking notes it became clear that chili pepper seemed to be the cause, though at that point it could merely have been a correlation. (I had once blamed corn; the doctors did not agree, but I could clearly see the "causation" with my irrational eyes. As it turns out, I never eat corn without chili pepper, so it was only a correlation.) So I formed a new hypothesis, "When I eat chili I will get into trouble," and ran tests on it: I removed chili from my diet completely (and the problems all went away); then, to test, I ate a big bowl of hot chili pepper soup, and in no time I was running to the bathroom again.
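
For readers who like to see the bookkeeping, here is a minimal sketch in Python of the tally such a food diary supports. The entries, ingredient names, and numbers are hypothetical illustrations, not the commenter's actual data:

```python
# Minimal sketch of tallying a food diary against a suspect ingredient.
# All entries and ingredient names are hypothetical illustrations.

diary = [
    {"ate": {"corn", "chili"}, "bad_visit": True},
    {"ate": {"rice", "chicken"}, "bad_visit": False},
    {"ate": {"corn"}, "bad_visit": False},  # corn without chili: no symptoms
    {"ate": {"chili", "beef"}, "bad_visit": True},
    {"ate": {"bread", "cheese"}, "bad_visit": False},
]

def symptom_rate(entries, ingredient):
    """Fraction of meals containing `ingredient` that were followed by symptoms."""
    meals = [e for e in entries if ingredient in e["ate"]]
    if not meals:
        return None  # no data on this ingredient
    return sum(e["bad_visit"] for e in meals) / len(meals)

for suspect in ("corn", "chili"):
    print(suspect, symptom_rate(diary, suspect))
# chili -> 1.0, corn -> 0.5: the corn-only meal breaks the corn/chili
# confound, just as removing chili (and then re-adding it) did above.
```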

Example 2:
(This one is about school and learning. I will be talking about a level of school similar to high school. We use a grading system of 1 (worst) to 10 (best). The type of class I was in is what in the States would be considered a special school/class for gifted children.)
When I got to "high school" I quickly found myself being teased about my learning abilities. With the notable exceptions of Exercise/Gym and Handwriting, I was a straight-10 student, and always had been. (I'm that guy who corrects mistakes in the schoolbook and in the teacher's explanations.)
Although I pretended the teasing didn't hurt, I only recently (with the rationality lessons of lesswrong) started to realise that it did.
What happened was that I started to dislike school: getting 10s made me unpopular, so something inside me snapped and I started dumbing myself down to be more "cool".
I still had 8s for everything, and the teasing stopped.
But then something worse happened. In the 3rd grade of high school they changed the teachers for math and we got a new one. This teacher was not a teacher; he was a math genius who knew how to get the results/proofs but had no idea why they worked. I had always relied on learning a concept by asking why, mapping it to my existing knowledge, and then integrating it, but this teacher expected me to "guess the teacher's password" and learn math like a photocopier.
I couldn't do it. Quickly I went from an 11 average (I never dumbed down on math) to a 2-3 average, and not long after I quit school completely and started working for minimum wage.

From that moment on I believed I was unable to learn; the experience shocked and scared me. I had been unable to study anything since.

Through the lesswrong sequences and advice from regulars on lesswrong, I have managed to pinpoint the cause of my learned helplessness and overcome it. Since coming to lesswrong I have learned more than I had in the previous 10 years.

Some smaller examples:

  • I have done Google Scholar and regular Google research to figure out what the best oral-health strategy is according to the "better" studies (on average they all suck, though).
  • I have picked up the university study that I had been procrastinating on for 5 years, and I am seeing good progress and, most importantly, retention.
  • I have earned an excellent rating at work for self-improvement in communication.
  • I have earned an excellent rating at work for using my rationality skills to massively improve the quality of questions asked at work. Before, we would accept any claim; now we only accept questions that have empirical evidence of adding value. (This went from 60% effectiveness to 94%.)

It's spelled "chili". I don't know whether it would be worth your while to find out what's in chili that upsets your gut -- there may be specific ingredients (beans? hot pepper?) you want to avoid in other dishes.

Congratulations for getting that much good from thinking about what you're doing.

Thanks for that fix; I updated my post to correct it. I mean this fruit/spice specifically: http://en.wikipedia.org/wiki/Chili_pepper

I assume I'm allergic to the capsaicin in it, as I can eat bell peppers and black pepper (http://en.wikipedia.org/wiki/Black_pepper) without any effects.


I believe we should keep definitions for these words on the lesswrong wiki because not all dictionaries agree on what all of them mean exactly, or what we mean by them.

I agree!

And speaking of the Wiki, we really should find a way for contributors there to be rewarded with karma for their efforts. Also, why in the world do we require separate registration for the Wiki and the main site?

I agree completely on the additional wiki requirements.

There should be no difference, account- or karma-wise, between posting on the main site and posting on the wiki.


Below, a survey of some of the ways that rationality can make your life more awesome

I don't know why, but such use of "awesome" is in my mind firmly linked to the geek and even gaming cluster. I'm not a native speaker, but I seem to run across it used in this manner only in those parts of the internet. :)

Rationality trains you to step back from your emotions so that they don’t cloud your judgment.

Rationality breaks your habit of automatically trusting your instinctive, emotional judgments, encouraging you instead to notice the beliefs underlying your emotions and ask yourself whether those beliefs are justified.

You did a good job with this section. Yet I wish to emphasise that even discussing this aspect of why rationality is good for you leaves the reader vulnerable to imagining straw Vulcans. Everyone "knows" emotions and rational thinking don't mix. What they don't really know is that acting rationally means acting in a way that maximises your chances of "winning". Or what "winning" means in different contexts!

Yes, I know you've given the reader ample clues and even implicit definitions of the how of rationality we use here on LW, which is basically the reasonable one, but one can't expect a few sentences the reader has just read to have the same impact as a lifetime of exposure to affect, stereotypes, and cached thoughts associated with "logic" or "rationality". As another commenter said:

Also, I think one of the first reactions a typical person will have is, "Rationality? Of course I'm rational." To start from square one on this topic, you have to explain to people that, surprisingly enough, they aren't. Politely, of course. Then you can start talking about why it's important to work on.


You did a good job with this section. Yet I wish to emphasise that even discussing this aspect of why rationality is good for you leaves the reader vulnerable to imagining straw Vulcans.

I just basically linked you to your own talk. Well, that's what I get for not pausing to consider the user name.

So when you get business advice, you need to ask yourself: What evidence does she have for that advice, and are her circumstances relevant enough to mine? The same is true when a friend swears by some particular remedy for acne, or migraines, or cancer. Is he repeating a recommendation made by multiple doctors? Or did he try it once and get better? What kind of evidence is reliable?

I think you missed the most common and obvious failure mode: when people try something lots of times and never succeed.

A distressingly large portion of the population thinks that repeatedly trying and failing at something makes them experts. It's so universal as to be cliché: the person with pimples who knows all about how to get rid of acne, the person who's been divorced five times who gives advice on how to build a good relationship, the overweight person who's an expert on diet and fitness, etc. And they aren't just giving advice on what doesn't work. It's true that experience counts for a lot, but it does not trump results.

It would be understandable if it were just people deluding themselves; it seems normal that there should be a strong self-serving bias to protect our egos. Yet people continue to take advice from those who are least qualified to give it.

Shokwave got it more or less in narrative form. Thinking rationally gives you a shot at breaking path dependence before you get too far gone to turn back.

Rationality alerts you when you have a false belief that’s making you worse off.

The framing of this section (not just the title) makes it look like an invitation to motivated skepticism, which is generally a bad idea.

My framing was meant to encourage you to disproportionately question beliefs which, if false, make you worse off. But motivated skepticism is disproportionately questioning beliefs that you want to be false. That's an important difference, I think.

Are you claiming that my version is also a form of motivated skepticism (perhaps a weaker form)? Or do you think my version's fine, but that I need to make it clearer in the text how what I'm encouraging is different from motivated skepticism?

The implicit idea is that any improvement in beliefs is beneficial, but that's not what comes to mind when reading that section; it sounds as if it's suggesting that there is a special kind of belief whose revision would be beneficial, as opposed to other kinds of beliefs (this got me confused for a minute). So the actual idea is to focus on belief revisions with high value of information. This is good, but it probably needs to be made more explicit and distanced a bit from the examples, which are representative of a different idea (inconvenient beliefs that you would like to go away).


If you focus on questioning the beliefs whose presence is particularly inconvenient, that's genuine motivated skepticism (the motivations could be different). I think this section needs to be revised in terms of value of information, so that there's symmetry in what kinds of change of mind are considered: focus on researching the beliefs that, if changed, would affect you most (in whatever way). Dispelling uselessly-hurting prejudices is more a special case of the possible benefits than a special case of the method.

[This comment is no longer endorsed by its author]

There's at least one other thread similar to the ones you linked, although most of the content is also pretty general.
