Less Wrong tends toward long articles with a lot of background material. That's great, but the vast majority of people will never read them. What would be useful for raising the sanity waterline in the general population is a collection of simple-but-useful rationality techniques that you might be able to teach to a reasonably smart person in five minutes or less per technique.

Carl Sagan had a slogan: "Extraordinary claims require extraordinary evidence." He would say this phrase and then explain how, when someone claims something extraordinary (i.e. something for which we have a very low probability estimate), they need correspondingly stronger evidence than if they'd made a higher-likelihood claim, like "I had a sandwich for lunch." We can talk about this very precisely, in terms of Bayesian updating and conditional probability, but Sagan was able to get a lot of this across to random laypeople in about a minute. Maybe two minutes.
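Sagan's point falls straight out of Bayes' theorem, and it's worth seeing once with numbers. Here's a minimal sketch in Python (the figures are made up for illustration; a likelihood ratio of 10 stands in for "an ordinarily honest person told me so"):

```python
def update(prior, likelihood_ratio):
    # Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# The same mundane piece of testimony, applied to two different claims:
print(update(0.5, 10))    # "I had a sandwich for lunch": 0.5 -> ~0.91
print(update(1e-9, 10))   # an extraordinary claim:       1e-9 -> ~1e-8
```

The same evidence moves the mundane claim to near-certainty but leaves the extraordinary one vanishingly improbable; closing that gap takes evidence whose likelihood ratio is itself extraordinary.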

What techniques for rationality can be explained to a normal person in under five minutes? I'm looking for small and simple memes that will make people more rational, on average. Here are some candidates, to get the discussion started:

Candidate 1 (suggested by DuncanS): Unlikely events happen all the time. Someone gets in a car-crash and barely misses being impaled by a metal pole, and people say it's a million-to-one miracle -- but events occur all the time that are just as unlikely. If you look at how many highly unlikely things could happen, and how many chances they have to happen, then it's obvious that we're going to see "miraculous" coincidences, purely by chance. Similarly, with millions of people dying of cancer each year, there are going to be lots of people making highly unlikely miracle recoveries. If they didn't, that would be surprising.
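The arithmetic behind this fits in a few lines. A rough sketch in Python, with invented numbers (a one-in-a-million coincidence, a guessed count of opportunities for one to occur):

```python
p = 1e-6                # chance of one specific "miraculous" coincidence
chances = 1_000         # guessed opportunities per person (an assumption)
people = 300_000_000    # hypothetical population, roughly the US

p_no_miracle = (1 - p) ** chances            # one person sees nothing odd
expected_witnesses = people * (1 - p_no_miracle)
print(round(expected_witnesses))             # ~300,000 "miracles"
```

With these invented numbers, hundreds of thousands of people experience a million-to-one event. Zero reported miracles would be the real surprise.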

Candidate 2: Admitting that you were wrong is a way of winning an argument. (The other person wins, too.) There's a saying that "It takes a big man to admit he's wrong," and when people say this, they don't seem to realize that it's a huge problem! It shouldn't be hard to admit that you were wrong about something! It shouldn't feel like defeat; it should feel like success. When you lose an argument with someone, it should be time for high fives and mutual jubilation, not shame and anger. The hard part of retraining yourself to think this way is just realizing that feeling good about conceding an argument is even an option.

Candidate 3: Everything that has an effect in the real world is part of the domain of science (and, more broadly, rationality). A lot of people have the truly bizarre idea that some theories are special, immune to whatever standards of evidence they may apply to any other theory. My favorite example is people who believe that prayers for healing actually make people who are prayed for more likely to recover, but that this cannot be scientifically tested. This is an obvious contradiction: they're claiming a measurable effect on the world and then pretending that it can't possibly be measured. I think that if you pointed out a few examples of this kind of special pleading to people, they might start to realize when they're doing it.

Anti-candidate: "Just because something feels good doesn't make it true." I call this an anti-candidate because, while it's true, it's seldom helpful. People trot out this line as an argument against other people's ideas, but rarely apply it to their own. I want memes that will make people actually be more rational, instead of just feeling that way.


This was adapted from an earlier discussion in an Open Thread. One suggestion, based on the comments there: if you're not sure whether something can be explained quickly, just go for it! Write a one-paragraph explanation, and try to keep the inferential distances short. It's good practice, and if we can come up with some really catchy ones, it might be a good addition to the wiki. Or we could use them as rationalist propaganda, somehow. There are a lot of great ideas on Less Wrong that I think can and should spread beyond the usual LW demographic.


I knew I was going to stay on LessWrong when I read the conceptually & rhetorically brilliant:

Ignorance exists in the map, not in the territory. If I am ignorant about a phenomenon, that is a fact about my own state of mind, not a fact about the phenomenon itself. A phenomenon can seem mysterious to some particular person. There are no phenomena which are mysterious of themselves. To worship a phenomenon because it seems so wonderfully mysterious, is to worship your own ignorance.

Which could be perhaps reduced to something like:

Your thoughts are your map; reality is the territory. Watch your step.

and

Mystery is always in the mind, never in the matter.

I don't think these are all that great but I would love a snappy way to express this central insight.

From Avoiding Your Belief's Real Weak Points:

"To do better: When you're doubting one of your most cherished beliefs, close your eyes, empty your mind, grit your teeth, and deliberately think about whatever hurts the most. Don't rehearse standard objections whose standard counters would make you feel better. Ask yourself what smart people who disagree would say to your first reply, and your second reply. Whenever you catch yourself flinching away from an objection you fleetingly thought of, drag it out into the forefront of your mind. Punch yourself in the solar plexus. Stick a knife in your heart, and wiggle to widen the hole."

Condensed: If you catch yourself flinching away from a thought because it's painful, focus on that thought and don't let it go. If the truth hurts, it should.

This is, I think, some of the most important rationalist advice I ever got. It kept me reading OB when it was getting very painful to do so, and allowed me to finally admit that my religion was immoral, a thought I had kept tucked away in rationalization-land since middle school.

Hi. I'm new here. Great blog. Great post.

One maxim that I rely on for acting rationally is "know what your time is worth". In my first real job, I was on a one-week project with a scientist who told me that my time is valuable (I think he was implying that my boss was wasting my time). This really opened up my eyes. My first application of this idea was professionally -- I can get more out of my job than just a paycheck. I can learn skills and make contacts and list accomplishments that will advance my career. I can also enjoy what I do (I'm a researcher, so that's assumed in my profession). It's sad to see colleagues who think that their time is worth no more than some measly paycheck.

The second application of this rule was in my "home economy". I used to be very cheap. Now that I've placed a $ value on my time, it puts a lot of activities in perspective and I am much freer spending money when it frees up time for more worthwhile pursuits (it helps that my cheap habits assure that I always have a nice cushion of cash around. This way, I am able to spend money when needed, without reworking my budget -- which would be a real waste of my precious time). It's sad ...

Great idea for a post and an important topic. A somewhat similar topic came up at our recent Chicago meetup, when someone who saw our sign came up to us to ask us what Less Wrong referred to. We didn't necessarily have a great answer at the ready besides relaying some of the basics (website/group blog about rationality and thinking better, etc.). We spent a few minutes afterward talking about what information a good LW elevator speech might include. We didn't want it to sound too stilted/formal, e.g., "refining the art of human rationality" from the banner at the top doesn't sound that inviting in casual conversation. Does anyone have approaches that have worked?

Eliezer's "Absence of evidence is evidence of absence" is a good one in my opinion, and it's relatively easy to explain the relevant maths to pretty much anyone.

The general point about Conservation of Expected Evidence is then likely to come out in the wash (and is a very useful idea).
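For anyone who wants the relevant maths concretely, here's a toy numerical check in Python (the probabilities are arbitrary):

```python
# Conservation of Expected Evidence: P(H) = P(H|E)P(E) + P(H|~E)P(~E)
p_h = 0.3               # prior on the hypothesis
p_e_given_h = 0.8       # chance of seeing evidence E if H is true
p_e_given_not_h = 0.2   # chance of seeing E anyway if H is false

p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
p_h_given_e = p_e_given_h * p_h / p_e                    # ~0.63
p_h_given_not_e = (1 - p_e_given_h) * p_h / (1 - p_e)    # ~0.10

print(p_h_given_e, p_h_given_not_e)
print(p_h_given_e * p_e + p_h_given_not_e * (1 - p_e))   # recovers 0.3
```

If seeing E would raise your confidence in H, then failing to see E has to lower it: the expectation of your posterior is pinned to your prior.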

A simple technique I used to use: whenever I started to read an article, or found a link to one, that made me uncomfortable or that I instinctively wanted to avoid, I forced myself to read it. After a few times I got used to it and didn't have to do this anymore.

[anonymous]:

Should we be listing the oldies here as well? One of my favorites is still "Don't believe everything you think."

That one made it to book-title and t-shirt status, but I've never heard anyone actually say it. I've read it only a couple of times.

SilasBarta:
As true as that is, I don't see how it would lead people to do anything differently -- don't most people already think, er, believe they're living up to whatever that quote asks of them?
thomblake:
I don't think so. I'm pretty sure most people labor under the impression they have something like a unity of consciousness, so while "don't believe everything you see" might seem obvious, "don't believe everything you think" does not, unless specifically considering situations like hallucinations (which many would categorize under "see" rather than "think"). ETA: That's why this is a cornerstone of rationality. Even I am moved to remember the slogan, so that when I think to say, "That's not true!" I stop and ask myself why I think so and whether I should believe this impulse of mine.

Okay, in that case, I had come up with a saying to express that same idea but which makes the implications clearer. Here goes:

"Blindness isn't when you see nothing; it's when you see the same thing, regardless of what's in front of you.

"Foolishness isn't when your beliefs are wrong; it's when you believe the same thing, regardless of what you've seen."

thomblake:
I particularly like the first, since the second clause technically includes literal blindness. I might change "wrong" to "false" when repeating the second.
SilasBarta:
Thanks! Any help with touching up my version so it flows better is much appreciated. Yes, I think this is particularly important, because the cognition involved in literal seeing is a form of believing: your brain is making inferences before there's even an image in your mind. (The raw retinal data looks like garbage.)
Nornagest:
I estimate most people would lump "don't believe everything you think" into the space occupied by slogans like "think different" and "question authority"; i.e. at best a generalized endorsement of counterculture ideals, and at worst a cynical attempt to break down any and all ideals in hopes that the gap will be filled by something more congenial to the speaker. The general population is familiar with ideology and unfamiliar with abstract cognition, so unqualified ideas about ideas will usually be taken to refer to the former. This misconception could be dissolved with half a minute of explanation, but that half minute wouldn't fit on a bumper sticker.
SilasBarta:
Thanks, you said what I was thinking so much better.
phaedrus:
This reminds me of "It is the mark of an educated mind to be able to entertain a thought without accepting it." -- Aristotle. To me, this uses "educated" in the sense it ought to be meant.
sketerpot:
Can you think of a way to explain that to people so they may be able to apply it themselves? It's a nice slogan, but a clever turn of phrase isn't too useful by itself.
[anonymous]:
Frank Lantz spends the first five minutes of this video explaining the slogan and suggesting a way to apply it.
NancyLebovitz:
Thanks for the link. For me, the most interesting thing is that Lantz doesn't appear to be retarded [1], and yet it was a huge shock for him to find out as an adult that it was possible to think about the odds of a decision being right rather than assuming that decisions were absolutely right or wrong. I have no doubt that my description of needing years to assimilate the idea that people are really different from each other without this necessarily indicating something the matter with any of them is equally shocking to people who've been vividly aware of psychological differences as long as they can remember. Or I could be wrong-- the variation in clue distribution might be one of the things such people are apt to be clear about. [1] He actually seems pretty smart-- but "doesn't appear to be retarded" is the only way I can think of to adequately express my surprise that it took him so long to acquire that particular clue.

I think it is mostly hopeless trying to teach rationality to most people.

For example, both of my parents studied Math in university and still have a very firm grasp of the fundamentals.

I just got a phone call yesterday from my father in Germany saying: "We saw in the news, that a German tourist couple got killed in a shooting in San Francisco. Will you avoid going out after dark?" When I tried to explain that I won't update my risk estimates based on any such singular event, he seemed to listen to and understand formally what I said. Anyhow, he was completely unimpressed, finishing the conversation in an even more worried tone: "I see, but you will take care, won't you?"

"don't worry - that sort of thing is so rare, when it happens, it makes the news!"

Well said! Here's how Bruce Schneier put it:

Remember, if it’s in the news don’t worry about it. The very definition of news is “something that almost never happens.” When something is so common that it’s no longer news — car crashes, domestic violence — that’s when you should worry about it.

I wrote an essay about the utter irrationality of "stranger danger" based on that quote: http://messymatters.com/strangers

NancyLebovitz:
I think not worrying about things in the news needs some fine-tuning-- if a war is happening where you live, it will affect your safety level, and it will be in the news.
simplicio:
That's the canonical response now! Thanks!

Your parents aren't saying "Please update your estimate of the probability of your violent death, based on this important new evidence."

They are saying, "I love you."

This has nothing to do with how rational or irrational they are.

They're saying "I love you" in an irrational way. This can hurt because there is no easy way to quibble with the second part and not violate cultural conventions about how to express your acceptance of the first.

This is well-understood by irrationalists. Once in a discussion about the necessity of evidence, I got landed with "But you don't demand evidence that your wife loves you, right? You just have faith..."

A clever move. Now arguing the point requires me to... deny that I have faith in my wife?

'Why would I need to demand evidence? My wife freely gives me evidence of her love, all the time!'

I had a similar discussion with a family member, about the existence of the Christian god, where I received that exact response. My wife was sitting right there. I responded with something along the lines of, "True, but my 'faith' in her love is already backed up by evidence, and besides, I have plenty of evidence that she exists. If there was evidence for God and evidence of His love, I would happily put faith in that too."

But I agree - it definitely caused me to pause to consider a tactful response.

wedrifid:
And the proper name for a wife that doesn't freely give evidence of her love is an ex-wife!

And for someone who doesn't require evidence to believe in that love - a stalker!

Paul Crowley:
So religious people are all God's stalkers?
simplicio:
My reply was in this vein, essentially. But it's still a sneaky bugger of a question.
Eliezer Yudkowsky:
See also, "The Riddle of Kyon".
simplicio:
It was good! I didn't realize you had other fanfic than HP:MoR.
Baughn:
He has quite a few more. Go look for "The Sword of Good", for example.
simplicio:
Yeah, I should have said I didn't know there were any more apart from the ones on LW and HPMOR. Brain fart.
Paul Crowley:
Ah, the old "agree with me or say something rude!" gambit. I wonder if you could turn it around - "what, are you saying you don't think my wife loves me?"
Oligopsony:
The error, of course, is that it equivocates between two meanings of "faith." You trust your wife because you have (one would hope) spent a great deal of time with her and found her to be honest, concerned about your well-being, &c. Of course, you might at some point come upon evidence that this is not warranted, and in this case the irrationalists might have a point: it may be wiser to use motivated cognition to convince yourself that she is faithful or still in love with you. Othello can be read as an extended argument for avoiding reasonable conclusions if you know that your reactions are not guaranteed to be reasonable.
simplicio:
Ah, but you see, that cannot be put into a test tube. And as all of your least educated neighbours know, if you can't put it into a test tube, it ain't evidence.

Consider the possibility that when people say they're seeing things differently than you do, they might be telling you the truth. They could be making it up, they could be just annoying you for the fun of it, but they might actually be weirder than you think.

Do you have any examples? That's a fascinating one.

(Corollary: if you're angry at someone, and they ask why you're angry, tell them. They might actually not know. Especially if they're a child. I know I'm not the only one who was punished by one or more elementary school teachers for reasons that they refused to explain, since they assumed that I already knew. Oh how I seethed.)

SilasBarta:
Yeah, that pretty much describes growing up for me. "Don't do that." Why not? "How dare you disrespect my authority you little terr..." Oh, no, I'm perfectly fine with obeying, I just wanted to know the rationale so I can identify what kinds of things are off-limits ... "TIMEOUT! Now!" Edit: Needless to say, even on this forum, there are people who have no qualms about telling others "Don't do that" without bothering to spell out the boundary, or even understand why that would be necessary. I can't understand what motivates such people beyond, "I like it when others are in a perpetual state of uncertainty and have to keep deferring to me for permission."

"How dare you disrespect my authority you little terr..."

You raise an interesting point here. When a parent or teacher imposes their authority on a child, there are two very different goals they could have:

  1. To get the child to comply, and/or

  2. To establish their own dominance.

When you ask why you're being ordered to do something, and you happen to be beneath the age that society considers you a real person, that's taken as an attack on the dominance of the person bossing you around. Obedience isn't enough; a lot of people won't be satisfied with anything less than unquestioning obedience, at least from mere children. I suspect that this is what people are thinking most of the time when they use "because I say so" as a 'reason' for something. (The rest of the time, they're probably using it because they're feeling too harried to explain something to a mere child, and so they trot out that tired old line because it's easy.)

I remember when I was young enough that adults dared to treat me that way. (Notice the emotionally charged phrasing? I'm still irritated.) Someone who gave reasonable orders and provided justifications for them on request got cooperation...

providing a reason for your instructions doesn't hurt anything

I tend to agree in most cases. However, not all instruction-givers have good reasons for their orders. If they must provide such reasons before they are obeyed, and only inconsistently have them, that means that a plausible motive for their subordinates to question them is the desire not to follow the instruction. (i.e. subordinate thinks there might be no good reason, feels compelled to obey only if there is one, and is checking.) The motive associated in this way with asking for reasons is therefore considered harmful by the instruction-giver.

When I was a kid and got an unobjectionable but confusing order, I usually agreed first and then asked questions, sometimes while in the process of obeying. This tended to work better than standing there asking "Why?" and behaving like I wanted the world to come to a halt until I had my questions answered. Objectionable orders I treated differently, but I was aware when I challenged them that I was setting myself up for a power struggle.

Eliezer Yudkowsky:
Harry may or may not get a chance to say this at some point, but it sure is going in my quotes file.
thomblake:
Not true. In many cases, there isn't time (or some other resource) for spelling out your reasons. And when it's a life-or-death situation, you want your child to comply with your orders unquestioningly, not stand there asking "why" and get eaten by a lion.

These concerns can be balanced better than they usually are by using something like a "Merlin says" rule.

Such a rule would include an expectation of later justification, of course.

sketerpot:
The reference, in case anybody was wondering.
thomblake:
That sounds plausible, but I've never seen it attempted in practice. Though it doesn't sound very different from "Because I say so!" so I don't see why it would work worse.
Psy-Kosh:
"Because I say so" invokes the very fact of the demand as the supreme reason, rather than acting as a promissory note saying "no time to explain now, but trust me, there's a good reason that I'll explain later." I.e., "because I said so" is "bow to my authority, underling" rather than "in this specific circumstance, just do it, trust me (for now) there's a reason, and ask later if it's not obvious to you by then."
sketerpot:
Okay, I will admit that there are some situations where telling someone why is impractical. I don't think they're too frequent, though, unless you live in a place with a lot of lions (or whatever).
thomblake:
Most parents and children live in places with a lot of potentially-deadly situations.
khafra:
For a comparison with modern adults who live in places with a lot of potentially-deadly situations requiring swift obedience, US military personnel are required to obey all lawful orders from those appointed over them, but have (from the order follower's side) several channels for reporting abuses of authority, and (from the order giver's side) official guidance with ways of explaining orders when time permits.
Eliezer Yudkowsky:
I think that statement becomes a lot stronger if you say "most of your ancestors".
RobinZ:
Possibly, although most parents and children live in places with automobiles.
Eudaimoniac:
I am a parent and I have to disagree with you. The worst case scenario is not that it is worthless. If a child learns to question the "order" given out loud, it would suggest that the child is also questioning the "order" internally. This leads to an internal debate over whether to ask for a justification for the "order" or to decide internally whether it is justifiable. Now you have a situation where the child does not stop and ask for the justification, but instead decides that some situations cannot be justifiable and thus will not ask for said justification. When the parents are around, this is problematic, but when no authoritative figure is close, this leads to the child questioning already-given "orders" and possibly overruling any preexisting justification. They are children, after all. Now you have a child who actively disregards (or might disregard) "orders" given - with or without justification. Sure, you told your daughter not to go with strangers, but the stranger had candy, and instead of seeking out parents to gain a justification for the rule of not going with strangers, the child will examine the justification itself and, given an upbringing with minimal trauma, might follow the stranger with the candy. You either have to demand absolute obedience or allow your child to make its own decisions and accept the danger and risk involved with that, but it is a wrong simplification to say that the worst that can happen is that it is useless. After all, the way you parent your child shapes them - good or bad.
wedrifid:
I agree Eudaimoniac (nice name by the way!). The worst case scenario is definitely less than worthless. The question of what is best in the average case would be an interesting one. My hunch is that it depends on the neurology of the child and also on the nature of the culture. Expectations of and relationship with 'justification' vary quite a lot between individuals in a way that I trace down to genetics.
DanArmak:
In addition to what others have said, I think the very concept of 'authority figure' for most people means 'one who is obeyed without question'. The meaning of 'order' does not include a possibility of questioning it. An instruction that comes with explanations simply doesn't belong in the category of 'orders'. This isn't specific to child-adult relations. Whenever someone is in a position to give orders, asking for justification is seen as a challenge. Reasonable or rational people do, of course, ask for and give out reasons for their orders. But this doesn't reinforce authority and obedience. It creates or reinforces cooperation between two people who are more nearly equals, than a giver and a taker of orders. The emotional/social basis for giving orders is precisely "because I say so" - orders to establish dominance and obedience - and having to explain yourself automatically subtracts from your authority.
LeBleu:
Your attempt to understand these people's motivations seems to assume that these people understand that you don't know the answer. Another possible motivation is that they think the explanation is obvious or common knowledge, and hence you must be asking to antagonize them, not out of actual ignorance. Not to say that I don't think some people's motivation really is the one you've stated - they simply enjoy being in control of people.
SilasBarta:
If you're talking about my complaints about the forum, that's not the case. One time, numerous people asked for clarification from this person about which kinds of behavior that person was asking others to stop, so the person clearly knew it was an issue that the others didn't know exactly which behavior was being criticized. That person eventually resorted to, "I'll tell you when I don't like it, as will a few people I've selected." 18 months later, he/she agreed his/her preferences were not typical. I will provide the documentation privately if you wish, but I have no desire to start this publicly.
NancyLebovitz:
I think what got me into it was Psychetypes, a description of the Myers-Briggs types with some rather abstract theory about how they experience time and space differently from each other. [1] Anyway (and this should be a clue about how hard it can be to learn this sort of thing), when I first started reading the book, I got to the bit about there being many sorts of normal, and I put the book down for two years -- it was that hard to get past the idea that either I was crazy, or everyone else was. Anyway, look at how a lot of people talk about taste -- a lot of them really believe that everyone should like and dislike the same things they do. Or people who believe that if some diet/exercise method worked for them, it would work for everyone if they'd just try hard enough. Or that allergies they haven't got must be illusory. [1] IIRC, SPs experience the present moment most vividly, NTs imagine time as evenly spaced along a ruler, NFs have vivid experience of past emotional moments, and someone (it's got to be another N, and I can't remember what SJs experience) is most aware of future possibilities. You double all this to get 8 types because some people think spatial boundaries are real and others don't.
RobinZ:
Generalizing From One Example. Top rated Less Wrong article of all time, and we see again and again why. :/
Peter_Lambert-Cole:
You mentioned Myers-Briggs types and "the idea that either I was crazy, or everyone else was." I think I had a similar experience but with a different analysis of the MBTI classifications. It was Personality Type: An Owner's Manual by Lenore Thomson, and there is a wiki discussion here. I found the scientific basis fairly flimsy. She connects the 8 cognitive functions to various regions of the brain - left and right, anterior and posterior - but it seems like a just-so story to me. However, I have found it immensely useful as a tool for self-improvement. The main insight I got from it is that while other people are crazy, they are crazy in a fairly well-defined, reproducible way. Other people see things completely differently from you, but it's fairly internally consistent, and so you can simulate it on your own hardware. There are two ways I think about this. One: your brain is trying to constantly make sense of all this sensory data that comes in. So it determines that one part is the signal and one part is the noise. It tries to minimize the noise and focus on the signal. But then you realize there is a whole other signal in what you thought was noise, and there are people tuning into that signal who think your signal is actually the noise. If you then tune into that signal, you can understand what other people have been listening to the whole time. The other: we are all playing 8 board games simultaneously, where if we roll the dice our piece moves that amount in each of the games. In order to make sense of this, we focus on one of the games, trying to forget about the others, and try to win that one. But other people are focused on trying to win a different game. So when they try to talk to each other about who is winning, they completely talk past each other. But when you realize that someone thinks he is playing a different game and you figure out what it is, you can have a much more productive conversation/relationship.
wedrifid:
This is an important insight. I'll add that sometimes being able to understand the different ways people think can simply allow us to realise that it is more productive to have no (or minimal) relationship, without judging them to be poor thinkers. Judging them not to be 'thinkers' in your original sense at all can be a lesser judgement than concluding that they suck at it.
NancyLebovitz:
Thanks-- that's a lot more use than I've made of the system. Does it make sense to think of yourself as crazy to the same extent that people of other psychetypes are? Links need to be in a system called Markdown rather than the more usual html-- the details for them are at the help link in the lower left corner that shows up when you start writing a reply.
savageorange:
If you take crazy to mean 'acting, thinking, or feeling in a way disjointed from or opposed to reality', I'd say it makes a lot of sense to think of yourself as just as crazy as anyone else (and it reduces the incidence of giving your own feelings and thoughts undue importance, IME).
torekp:
Upvoted for giving technical help.
Peter_Lambert-Cole:
Fixed. I don't think so. The term captures how radically different the other types are from your own. It's about relative distance between you and others, not an absolute quality.
NancyLebovitz:
Risto_Saarelma just posted a prime description of how hard it is to believe that other people mean what they're saying about how they see the world-- a woman who'd spent a long time in the New Age culture describes her conversion to skepticism.
AdeleneDawner:
Second link's broken. You may have meant this?
kpreid:
Your second link is broken.
NancyLebovitz:
Thanks. It's fixed.
[anonymous]:
Here's the article in question.
bentarm:
I think we should probably be very wary of taking anything based on the Myers-Briggs classifications seriously. They seem to be based almost entirely on Forer Effect-type predictions and are almost impossible to falsify. If I remember correctly, the Big Five tests are slightly more robust (e.g., a Big Five profile has fairly high predictive power, and is fairly stable over time).
Peter_Lambert-Cole:
I think skeptical people are too quick to say "Forer Effect" when they first do Myers-Briggs. They notice that their type only partially describes them and assume that something fishy is going on. But if you switch all the letters and read the description of the exact opposite type, there is almost nothing that could apply to you. That in itself means that there is some non-trivial classification going on. San Francisco may not be LA, but it sure isn't Moscow.
NancyLebovitz:
I don't take the specifics very seriously -- I don't try to analyze everyone in terms of MB -- nor the Enneagram, which I also find somewhat useful. Occasionally, I find someone who seems to have a very strong tendency towards some of the traits described in a system, but most of what I get out of these systems is a clue that people are very varied, that it's normal for people to be different from each other, and some ideas about possible differences.
wedrifid:
NP.

Not to omit the distinct (and surprising) possibility that YOU might be weirder than you think.

Don't ingest words from a poisoned discourse unless you have a concrete reason to think you're immune.

Politics is often poisoned deliberately. Other topics are sometimes poisoned accidentally, by concentrated confusion. Gibberish is toxic; if you bend your mind to make sense of it, your whole mind warps slightly. You see concentrated confusion every time you watch a science fiction show on television; their so-called science is actually made from mad libs. Examples are everywhere; do not assume that there is meaning beneath all confusion.

Here's the exact opposite advice. I wouldn't even bother posting it here except it's from one of the major rationalists of the 20th century:

"In studying a philosopher, the right attitude is neither reverence nor contempt, but first a kind of hypothetical sympathy, until it is possible to know what it feels like to believe in his theories, and only then a revival of the critical attitude, which should resemble, as far as possible, the state of mind of a person abandoning opinions which he has hitherto held.... Two things are to be remembered: that a man whose opinions and theories are worth studying may be presumed to have had some intelligence, but that no man is likely to have arrived at complete and final truth on any subject whatever. When an intelligent man expresses a view which seems to us obviously absurd, we should not attempt to prove that it is somehow true, but we should try to understand how it ever came to seem true. This exercise of historical and psychological imagination at once enlarges the scope of our thinking, and helps us to realize how foolish many of our own cherished prejudices will seem to an age which has a different temper of mind." -- Bertrand Russell, A History of Western Philosophy

simplicio:
I think Russell was right that this is a powerful technique, but he was also naive about the heuristics & biases addendum to classical rationalism. So he is recommending a technique that is very useful but also epistemically dangerous.
ata:
That is very well put.
[anonymous]:

The most important thing I learned from this site:

If you suspect something is factually true, don't be afraid to believe it. It can't hurt you.

That's simple. Not easy to implement, but easy to express.

Vladimir_M:
SarahC: This is true only assuming that all beliefs that you suspect might be factually true are respectable. Espousing disreputable beliefs -- and sometimes merely being suspected of harboring them -- can hurt you very badly regardless of how good evidence you have for them. Even if you manage to hide your dangerous thoughts perfectly, there is still the problem that duplicity is very unpleasant for most people, if anything because it requires constant caution and self-discipline to watch your mouth. Of course, this is irrelevant if there are absolutely no beliefs that a rational person might suspect to be true and that are at the same time disreputable to the point where expressing them might have bad repercussions. However, that's not what I observe in practice. Speaking as someone who happens to believe that some not very respectable views are factually true, or at least plausible, sometimes I can't help but envy people whose opinions are all respectable enough that they can relax and speak their mind openly in all situations. (I raised the same point on OB a while ago.)
[anonymous]:

Oh, I have the same thing. I do have some nearly disreputable views, and I have accidentally hurt people's feelings by airing them. (Pretty mild stuff: "Walmart's not so bad" and "Physical resurrection doesn't make sense.") Now I'm pretty much housebroken, although I worry like wedrifid that it shows in my facial expressions.

But. Would any of you really trade being well-informed for the convenience of not having to hold your tongue? I know I wouldn't.

Vladimir_M:
SarahC: I'm curious whether you'd extend that principle to arbitrarily extreme hypothetical situations. Imagine the most disreputable factual belief you can think of, and then suppose (for the sake of the argument) that there is in fact some strong evidence in favor of this or some equally disreputable view, which is however ignored or dismissed by all respectable people. Furthermore, suppose that if you find out about it and update your beliefs accordingly, this knowledge will not give you any practical benefit, but merely place you in a situation where your honest beliefs are closer to truth, yet extremely disreputable. Mind you, we're not talking about your views merely causing some irritation or provoking heated arguments. We're talking about a situation where in most social and all professional situations, you are unable to look at people's faces without thinking that they would consider you an abominable monster unfit for civilized society if they knew your true honest thoughts. You have to live with the fact that people around you (except perhaps for a few close friends and confidants) respect you and are willing to work and socialize with you only insofar as they are misled about what you really believe and what you truly are. Would you really prefer this outcome to staying blissfully ignorant?
Leonhart:
Well, yes. You mean you don't want to secretly have a powerful and dangerous dark side?
[anonymous]:
Probably not. A sensible person ought to be willing to suffer for a few very important things... but very few. So a very disreputable belief ought also, in some way, to be very important to be worth believing. In practice, when a contentious issue also seems not very important (or not very relevant to me) I don't bother investigating it much -- it's not worth becoming disreputable for.

Knowing whether disreputable beliefs are true is helpful in figuring out what intellectual institutions you can trust.

Vladimir_M:
This, however, means that your above comment is in need of some strong disclaimers. Unless of course it's directed at someone who lives in a society in which all highly disreputable beliefs happen to be false and outright implausible from an unbiased perspective. (But would you bet that this is the case for any realistic human society?)
[anonymous]:
I absolutely prefer that outcome. Aren't we all used to having to censor ourselves in all kinds of surroundings?
Eliezer Yudkowsky:
Well said. That's a 5-second response right there to quite a lot of people in the econoblogging community who think they're clever.

On a related note, I think too few people realize that it's OK to sometimes hold beliefs that are mistaken in a strongly disreputable direction. If all your errors fall on the reputable side of the line, you're missing out on accuracy. In a noisy world, sufficiently asymmetric suppression of falsehoods is indistinguishable from suppression of truths.

Will_Newsome:
Twitter-worthy!
Will_Newsome:
^necrobump
SilasBarta:
Is there a name for this theorem? It seems like it follows from invariance of information content (passed through a noisy channel) under permutation of symbols.
wedrifid:
I agree, and add that watching your mouth is not nearly enough. I, for example, am extremely good at watching my mouth when I am attempting to toe absurd party lines, but I am irredeemably poor at controlling all the minute details of body language that must go with it. The only reliable way for most people to tell lies with adequate sincerity is to lie to themselves first.
khafra:
Also, ignorance is bliss.
wedrifid:
Awesome. Do tell! Allow me to join you in controversy.
apophenia:
This is the Litany of Gendlin.
satt:
This is technically true, in the sense that belief won't hurt me in and of itself. But beliefs inform our actions, and once the two are connected, beliefs acquire causal power to hurt me.
Oligopsony:
Also, we have a bias against overturning beliefs. I think the folk epistemology implied in the distinction between words like "suspect," "think," "feel," "believe," and "know" is, on the whole, fairly useful. You can flatten them all into the word "believe" but you lose something. The dogma here is also to assign probabilities to your beliefs - the zoo of belief-verbs is just a cognitively cheap way of doing so.

Someone gets in a car-crash and barely misses being impaled by a metal pole, and people say it's a million-to-one miracle

I like to reply to such accounts with "Luckier not to have been in the crash in the first place."

xamdam:
Totally. Theists love this error.

Whenever you're uncertain about an issue where bias might play a role, ask yourself honestly what you would say if you knew that if you gave the wrong answer, rabid leopards would storm into the room and eat you.

Spurlock:
It's too bad this probably can't be used effectively in argument. If you ask a theist whether God exists and add the leopard clause, he'll probably say "absolutely" and then use the lack of resultant leopards as evidence. Still, for someone already interested in rationality looking only to correct himself, it feels like a strikingly powerful technique.
Unknowns:
I remember reading in the news that one of those crazy guys who went into a school with a gun and started shooting people asked a few of them, "Does God exist," threatening to shoot them if they said yes. Some of them said yes anyway (and he shot them), so it looks like this method isn't going to stop people from believing in God.
khafra:
The idea of a physical threat is the same; but the social context is radically different -- the school shooters were asking their victims to give up their tribal allegiance under circumstances similar to historical threats. Ideally, this question would be posed under conditions which divorced epistemic state from tribal identity as much as possible.
RobinZ:
That could be an urban legend - it's the sort of story that martyr-happy adherents would be likely to fabricate and spread, and the story I find searching online (Columbine) only has one person asked that question, and after being shot.
Unknowns:
No, I read it at the time of the event, in the regular news, although I don't remember the details well enough now to find it again.
RobinZ:
That could be Columbine. In an earlier Salon article talking about the investigators preparing their report: ...and the article I linked previously followed up with this: and this:
simplicio:
Upping the emotional ante sometimes works. "What if your daughter's life was at stake?" Kind of a cheap tactic though.
khafra:
This is going to sound silly, but I've had some success with wild swings in emotional ante. "What if you and your family were going to be abducted and tortured, and then have your skin sewn into suits by a psycho if you got it wrong... Okay, well, what if I had a really tasty-looking donut that I were going to give you if you got it right?" It's a bit disingenuous, since I get the emotional impact of seriously proposing something untoward and then get to say "just kidding"; but if it's worth it, take a walk on the dark side.

No method of explanation should be considered good unless it's been tested on a number of ordinary people.

My impression is that the best way to explain Goodhart's Law is to bring up employee incentive plans which don't have the effect the employer was hoping for.

Candidate 1: "If a trillion trillion trillion people each flip a hundred coins, someone's going to get all heads." ("If a trillion people each flip a billion coins" might be a stronger meme, though extremely inaccurate.)
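The coin arithmetic checks out exactly; a quick sketch in Python:

```python
from fractions import Fraction

p_all_heads = Fraction(1, 2) ** 100   # one person's chance: ~7.9e-31
people = 10 ** 36                     # "a trillion trillion trillion"

print(float(people * p_all_heads))    # ~790,000 expected all-heads runs
```

(And the trillion-people-billion-coins variant really is wildly inaccurate: (1/2)^1,000,000,000 is so small that no population could rescue it.)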

Candidate 2: "Knowing the right answer is better than being the first to argue for it."

Candidate 3: "If it moves, you can test it."

sketerpot:
Those are catchy! Of course none of those is an explanation that most people can use -- the inferential distance is pretty big -- but they'd make great sound-bite segues to a slightly longer explanation. If you just hit someone with a zinger like that, it'll feel to them that you're just scoring points, and they might get annoyed; but if you use it as the start of a discussion, that's likely to be perceived as more respectful.
wedrifid:
I like 1 and 3 but I'm dubious about 2. I am not convinced that it is true in the case of most humans. I'd like it to be but most people live sufficiently in a social reality that actually being right is not all that important.
NihilCredo:
"Learning the right answer is better than having come up with the wrong one"?

I'd like to come up with something meme-sized about curiosity stoppers. How about:

When you pretend to know an answer, you're wrong twice.

It doesn't get the subtleties across, but it might be enough to gain a foothold in the average person's mind.

You might be underestimating just how much curiosity-stoppers feel like actually knowing an answer. I still catch myself reading Wikipedia articles just up to the point where they confirm what I thought. Your meme would have to imply just how difficult it is to notice this in yourself.

b1shop:
I imagine the best a meme can do in this case is convince the host it's wrong to succumb to the bias. That'll lay the ground for change in the future.

As I think has been mentioned in this thread by others, but bears repeating: if you want to convince people that they are affected by cognitive biases, sometimes you have to really hit them over the head with them.

I've found the examples in Hindsight Devalues Science and the basketball video (I don't imagine there are many people here who haven't seen it, but I'll not post a spoiler just in case) are particularly effective at this. I guess calibration tests would also be good on this metric. Once you've pointed out a "cortical illusion" like this...

Paul Crowley:
People will often admit that they'll walk across the road to save $10 on the cost of a $20 memory card but not a $2000 plasma TV.
Alicorn:
People who are spending $2000 probably value their time more highly than people who are spending $10-20, ceteris paribus. It might be less expensive for the second buyer to cross the street. (Even if it's the same person on a different day or in a different frame of mind.)
datadataeverywhere:
That's exactly what doesn't make sense; asking the same people whether they'd walk across the street to save money on X should depend on how much they value their time, not on how much they value X. It isn't rational for there to be states of mind where buying more expensive things makes people value their time more when the rest of the environment is identical.

Here's something that comes up in many, many discussions of climate change and anything else where a lot of arguments come from models or simulations: sometimes you have to do the math to make a valid (counter-)argument.

Example:

A: ...And so you see, as CO2 increases, the mean global temperature will also increase.

B: That's bullshit, and here's why: as CO2 increases, there will be more photosynthesis -- and the increased plant growth will consume all that extra CO2.

Another example (the one that motivated this comment):

A: And so, as long as the bus is carry...

PhilGoetz:
I believe that in the first example, "A" is supposed to be right. In the second example, is "A" or "B" supposed to be right? B is doing the math, but assumes that fuel required is proportional to mass, which is wrong, due at least to engine size and air resistance. (Consider the (mass x miles)/gallon of a 2005 RST1000 Futura motorcycle (565 x 42 = 23730), a Smart car (1808 x 40 = 72320), a 2010 Honda Civic DX-VP (2709 x 36 = 97524), and a 2010 Toyota Camry SE (3329 x 33 = 109857). All weights are in pounds; all MPG are EPA highway estimates.) By default, I expect your examples to take the same form (e.g., the counterargument is right in both cases, or wrong in both cases). Deviations from that pattern should be pointed out. Cases where doing math does not qualify as "doing the math" due to incompleteness should be pointed out. (BTW, this reminds me that Brad Templeton, founder of rec.humor.funny and the Oracle, gave a talk at the 2009 Singularity Summit in which he showed data claiming that mass transit typically has the same fuel efficiency as a car with 1.5-3 people in it. Because a mass transit trip (at least the ones I take) usually requires you to travel a longer distance than you would by car, mass transit loses to one person in a fuel-efficient car for fuel efficiency. And the cost of mass transit is much higher per person-mile; and the time taken is about double (in the DC metro area). These facts combined suggest that mass transit is neutral or bad for the environment, bad for the passenger, and bad for the economy.)
TedW:
I'd meant A to be right in both cases. And of course -- against my own remonstration -- I did none of the math myself. I was unfamiliar with the Templeton data. I looked it up, and it's interesting. I'd note that while Templeton agrees that transit (by the system, not by the fully utilized vehicle) is less efficient than fuel-efficient personal transportation, he still thinks people should make use of existing transit systems. I ride a bike.
[anonymous]:

Think about your judgments of confidence in terms of frequencies instead of probabilities - our frequency intuitions tend to be much closer to reality. If you estimate that you're 90% sure of something, ask "if I faced ten similar problems, would I really get nine of them right?"
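One cheap way to act on this is to keep a tally and compare your claimed confidence against your observed hit rate. A minimal sketch in Python, with hypothetical data:

```python
# Calibration log: (stated confidence, was the judgment right?)
log = [
    (0.9, True), (0.9, True), (0.9, False), (0.9, True), (0.9, True),
    (0.9, True), (0.9, False), (0.9, True), (0.9, True), (0.9, False),
]

hits = sum(right for _, right in log)
print(f"claimed 90%, observed {hits}/{len(log)}")   # 7/10: overconfident
```

If your "90% sure" judgments come out right only seven times in ten, the frequency framing makes the overconfidence hard to ignore.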

Counter-argument:

"Less Wrong tends toward long articles with a lot of background material. That's great, but the vast majority of people will never read them. What would be useful for raising the sanity waterline in the general population is a collection of simple-but-useful rationality techniques that you might be able to teach to a reasonably smart person in five minutes or less per technique."

Possible alternative angle of attack - get people to read longer articles. Promote things that increase attention span, for example. Admittedly, you then...

sark:
Increased attention span is certainly a good thing to have. But one must be wary of insisting on a difficult path to enlightenment. Morality can sometimes be more about how moral you are and how the rest of the world is not than about actually doing good. If it is about signaling how good you are, then a costly signal would be preferable, since it is hard for infidels to fake. But if you want good things to happen, then you should strive toward making good acts as easy to accomplish as possible. Short attention span seems to be a general problem, but it is not a general problem of which irrationality is a special case. The case here is distinct from the case of religion and raising the sanity waterline. We might want to solve the problem of having a short attention span, but let us not pretend that this will automatically solve, or even simply be the deciding factor in solving, the problem of irrationality.
mstevens:
There's room for debate here in my book, but my argument is:

* Rational arguments are often complicated and require attention to detail.
* Many people have problems with complicated arguments that require attention to detail.

We can try to deal with this in two ways:

* Making the arguments simpler.
* Dealing with the problem of people not following detailed arguments (thus my earlier comment).

I think both look like promising lines of attack. It is, of course, always desirable to keep arguments as simple as possible.
sark:
I think what we should do is try to get a foot in the door. We want to intrigue people enough that they will seek further knowledge of rationality. People have the capacity for attention if they want something badly enough.
mstevens:
These people will not yet be very rational (by definition of target audience). Therefore they are likely to judge arguments on emotional grounds. So I suggest that we need to find short arguments that promote rationality, but make an essentially emotional case for it. Ideally one would find something that overlaps - it persuades at both the emotional and rational levels.

I think "unknown unknowns" is a good one for this sort of thing. My attempt follows:

We know a lot of things, and generally we know that we know them. These are "known knowns." I know that 1+1 = 2, I know that the year is 2010, and so on.

We also don't know a lot of things, but generally we know that we don't know them-- for example, I don't know the hundredth digit of pi, I don't know how to speak Chinese, and I don't know what stocks are going to do well next year. All of those things are "known unknowns," or unanswered questi...

RHollerith:
Japan had a research program into nuclear weapons, but they ran into what they considered an insurmountable hurdle, which they believed would stop the US, too. Something to do with the lack of industrial capacity (electricity??) needed to produce enough fissionable material if memory serves.
gwern:
If memory serves, both the Japanese and German nuclear weapons programs made a subtle mistake with the cross-section of uranium atoms (or something like that), and wound up calculating that critical mass would be something like a ton of enriched uranium, and so not a useful weapon within WWII's timeframe. (I read about this while also reading Copenhagen, but I can't remember what book. IIRC, Heisenberg claimed he had made this mistake deliberately and this was evidence that he wasn't cooperating whole-heartedly with the Nazis, but the countercharge is that he seemed as astounded as the rest of the German physicists in custody when told of Hiroshima & Nagasaki.)
Douglas_Knight:
Is a duplicated error evidence for or against sabotage? Heisenberg did not claim to have sabotaged it. Wikipedia claims that the story comes from selective quotation of the last letter here. But, when the bomb was announced, the imprisoned Heisenberg's reaction of frantic work is suspicious to me: it suggests that he knew where the mistake was and wanted to go back and do the work he had blocked (but I don't know the details; maybe he was working on something independent of the mistake).
gwern:
Well, for 2 physicists of equal competency, differing results would suggest sabotage, since for both to give the wrong answer suggests that either they are not good enough to get the right answer at all, or they both got the answer and simultaneously decided to sabotage. Heisenberg was great, though, surely greater than anyone on the Japanese project; so I tend to regard the net as a wash, and focus more on Heisenberg's reaction - which as I said suggests he genuinely made a mistake and was not engaged in passive resistance, and his surprise & flurry of activity was a give-away. No numbers, unfortunately. But I did notice: Of course, for a few kilograms of enriched uranium or plutonium, you don't really need huge reactors running for years and years - the hard part is enrichment. Yesterday I was reading a history of modern Korea, and North Korea obtained enough plutonium for a bomb or 3 by running a 20 or 50 megawatt reactor for 2 or 3 years, IIRC. But perhaps by Heisenberg's 1940s standards such a reactor is beyond huge.
NancyLebovitz:
(Factual correction) The US didn't have nuclear weapons when Japan started the war. (Mulling the topic) Not only that, but I think "the other side won't come up with a superweapon" is generally the way to bet, though perhaps less so than it used to be. I thought radar was invented for WWII, but it's not that simple. Maybe I've missed something, but I don't think there's been anything but incremental improvement in war tech since WWII -- nothing really surprising.
1thomblake14y
It's close enough - as that page notes, what we know as RADAR was developed during the war. That's also when Norbert Wiener developed the first radar-integrated guns. It really depends what you call "incremental", and what sorts of increments you're looking at. We have robots with guns!
1NancyLebovitz14y
If the standard is nukes and radar, then only things which leave the other side saying "how is that even possible?" or "that came out of nowhere" count as surprising. Robot drones are not surprising. I'm pretty sure invisibility tech would not be surprising. Anti-gravity would be surprising.
3Drahflow14y
The decreasing frequency of surprising technological advancements is caused by the general public being informed about scientific advances faster and more frequently. If the rate of news consumed grows faster than the rate of innovations produced, the perceived magnitude of innovation per news item will go down.
1LucasSloan14y
How many people, even as smart as us, correctly predicted, in say 1935, the sorts of wonder weapons that the intense research pressures of a world war would create? If we're talking about surprising sorts of weapons, I expect not to have been exposed to them, or if I have, to have rejected them out of hand.
1katydee14y
It is difficult for me to conceive of military technology that is (a) potentially surprising, (b) powerful enough to make a big difference, and (c) near-future. "Rods from God" might count, if they exist, but they're not surprising. The best example I can think of is strong memetic warfare, but I'm not confident that will be developed in the near future (or indeed ever).

If I only have a few minutes, I tell people to study cognitive bias, in the hope that surely any intelligent person can see that understanding what science has to say about the systematic, predictable failings of our own brains can hardly fail to be useful. You need long enough to impart the caution that you have to apply these things to yourself, not just to other people...

5RobinZ14y
I agree, and I think Yudkowsky's suggestions in Knowing About Biases Can Hurt People are appropriate here:
2sark14y
Yes, but before people will go and study cognitive bias, they have to be convinced that it exists in the first place! Most people are not already familiar with the idea that our minds systematically fail us. I think the best way to introduce the idea would be to present a striking case of bias (pervasiveness + impact), then let them know that there are many, many others.
2Eliezer Yudkowsky14y
I use the conjunction fallacy for my first illustration.
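For readers who haven't met it: the conjunction fallacy is judging a conjunction "A and B" as more probable than "A" alone, which the probability axioms forbid. A minimal sketch of the rule in Python, with made-up probabilities in the style of the classic Linda problem (not necessarily the illustration Yudkowsky uses):

    # Conjunction rule: P(A and B) = P(A) * P(B | A) <= P(A), since P(B | A) <= 1.
    # The numbers below are made up purely for illustration.
    p_teller = 0.25                 # P(Linda is a bank teller)
    p_feminist_given_teller = 0.5   # P(she is a feminist, given she is a teller)

    p_teller_and_feminist = p_teller * p_feminist_given_teller
    assert p_teller_and_feminist <= p_teller  # the conjunction can never be more probable
    print(p_teller, p_teller_and_feminist)    # 0.25 0.125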
2TedW14y
Seems to me that all that would do is reinforce someone's opinion that probability theory is irrelevant to the real world. I personally would start with confirmation bias, partly because there are lots of clear examples in pop culture. Like: last night I was watching a rerun of "Glee." Will Schuester, a teacher and the glee-club advisor, is trying to quash a student's crush. He sings her (Rachel) a medley of songs in which the singer is trying to deflect a much younger woman's advances. (Both songs -- "Don't Stand So Close to Me" and "Young Girl" -- are actually about the singer unsuccessfully trying to resist the temptation of the younger woman, but in the episode the lyrics are changed and edited so that they ostensibly work.) So he sings, and the whole time Rachel is clearly hearing the opposite of the intended message. After the song, Will asks Rachel what his message was, and she says, almost giddily, that his message was clear: "I'm very young and it's hard for you to stand close to me."

*Candidate 1 requires intuitive understanding of probability, fat chance.

*Candidate 2 would require rewiring how humans perceive status.

*Candidate 3 just does not work. Talk with people about the image they have of what scientists do all day. It is bad, especially when you go to New Agers.

Maybe you assume that people have a consistent world view, or at least the desire to have one, but no. Please try the proposals on real people, and report back. I expect you to run into the problem that objective truth is widely not accepted, and ...

8sketerpot14y
I know it's possible, since I've rewired myself in this way, and it wasn't particularly difficult. Am I really that weird? You don't have to use the word "science". As Darmani put it, "If it moves, you can test it." Follow up with an explanation of why a particular claim is testable, and how to test it. For example, if someone claims that he can tell the difference between an empty water jug and a full water jug with a dowsing rod, then it's easy enough to test it. I've used this exact approach on quite a few people, and it seems to do a pretty good job of banishing their claims that whatever we're arguing about is untestable. I wouldn't bet on them generalizing this lesson, though. The sticking point here is "good". Most people settle for really crappy counterarguments, including straw-man counterarguments concocted by people who agree with them.
1sark14y
It's not really about rewiring yourself. Your status depends on how others perceive you. The easiest way to have truth-seeking, rather than winning arguments, confer higher status is to move to a community with such norms, such as LessWrong. But we are trying to convince the general public of rationality here. So until most people have peers who already value truth over winning arguments, Candidate 2 will face significant challenges.
2DuncanS14y
*Candidate 1 requires intuitive understanding of probability, fat chance. I agree that an intuitive understanding of probability isn't likely to happen. But what you can do is train yourself to recognise at least some of the situations where your intuitive system is going to mess it up. Hopefully next time you see something and think "What a fantastic coincidence!", your next thought will be "Nice, but remember all the other fantastic coincidences that might have happened and didn't." instead of "My life is so improbable it must have been orchestrated by some unseen force."

My idea would be to give a truncated version of a point made in Truly Part of You.

The different sound-bite ways to say it are:

  • True knowledge regenerates.
  • Only believe something once you recognize how you would learn it some other way.
  • Your beliefs should be unaffected by your choice of labels.

Low inferential distance explanation: When learning about something, the most important thing is to notice what you've been told. Not understand, but notice: what kinds of things would you expect to see if you believed these claims, versus if you did not? Are you...

"Wisdom is like a tree. Cut away the pages of predictions, the branches, even the roots, but from a single seed the whole structure can be rebuilt.

Foolishness is like a heap of stones. Stack them up however you please, paint bright colors to catch the eye, call them ancient or sacred or mysterious, and yet a child could scatter them."

3SilasBarta14y
Very well said! Is that your own phrasing?
7Strange714y
It is. If I were to make a top-level post on how to rephrase truthful things to sound like mysticism or poetry, how many times do you think it would be downvoted?
7Eliezer Yudkowsky14y
People seemed to like Twelve Virtues of Rationality and Harry Potter and the Methods of Rationality.
4Strange714y
Yes, but those are polished outputs, and (no offense) have your halo-effect to back them up. I'm talking about sketching in a more generalized algorithm which accepts highly technical explanations as input, and produces output which a member of the general public would intuitively recognize as 'wise,' while retaining the input's truth-value.
8Eliezer Yudkowsky14y
There are algorithms for that? My brain just does it automatically on request. (Also, I presented HPMOR to a new audience with my name stripped off just to check if people still liked what I wrote without the halo effect.)
0Strange714y
Of course there are algorithms. The question is whether they have been adequately documented yet.
5SilasBarta14y
It's not the poetry that's the problem, it's the mysticism. Your quote sounds like the former, not the latter. Or maybe "ancient wisdom" is the right term to describe what your version sounds like -- but the point is, it tells people why to think some way, and if they endorse it, they endorse a good truth-seeking procedure for the right reason, which is the important part.
2SilasBarta14y
By the way, I had googled "wisdom is like a tree" before asking you, and it didn't seem to turn up any existing quotations. It surprised me that no one had famously compared wisdom to a tree -- not in a positive sense, anyway. It's a good analogy, and -- if you're into that kind of thing -- you can extend it even further: trees (can) yield fruit, the seed stays dormant if it's not in an environment that lets it grow, all the seeds take a similar path when expanding ...
1Strange713y
That's only a negative sense if you're working with the assumption that the biblical God is a good guy, an assumption which (given the sheer volume of genocide He committed personally, through His direct subordinates, or demanded of His human followers) simply does not hold up to scrutiny for any widely-accepted modern standard of 'good.' I mean, look at Genesis 3:22 if nothing else.
2simplicio14y
I say do it. Literary style is a huge obstacle to the dissemination of skepticism.
2wedrifid14y
-13. (Well, actually I estimate 18 upvotes and 5 downvotes leaving effectively -13 downvotes).
1thomblake14y
You claim to be good at explaining things. If you have time, you should take a crack at some more short explanations of things.
1SilasBarta14y
I agree. I'm taking suggestions for rationalist concepts (including information-theoretic ones) that are regarded as notoriously difficult to explain, or as having a high inferential distance. I'm working on some articles related to that, but I'd be more interested in what topics others think I should try explaining better than standard accounts.

Show someone the gorilla video, or another of the inattentional blindness tests.

Telling someone their brain is a collection of hacks and kludges is one thing; showing them, having them experience it, is on another level altogether.

Relatedly, my favorite quote from Egan's Permutation City: "You have to let me show you exactly what you are."

Another classic example of the brain's hackishness, which does not seem to have been mentioned here before, is the sentence, "More people have been to Russia than I have." If you say this sentence to someone (try it!), they'll at first claim that it was a perfectly reasonable, grammatical sentence. But when you ask them what it means, they'll start to say something, then stop, look confused, and laugh.

(Yes, there is a parsing of "have" as "possess", but this is (a) precluded by inflection, and (b) not ever what someone initially comes up with).

3bentarm14y
"More people have been to Russia than I have." Does this test not work when written down? Or am I unusual? The sentence jarred immediately on the first reading, and I went back and read it about three times to try and figure out if it could have any meaning at all before carrying on to the rest of the paragraph.
1novalis14y
I have never before attempted to transmit it in writing, and I'm not a linguist. But apparently, it works at least somewhat for at least some people (see Oscar_Cunningham below). Still, I'm sorry to have spoiled for you the effect of hearing it.
1wedrifid14y
Same experience here. I read it through a few times to see whether it was ungrammatical or just weird. I got a feeling of mental reward when my confusion dissolved and the actual possible meaning clicked. It would take a particular kind of brain for someone to phrase a sentence that way.
1Eliezer Yudkowsky14y
Ooh, same embedded system crasher as "I couldn't fail to disagree with you less."
3roryokane14y
I don’t see how that is a system crasher sentence. I think I can successfully parse it as “I must succeed in agreeing with you more”. Yes, it takes a while to figure out the meaning because turning each negative into a positive is a separate step, but there is a meaning in the end, unlike the sentence about Russia.
1komponisto14y
You picked a particularly bad context in which to confuse inflection with intonation (one of my greatest pet peeves).
2wedrifid14y
Wow. That difference is new to me. Thanks, I'll remember that!
0Vladimir_M14y
If we're going to be really precise, wouldn't the difference here be a matter of grammatical stress rather than intonation?
0komponisto14y
"Grammatical stress" isn't a technical term, as far as I know. In any event, the phenomenon we're discussing here is the grammatical function of a word being communicated by the intonation pattern (as well as, probably, the speed pattern) of the sentence in which the word occurs.
0Vladimir_M14y
komponisto: I am not a linguist, but I've seen the term "grammatical stress" used to denote situations where the stress of a word is determined by its syntactic context, and where a difference in stress may imply a different syntactic structure of the sentence. This is in contrast to lexical stress, which is a context-independent property of each word, and intonation, whose variation doesn't affect the syntactic structure, but merely changes things at the level of pragmatics. Now that I've googled around a bit, I see that these terms aren't really standardized, and authors who use them typically make sure to include their favored definitions to avoid confusion. If you use "intonation" also for what I call "grammatical stress" above, then fair enough. (And for all I know, such usage might indeed be more common.) Still, I think the contrast I have in mind is worth pointing out. In the above example, the difference in stress implies a different syntactic structure -- "have" can either be a complete verb phrase, or just an auxiliary verb referring to an antecedent (i.e. a verb phrase ellipsis). This is different from situations where changing intonation affects only pragmatics.
0komponisto14y
I'm not sure it's a good idea to restrict the use of "intonation" to describing pitch patterns that don't convey syntactic information. I suppose if one did that, one would have to simply say "pitch" for what we are talking about here, unless there's another term available.
0Vladimir_M14y
Come to think of it, you're right. It makes sense to define "intonation" in purely phonetic terms (i.e. as pitch variation), and in that sense, it's certainly present here. It is possible that I got a mistaken idea about the common technical meaning of this term in my amateurish forays into these subjects.
0novalis14y
I meant inflection: "Alteration in pitch or tone of the voice." But to avoid confusion in the future, I will try to use the linguist's definitions of these words, since they're more precise. Also, the Wikipedia article suggests that tone rather than intonation might actually be the correct word, since there is a semantic difference.
3komponisto14y
Thank you. No; "tone" refers to a phenomenon in certain languages (most famously Chinese) wherein otherwise identical words are distinguished from each other -- in isolation, nothing to do with their placement in a sentence -- by the contour of one's voice when pronouncing them. The kind of contextual variation of pitch that you are talking about -- intonation -- is pretty much universal to human speech in all languages.
-3novalis14y
Wikipedia says: In this case, "have" is the auxiliary verb, rather than the ordinary verb "to possess", and you can tell that by the intonation. That's otherwise identical words distinguished from each other.
2komponisto14y
Sorry if this sounds a bit harsh, but I'm puzzled by this reply. It's as if you stopped reading my comment immediately after the phrase "otherwise identical words distinguished from each other", and ignored the next part, which happened to be the most important part. So let me try again, using bold for emphasis: Did you actually read the Wikipedia article that you cited? Here's an example it gives from Chinese:

1. mā "mother"
2. má "hemp"
3. mǎ "horse"
4. mà "scold"
5. ma (an interrogative particle)

This should have made it clear that we're talking about a different phenomenon from anything that occurs in standard varieties of English. In Chinese, the intonation pattern of an individual word is actually lexical -- it's a fixed property of the word that applies even when the word is pronounced in isolation, entirely like the pattern of consonant and vowel sounds in the word. The five Chinese words above are not homophones, unlike "have" ("possess") and "have" (auxiliary) in English. The two senses of English "have" can't be distinguished when the word is pronounced by itself.
1Oscar_Cunningham14y
Wow, it took me a long while to realise what was wrong with that sentence.

Anti-candidate: "Just because something feels good doesn't make it true."

The Litany of Tarski and Litany of Gendlin are better ways to approach this concept, because they're both inexorably first-person statements.

Candidate: Don't pursue an idea unless it came to your attention by a method that actually finds good ideas. (Paraphrased from here.)

3Violet14y
I actually keep getting good ideas in some areas while sleeping. E.g., when facing a difficult programming problem, sleeping on it overnight quite often yields the solution.
2VNKKET14y
You changed my mind. I'm worried my candidate will hurt more than it helps because people will conflate "bad idea generators" with "disreputable idea generators" -- they might think, "that idea came to me in my sleep, so I guess that means I'm supposed to ignore it." A partially-fixed candidate: If an idea was generated by a clearly bad method, the idea is probably bad.
2djcb14y
Well, then that is, in fact, a method that finds good ideas! I sometimes use the de Bono-esque 'lateral thinking' tricks, like association with a random dictionary word, to come up with a creative solution for a problem. It does not work for all classes of problems, but it can be useful. There are some methods that consistently do not work well for me when trying to find good ideas / solutions; for example, sitting at my desk and looking at the screen.

"Let's see how we can check this" or "let's see how we can test this" seems to work in the short run to get people to check or test things. I don't know if it changes habits.

0[anonymous]14y
Agreed. It would be great for people to get into the habit of continually asking "what would this claim imply that I can check?", since not enough people are accustomed to thinking that way.

RE: Candidate 1

For those interested, here's the math:

  • A one-in-N-chance event will not occur in a single trial with probability 1-1/N.
  • It will not occur in 2 trials with probability (1-1/N)^2.
  • It will occur at least once in 2 trials with probability 1-(1-1/N)^2.
  • It will occur at least once in k trials with probability 1-(1-1/N)^k.
  • For an even chance of it occurring at least once, how many trials do we need?
  • We solve for k in this equation:
  • 1-(1-1/N)^k = 0.5
  • (1-1/N)^k = 0.5
  • Taking logs of both sides:
  • k = ln 0.5 / ln(1-1/N)
  • Dividing by N:
  • k/N = ln 0.5 / ln((1-1/N)^N)
  • For large N, (1-1/N)^N ≈ 1/e, so k/N ≈ ln 0.5 / ln(1/e) = ln 2 ≈ 0.693.

So a one-in-N event has roughly an even chance of occurring at least once after about 0.7N trials.
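As a sanity check on that rule of thumb, one can compute k/N directly for a few values of N. A minimal sketch in Python (the particular values of N are arbitrary):

    import math

    # For a one-in-N event, find the number of trials k at which the
    # probability of at least one occurrence reaches 50%.
    for N in (100, 10_000, 1_000_000):
        k = math.log(0.5) / math.log(1 - 1 / N)
        print(f"N = {N:>9}: k = {k:,.0f} trials (k/N = {k / N:.3f})")
    # k/N converges to ln 2 = 0.693..., i.e. the "about 0.7N trials" rule of thumb.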
0simplicio14y
That's a useful number to know, thanks!
0sark14y
You're welcome!

Count to ten.

Learning one was wrong (and updating) is a good thing.

One should be more interested in obtaining information than winning debates.

To clarify a little on candidate 1. People are often impressed by a coincidental or unlikely happening, and think that it's some kind of miracle. But in fact there are a lot of individually very unlikely things happening all the time. Out of all the cars in the world, what's the chance that you happen to see three of them in a particular order going down a particular street? Not that high, but obviously cars have to pass you in some order or other.

So all unlikely events can be categorised into unnoticeable ones (any three cars at random), and noticeable ones...

"one in a million chances happen a thousand times a day in China" is a bumper sticker phrase for that one I've found useful.

On my own, I've tried out the ol' medical test base rate fallacy explanation on a few people. My dad got it right away; so did one friend; another didn't seem to fully grok it within ~2 minutes of explanation. I haven't done any follow-ups to see if they've been able to retain and use the concept.

(I definitely should have thought of this earlier; interestingly enough it was this comment that was the trigger.)
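For anyone who wants the standard worked version of that medical-test example, here is a minimal sketch in Python; the prevalence and error rates are illustrative assumptions, not figures from the comment above:

    # Base-rate example (illustrative numbers): a disease affects 1 in 1,000
    # people; the test catches 99% of cases but also gives a 5% false-positive
    # rate. Given a positive result, how likely are you to actually be sick?
    prevalence = 0.001
    sensitivity = 0.99       # P(positive | sick)
    false_positive = 0.05    # P(positive | healthy)

    p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
    p_sick_given_positive = sensitivity * prevalence / p_positive
    print(f"P(sick | positive) = {p_sick_given_positive:.1%}")  # about 1.9%

Most people guess something near 99%, which is the whole point of the exercise.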

Use probabilities! (Or likelihood ratios.) Especially when arguing. Yes, do so with care, i.e. without deceiving yourself into thinking you're better calibrated than you are -- but hiding the fact that you're not perfectly calibrated doesn't make your calibration any better. Your brain is still making the same mistakes whether you choose to make them verbally explicit or not. So instead of reacting with indignation when someone disagrees, just a...

3Eliezer Yudkowsky14y
This makes perfect sense to me. I feel far more comfortable converting my sense of credibility to an intensity scale of 1 to 100 than converting those intensities to probabilities.
1Paul Crowley14y
Have you considered using decibans for this purpose?
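For anyone unfamiliar with the unit: a deciban is a tenth of a ban, i.e. ten times the base-10 logarithm of the odds. A minimal conversion sketch in Python (the probabilities chosen are arbitrary):

    import math

    def decibans(p: float) -> float:
        """Convert a probability into decibans: 10 * log10 of the odds p/(1-p)."""
        return 10 * math.log10(p / (1 - p))

    for p in (0.5, 0.75, 0.9, 0.99):
        print(f"p = {p:.2f} -> {decibans(p):+.1f} dB")
    # p = 0.50 ->  +0.0 dB (even odds)
    # p = 0.75 ->  +4.8 dB
    # p = 0.90 ->  +9.5 dB
    # p = 0.99 -> +20.0 dB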

Terry Pratchett has a good metaphor for a good way of thinking in his Tiffany Aching books: second, third, etc. thoughts. Basically the idea that you shouldn't just trust whatever your thoughts say; you have your second thoughts monitoring them, and then you have your third thoughts monitoring that. I've always found it extremely helpful to pay attention to what I'm thinking; many times I've noticed a thought slipping past that is very obviously wrong and picked myself up on it. A few times I've even agreed with the original thought upon further analysis, b...

I think movements grow at their margins. It is there that we can make the greatest impact. So perhaps we should focus on recent converts to rationality. Figure out how they converted, what factors were involved, how the transition could have been easier, taking into account their personality etc.

This is what I have been trying to do with the people I introduce rationality to and who are somewhat receptive. It was not only a victory when they began to accept rationality; it was also an opportunity to observe how best to foster more such conversions.

It is ...

6Bongo14y
Defining rationalists as LW users, I think more came from these...

  • People who followed the sequences while Eliezer was still posting them
  • People who follow the Methods of Rationality fanfic

...than from just happening upon the site. I think people are more drawn in by an ongoing serial than by an archive of pre-existing material. It's easy to get someone to follow a cool blog or fanfic. It's hard to get someone to "read the sequences". Maybe Eliezer should repost his sequences over the next few years, in a foreign part of the blogosphere, under a pen-name? :)
1JGWeissman14y
When I first found OB, Eliezer was just finishing the sequences and transitioning to LW. I would start reading an article, and follow all the links back to articles I hadn't read yet. I was happy to spend days reading a later article with lots of prereqs. For me, having a depth of existing material that has been built on is a feature.
0DSimon14y
Yep, just like TV Tropes or Wikipedia; all it takes is an interesting initial hook, and then the tab-queueing begins.
5Halceon14y
If we use LW as a metric of conversion, then you can consider me a new convert, lured here by the occasional link from the Octagon. This is, of course, a pretty weak metric. I've been interested in rational thinking since the 9th grade, when I went to a debate club and realised that people went there to win arguments, not get to the truth. While I've done my best to keep my actions and words rational in cases that seem detached from my personal life, I think I mostly fail at self-examination. My personal observations confirm that the geek/nerd social group is the most prone to rationality, but there is a significant buffer layer around the group that can be influenced and converted. P.S.: It feels good to finally register here. And... Am I the only one who feels a bit odd when using the word "convert" in this context?
2thomblake14y
Welcome to Less Wrong!

I've heard Candidate 1 expressed as "A one-in-a-million shot happens a thousand times a day in China."

Candidate 2 could be "I like to be less wrong."

Candidate 3, maybe: "If it affects reality, it is real"?

Candidate 2 (that admitting one is wrong is a way to win an argument) is one of my oldest bits of helpful knowledge.

If one admits that one is wrong, one instantly ceases to be wrong (or at least ceases to be wrong in the way that one was wrong. It could still be the case that the other person in an argument is also wrong, but for the purposes of this point, we are assuming that they are "correct"), because one is then in possession of more accurate (i.e. "right") information/knowledge.

http://rejectiontherapy.com/ - the 30 day rejection challenge seems to fit here. Try, for 30 consecutive days, to provoke genuine rejections or denials of reasonable requests as part of your regular activities, at the rate of one per day.

Similarly, with millions of people dying of cancer each year, there are going to be lots of people making highly unlikely miracle recoveries. If they didn't, that would be surprising.

That's like saying it's surprising that nobody lives to the age of 150. Miracle cancer cures are statistical outliers, and it would be interesting to know the mechanism that allows them to happen.

This is an obvious contradiction: they're claiming a measurable effect on the world and then pretending that it can't possibly be measured.

It's no contradiction if you believe in a clever god that doesn't want the effect to be scientifically measured.

3PhilGoetz14y
But then the believer can't claim God can do things that could be scientifically measured - for instance, curing people who pray more often than people who don't, at least while a scientist is watching. Believers who want to pray for their health should use timeless decision theory to figure out what conditions to meet so that God is allowed to cure them without making that observable to later scientists. Cult startup, anyone?
2anon89514y
A clever god applying its cleverness to the job of making itself invisible is going to succeed.

I'm surprised there aren't any comments about reminding people they can't have it both ways. I haven't found a great way to do it quickly, but I have sometimes talked people down from forming a negative opinion (of a person, group, or event) by asking them if they would have gotten the same perception from a counterfactual (and in some sense opposite) event occurring instead.

1NancyLebovitz14y
I need an example of that one.
5beriukay14y
Ok, one fairly frustrating occurrence in my life is when my girlfriend gets freaked out about failing a math class. The problem being that she gets about as freaked when she does well on a test as when she does poorly on a quiz. Pointing out that she seems to just want to panic regardless of the event seems to calm her more than most of my other approaches. But the example I was actually thinking about when I wrote that involved a coworker talking badly of someone else in the workplace. The specifics are lost to me, but at the time, I noticed that the complaining guy would have had material to gripe about regardless of what the other person did. I mentioned this, and he conceded the fact and changed the subject.
2NancyLebovitz14y
Thanks for the examples. Interesting. I think that exact phrasing wouldn't work well with me because when I have bad emotional habits, it generally doesn't seem as though I want them. I'd do better with a more neutral phrasing like "it seems as though you panic no matter what happens". All I can do is guess about the difference-- maybe your girlfriend experiences her internal state as wanting the emotions she's getting?
1beriukay14y
You know, I didn't really notice that distinction before. I shall have to pay attention to that. I'll let you know if/how much better that works.

Candidate: Hold off on proposing solutions.

This article is way more useful than the slogan alone, and it's short enough to read in five minutes.

'Instinct,' 'intuition,' 'gut feeling,' etc. are all close synonyms for 'best guess.' That's why they tend to be the weakest links in an argument-- they're just guesses, and guesses are often wrong. Guessing is useful for brainstorming, but if you really believe something, you should have more concrete evidence than a guess. And the more you base a belief on guesses, the more likely that belief is to be wrong.

Substantiate your guesses with empirical evidence. Start with a guess, but end with a test.

2thomblake14y
I disagree with this one. If it's really your best guess, it should be the result of all of the information you have to muster. And so either each of "instinct", "intuition", "gut feeling", etc. are your best chance of being right, or they're not close synonyms for "best guess".
1Sideways14y
I agree (see, e.g., The Second Law of Thermodynamics, and Engines of Cognition for why this is the case). Unfortunately, I see this as a key inferential gap between people who are and aren't trained in rationality. The problem is that many people-- dare I say most-- feel no obligation to gather evidence for their intuitive feelings, or to let empirical evidence inform their feelings. They don't think of intuitive feelings as predictions to be updated by Bayesian evidence; they treat their intuitive feelings as evidence. It's a common affair (at least in the United States) to see debaters use unsubstantiated intuitive feelings as linchpins of their arguments. It's even common in internet debates to see whole chains of reasoning in which every link is supported by gut feeling alone. This style of argument is not only unpersuasive to anyone who doesn't share those intuitions already-- it prevents the debater from updating, as long as his intuitions don't change.
7MichaelVassar14y
Intuitive feelings are evidence AND predictions. Sadly, most people simply think of them as facts.
0[anonymous]14y
Your argument reminds me of a thought experiment I did concerning the "GOD operator":

1 + 1 = 2
1 - 1 = 0
1 * 1 = 1
1 / 1 = 1
etc.

The operator is the +, -, *, /, etc. The GOD operator is inclusive of all known operators and allows such things as: 1 GOD 1 = whatever answer fits, AND 1 GOD 1 = sqrt(-1), pi, etc. How do we define the operator when GOD can be "whatever works"? My main thoughts then went to the idea of a "universal machine", much like Turing's... What specifically is the mechanism of the human mind that would allow both of the above examples?

Upvoted for raising a very important topic.

It probably took me a bit more than 5 minutes, but I had a conversation last night that fits this idea.

The idea to convey is "If you don't actually use the information you obtain, it cannot possibly increase your odds of success"

I went through the Monty Hall Problem (the trick to explaining that one is to generalize to the trillion box case where all but two boxes are eliminated prior to asking whether you want to switch) to get this idea across.

From there you can explain the implications. For example, how through commitment/consistency biases, con...
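The Monty Hall claim above is also easy to make vivid with a simulation. A minimal sketch in Python (the door numbering and trial count are arbitrary choices, not anything from the comment):

    import random

    def monty_hall(switch: bool, trials: int = 100_000) -> float:
        """Estimate the win rate for always-switching vs. always-staying."""
        wins = 0
        for _ in range(trials):
            car = random.randrange(3)       # the prize is behind one of 3 doors
            choice = random.randrange(3)    # the contestant's initial pick
            # The host opens a door that is neither the pick nor the car.
            # (His tie-break when he has two options doesn't affect the odds.)
            opened = next(d for d in range(3) if d != choice and d != car)
            if switch:
                choice = next(d for d in range(3) if d != choice and d != opened)
            wins += choice == car
        return wins / trials

    print(f"stay:   {monty_hall(switch=False):.3f}")  # about 0.333
    print(f"switch: {monty_hall(switch=True):.3f}")   # about 0.667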

I think that the key words are "reasonably smart".

Sagan’s Baloney Detection Kit is a good starting point, and it could be said that each of his examples is easily translatable to an oration of less than 5 minutes (as per Candle in the Dark). I have often thought that it would make a good children’s book (Carl and the Baloney Detector)...

A good resource would be the previous attempts at such a work, Aesop's Fables (Platitudinal), I Ching (Esoteric), and Judeo-Christi-Islamic Texts (Dogmatic). If we are to attempt a similar work for the ideas of r...

1sketerpot14y
Do you know that these would be good resources? You haven't established this; it might help if you gave one or two examples of how these works of fiction that you listed could help us out. You'd need to specifically have brevity and low inferential distance as goals, if you made such a wiki. The LW wiki tends to give a brief description of something and then link to some long posts on the subject; in contrast, Wikipedia tends to have really long articles. Getting all those "many interpretations" you recommend takes quite a bit of space. Check out how long the Wikipedia article on confirmation bias is, and ask yourself if a hypothetical Average Person could take anything useful from skimming it.
0adsenanim14y
I present them (with my critique) because they represent to me attempts at reason as it was before the definition of reason was widely accepted. I left out any direct quotes because I thought they might confuse the topic of conversation, and the five minute rule would be violated if I tried to discuss them.

Aesop's Fables: http://www.aesopfables.com/aesopsel.html
I Ching: http://en.calameo.com/read/000039257e56b7faf538d
Judeo-Christi-Islamic: http://en.wikipedia.org/wiki/Kabbalah

Maybe not the best resources, but they could be an introduction. I will add one more, only because I find it fun: http://en.wikipedia.org/wiki/Pyramids_(novel) For some reason the above link does not deliver correctly, but you should be able to follow.... Yes the wiki is a challenge, I was thinking of a new graphical interface...
0[anonymous]14y
http://en.wikipedia.org/wiki/Pyramids_(novel)