I knew I was going to stay on LessWrong when I read the conceptually & rhetorically brilliant:
Ignorance exists in the map, not in the territory. If I am ignorant about a phenomenon, that is a fact about my own state of mind, not a fact about the phenomenon itself. A phenomenon can seem mysterious to some particular person. There are no phenomena which are mysterious of themselves. To worship a phenomenon because it seems so wonderfully mysterious, is to worship your own ignorance.
Which could perhaps be reduced to something like:
Your thoughts are your map; reality is the territory. Watch your step.
and
Mystery is always in the mind, never in the matter.
I don't think these are all that great, but I would love a snappy way to express this central insight.
From Avoiding Your Belief's Real Weak Points:
"To do better: When you're doubting one of your most cherished beliefs, close your eyes, empty your mind, grit your teeth, and deliberately think about whatever hurts the most. Don't rehearse standard objections whose standard counters would make you feel better. Ask yourself what smart people who disagree would say to your first reply, and your second reply. Whenever you catch yourself flinching away from an objection you fleetingly thought of, drag it out into the forefront of your mind. Punch yourself in the solar plexus. Stick a knife in your heart, and wiggle to widen the hole."
Condensed: If you catch yourself flinching away from a thought because it's painful, focus on that thought and don't let it go. If the truth hurts, it should.
This is, I think, some of the most important rationalist advice I ever got. It kept me reading OB when it was getting very painful to do so, and allowed me to finally admit that my religion was immoral, a thought I had kept tucked away in rationalization-land since middle school.
Hi. I'm new here. Great blog. Great post.
One maxim that I rely on for acting rationally is "know what your time is worth". In my first real job, I was on a one-week project with a scientist who told me that my time is valuable (I think he was implying that my boss was wasting my time). This really opened up my eyes. My first application of this idea was professionally -- I can get more out of my job than just a paycheck. I can learn skills and make contacts and list accomplishments that will advance my career. I can also enjoy what I do (I'm a researcher, so that's assumed in my profession). It's sad to see colleagues who think that their time is worth no more than some measly paycheck.
The second application of this rule was in my "home economy". I used to be very cheap. Now that I've placed a $ value on my time, it puts a lot of activities in perspective and I am much freer spending money when it frees up time for more worthwhile pursuits (it helps that my cheap habits assure that I always have a nice cushion of cash around. This way, I am able to spend money when needed, without reworking my budget -- which would be a real waste of my precious time). It's sad ...
Great idea for a post and an important topic. A somewhat similar topic came up at our recent Chicago meetup, when someone who saw our sign came up to us to ask us what Less Wrong referred to. We didn't necessarily have a great answer at the ready besides relaying some of the basics (website/group blog about rationality and thinking better, etc.). We spent a few minutes afterward talking about what information a good LW elevator speech might include. We didn't want it to sound too stilted/formal, e.g., "refining the art of human rationality" from the banner at the top doesn't sound that inviting in casual conversation. Does anyone have approaches that have worked?
Eliezer's "Absence of evidence is evidence of absence" is a good one in my opinion, and it's relatively easy to explain the relevant maths to pretty much anyone.
The general point about Conservation of Expected Evidence is then likely to come out in the wash (and is a very useful idea).
A simple technique I used to use was that whenever I started to read or found a link for an article that made me uncomfortable or instinctively want to avoid it, I forced myself to read it. After a few times I got used to it and didn't have to do this anymore.
Should we be listing the oldies here as well? One of my favorites is still "Don't believe everything you think."
That one made it to book-title and t-shirt status, but I've never heard anyone actually say it. I've read it only a couple of times.
Okay, in that case, I had come up with a saying to express that same idea but which makes the implications clearer. Here goes:
"Blindness isn't when you see nothing; it's when you see the same thing, regardless of what's in front of you.
"Foolishness isn't when your beliefs are wrong; it's when you believe the same thing, regardless of what you've seen."
I think it is mostly hopeless trying to teach rationality to most people.
For example, both of my parents studied Math in university and still have a very firm grasp of the fundamentals.
I just got a phone call yesterday from my father in Germany saying: "We saw in the news that a German tourist couple got killed in a shooting in San Francisco. Will you avoid going out after dark?" When I tried to explain that I won't update my risk estimates based on any such singular event, he seemed to listen to what I said and to understand it formally. Anyhow, he was completely unimpressed, finishing the conversation in an even more worried tone: "I see, but you will take care, won't you?"
Well said! Here's how Bruce Schneier put it:
Remember, if it’s in the news don’t worry about it. The very definition of news is “something that almost never happens.” When something is so common that it’s no longer news — car crashes, domestic violence — that’s when you should worry about it.
I wrote an essay about the utter irrationality of "stranger danger" based on that quote: http://messymatters.com/strangers
Your parents aren't saying "Please update your estimate of the probability of your violent death, based on this important new evidence."
They are saying, "I love you."
This has nothing to do with how rational or irrational they are.
They're saying "I love you" in an irrational way. This can hurt because there is no easy way to quibble with the second part and not violate cultural conventions about how to express your acceptance of the first.
This is well-understood by irrationalists. Once in a discussion about the necessity of evidence, I got landed with "But you don't demand evidence that your wife loves you, right? You just have faith..."
A clever move. Now arguing the point requires me to... deny that I have faith in my wife?
'Why would I need to demand evidence? My wife freely gives me evidence of her love, all the time!'
I had a similar discussion with a family member, about the existence of the Christian god, where I received that exact response. My wife was sitting right there. I responded with something along the lines of, "True, but my 'faith' in her love is already backed up by evidence, and besides, I have plenty of evidence that she exists. If there was evidence for God and evidence of His love, I would happily put faith in that too."
But I agree - it definitely caused me to pause to consider a tactful response.
Consider the possibility that when people say they're seeing things differently than you do, that they might be telling you the truth. They could be making it up, they could be just annoying you for the fun of it, but they might actually be weirder than you think.
Do you have any examples? That's a fascinating one.
(Corollary: if you're angry at someone, and they ask why you're angry, tell them. They might actually not know. Especially if they're a child. I know I'm not the only one who was punished by one or more elementary school teachers for reasons that they refused to explain, since they assumed that I already knew. Oh how I seethed.)
"How dare you disrespect my authority you little terr..."
You raise an interesting point here. When a parent or teacher imposes their authority on a child, there are two very different goals they could have:
To get the child to comply, and/or
To establish their own dominance.
When you ask why you're being ordered to do something, and you happen to be beneath the age that society considers you a real person, that's taken as an attack on the dominance of the person bossing you around. Obedience isn't enough; a lot of people won't be satisfied with anything less than unquestioning obedience, at least from mere children. I suspect that this is what people are thinking most of the time when they use "because I say so" as a 'reason' for something. (The rest of the time, they're probably using it because they're feeling too harried to explain something to a mere child, and so they trot out that tired old line because it's easy.)
I remember when I was young enough that adults dared to treat me that way. (Notice the emotionally charged phrasing? I'm still irritated.) Someone who gave reasonable orders and provided justifications for them on request, got cooperation...
providing a reason for your instructions doesn't hurt anything
I tend to agree in most cases. However, not all instruction-givers have good reasons for their orders. If they must provide such reasons before they are obeyed, and only inconsistently have them, that means that a plausible motive for their subordinates to question them is the desire not to follow the instruction. (i.e. subordinate thinks there might be no good reason, feels compelled to obey only if there is one, and is checking.) The motive associated in this way with asking for reasons is therefore considered harmful by the instruction-giver.
When I was a kid and got an unobjectionable but confusing order, I usually agreed first and then asked questions, sometimes while in the process of obeying. This tended to work better than standing there asking "Why?" and behaving like I wanted the world to come to a halt until I had my questions answered. Objectionable orders I treated differently, but I was aware when I challenged them that I was setting myself up for a power struggle.
These concerns can be balanced better than they usually are by using something like a "Merlin says" rule.
Not to omit the distinct (and surprising) possibility that YOU might be weirder than you think.
Don't ingest words from a poisoned discourse unless you have a concrete reason to think you're immune.
Politics is often poisoned deliberately. Other topics are sometimes poisoned accidentally, by concentrated confusion. Gibberish is toxic; if you bend your mind to make sense of it, your whole mind warps slightly. You see concentrated confusion every time you watch a science fiction show on television; their so-called science is actually made from mad libs. Examples are everywhere; do not assume that there is meaning beneath all confusion.
Here's the exact opposite advice. I wouldn't even bother posting it here except it's from one of the major rationalists of the 20th century:
"In studying a philosopher, the right attitude is neither reverence nor contempt, but first a kind of hypothetical sympathy, until it is possible to know what it feels like to believe in his theories, and only then a revival of the critical attitude, which should resemble, as far as possible, the state of mind of a person abandoning opinions which he has hitherto held.... Two things are to be remembered: that a man whose opinions and theories are worth studying may be presumed to have had some intelligence, but that no man is likely to have arrived at complete and final truth on any subject whatever. When an intelligent man expresses a view which seems to us obviously absurd, we should not attempt to prove that it is somehow true, but we should try to understand how it ever came to seem true. This exercise of historical and psychological imagination at once enlarges the scope of our thinking, and helps us to realize how foolish many of our own cherished prejudices will seem to an age which has a different temper of mind." -- Bertrand Russell, A History of Western Philosophy
The most important thing I learned from this site:
If you suspect something is factually true, don't be afraid to believe it. It can't hurt you.
That's simple. Not easy to implement, but easy to express.
Oh, I have the same thing. I do have some nearly disreputable views, and I have accidentally hurt people's feelings by airing them. (Pretty mild stuff: "Walmart's not so bad" and "Physical resurrection doesn't make sense.") Now I'm pretty much housebroken, although I worry like wedrifid that it shows in my facial expressions.
But. Would any of you really trade being well-informed for the convenience of not having to hold your tongue? I know I wouldn't.
Knowing whether disreputable beliefs are true is helpful in figuring out what intellectual institutions you can trust.
On a related note, I think too few people realize that it's OK to sometimes hold beliefs that are mistaken in a strongly disreputable direction. If all your errors fall on the reputable side of the line, you're missing out on accuracy. In a noisy world, sufficiently asymmetric suppression of falsehoods is indistinguishable from suppression of truths.
Someone gets in a car-crash and barely misses being impaled by a metal pole, and people say it's a million-to-one miracle
I like to reply to such accounts with "Luckier not to have been in the crash in the first place."
Whenever you're uncertain about an issue where bias might play a role, ask yourself honestly what you would say if you knew that if you gave the wrong answer, rabid leopards would storm into the room and eat you.
No method of explanation should be considered good unless it's been tested on a number of ordinary people.
My impression is that the best way to explain Goodhart's Law is to bring up employee incentive plans which don't have the effect the employer was hoping for.
Candidate 1: "If a trillion trillion trillion people each flip a hundred coins, someone's going to get all heads." ("If a trillion people each flip a billion coins" might be a stronger meme, though extremely inaccurate.)
Candidate 2: "Knowing the right answer is better than being the first to argue for it."
Candidate 3: "If it moves, you can test it."
I'd like to come up with something meme-sized about curiosity stoppers. How about:
When you pretend to know an answer, you're wrong twice.
It doesn't get the subtleties across, but it might be enough to gain a foothold in the average person's mind.
You might be underestimating just how much curiosity-stoppers feel like actually knowing an answer. I still catch myself reading Wikipedia articles just up to the point where they confirm what I thought. Your meme would have to imply just how difficult it is to notice this in yourself.
As I think has been mentioned in this thread by others, but bears repeating, if you want to convince people that they are affected by cognitive biases sometimes you have to really hit them over the head with them.
I've found the examples in Hindsight Devalues Science and the basketball video (I don't imagine there are many people here who haven't seen it, but I'll not post a spoiler just in case) are particularly effective at this. I guess calibration tests would also be good on this metric. Once you've pointed out a "cortical illusion" like this...
Here's something that comes up in many, many discussions of climate change and anything else where a lot of arguments come from models or simulations: sometimes you have to do the math to make a valid (counter-)argument.
Example:
A: ...And so you, see, as CO2 increases, the mean global temperature will also increase.
B: That's bullshit, and here's why: as CO2 increases, there will be more photosynthesis -- and the increased plant growth will consume all that extra CO2.
Another example (the one that motivated this comment):
A: And so, as long as the bus is carry...
Think about your judgments of confidence in terms of frequencies instead of probabilities - our frequency intuitions tend to be much closer to reality. If you estimate that you're 90% sure of something, ask "if I faced ten similar problems, would I really get nine of them right?"
Counter-argument:
"Less Wrong tends toward long articles with a lot of background material. That's great, but the vast majority of people will never read them. What would be useful for raising the sanity waterline in the general population is a collection of simple-but-useful rationality techniques that you might be able to teach to a reasonably smart person in five minutes or less per technique."
Possible alternative angle of attack - get people to read longer articles. Promote things that increase attention span, for example. Admittedly, you then...
I think "unknown unknowns" is a good one for this sort of thing. My attempt follows:
We know a lot of things, and generally we know that we know them. These are "known knowns." I know that 1+1 = 2, I know that the year is 2010, and so on.
We also don't know a lot of things, but generally we know that we don't know them-- for example, I don't know the hundredth digit of pi, I don't know how to speak Chinese, and I don't know what stocks are going to do well next year. All of those things are "known unknowns," or unanswered questi...
If I only have a few minutes, I tell people to study cognitive bias, in the hope that surely any intelligent person can see that understanding what science has to say about the systematic, predictable failings of our own brains can hardly fail to be useful. You need long enough to impart the caution that you have to apply these things to yourself, not just to other people...
* Candidate 1 requires an intuitive understanding of probability; fat chance.
* Candidate 2 would require rewiring how humans perceive status.
* Candidate 3 just does not work. Talk with people about the image they have of what scientists do all day. It is bad. Especially when you go to New Agers.
Maybe you assume that people have a consistent world view, or at least the desire to have one, but no. Please try the proposals on real people, and report back. I expect you to run into the problem that objective truth is not widely accepted, and ...
My idea would be to give a truncated version of a point made in Truly Part of You.
The different sound-bite ways to say it are:
Low inferential distance explanation: When learning about something, the most important thing is to notice what you've been told. Not understand, but notice: what kinds of things would you expect to see if you believed these claims, versus if you did not? Are yo...
"Wisdom is like a tree. Cut away the pages of predictions, the branches, even the roots, but from a single seed the whole structure can be rebuilt.
Foolishness is like a heap of stones. Stack them up however you please, paint bright colors to catch the eye, call them ancient or sacred or mysterious, and yet a child could scatter them."
Show someone the gorilla video, or another of the inattentional blindness tests.
Telling someone their brain is a collection of hacks and kludges is one thing; showing them, having them experience it, is on another level altogether.
Relatedly, my favorite quote from Egan's Permutation City: "You have to let me show you exactly what you are."
Another classic example of the brain's hackishness, which does not seem to have been mentioned here before, is the sentence, "More people have been to Russia than I have." If you say this sentence to someone (try it!), they'll at first claim that it was a perfectly reasonable, grammatical sentence. But when you ask them what it means, they'll start to say something, then stop, look confused, and laugh.
(Yes, there is a parsing of "have" as "possess", but this is (a) precluded by inflection, and (b) not ever what someone initially comes up with).
Anti-candidate: "Just because something feels good doesn't make it true."
The Litany of Tarski and Litany of Gendlin are better ways to approach this concept, because they're both inexorably first-person statements.
"Let's see how we can check this" or "let's see how we can test this" seems to work in the short run to get people to check or test things. I don't know if it changes habits.
RE: Candidate 1
For those interested, here's the math:
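A rough sketch of the calculation, using the "trillion trillion trillion people, a hundred coins each" numbers from Candidate 1 (the specific figures are just the ones from that comment):

```python
# Probability that one person flipping 100 fair coins gets all heads.
p_all_heads = 0.5 ** 100          # about 7.9e-31

# "A trillion trillion trillion" flippers = 10^36 people.
n_people = 10 ** 36

# Expected number of all-heads runs across the whole population.
expected = n_people * p_all_heads

print(f"P(100 heads) = {p_all_heads:.2e}")              # ~7.89e-31
print(f"Expected all-heads flippers: {expected:,.0f}")  # ~789,000
```

So even an event this unlikely should happen hundreds of thousands of times, given enough chances.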
Count to ten.
Learning one was wrong (and updating) is a good thing.
One should be more interested in obtaining information than winning debates.
To clarify a little on candidate 1. People are often impressed by a coincidental or unlikely happening, and think that it's some kind of miracle. But in fact there are a lot of individually very unlikely things happening all the time. Out of all the cars in the world, what's the chance that you happen to see three of them in a particular order going down a particular street? Not that high, but obviously cars have to pass you in some order or other.
So all unlikely events can be categorised into unnoticeable ones (any three cars at random), and noticeable ones...
"one in a million chances happen a thousand times a day in China" is a bumper sticker phrase for that one I've found useful.
On my own, I've tried out the ol' medical test base rate fallacy explanation on a few people. My dad got it right away; so did one friend; another didn't seem to fully grok it within ~2 minutes of explanation. I haven't done any follow-ups to see if they've been able to retain and use the concept.
(I definitely should have thought of this earlier; interestingly enough it was this comment that was the trigger.)
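For anyone who wants to try the base-rate explanation themselves, here's a minimal sketch of the arithmetic (the 1% prevalence and the 90%/9% test rates below are made-up numbers for illustration, not from any real test):

```python
# Base-rate sketch: a rare condition and an imperfect test (all numbers assumed).
prevalence = 0.01       # P(condition) -- 1% of the population
sensitivity = 0.90      # P(positive test | condition)
false_positive = 0.09   # P(positive test | no condition)

# Bayes' theorem: P(condition | positive test)
p_positive = prevalence * sensitivity + (1 - prevalence) * false_positive
p_condition_given_positive = (prevalence * sensitivity) / p_positive

print(f"P(condition | positive) = {p_condition_given_positive:.2f}")  # ~0.09
```

The surprising part for most people is that a positive result still leaves the condition unlikely, because the false positives from the healthy majority swamp the true positives.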
Use probabilities! (Or likelihood ratios.) Especially when arguing. Yes, do so with care, i.e. without deceiving yourself into thinking you're better calibrated than you are -- but hiding the fact that you're not perfectly calibrated doesn't make your calibration any better. Your brain is still making the same mistakes whether you choose to make them verbally explicit or not. So instead of reacting with indignation when someone disagrees, just a...
Terry Pratchett has a good metaphor for a good way of thinking in his Tiffany Aching books: second, third, etc. thoughts. Basically the idea that you shouldn't just trust whatever your thoughts say; you have your second thoughts monitoring them, and then you have your third thoughts monitoring that. I've always found it extremely helpful to pay attention to what I'm thinking; many times I've noticed a thought slipping past that is very obviously wrong and picked myself up on it. A few times I've even agreed with the original thought upon further analysis, b...
I think movements grow at their margins. It is there that we can make the greatest impact. So perhaps we should focus on recent converts to rationality. Figure out how they converted, what factors were involved, how the transition could have been easier, taking into account their personality etc.
This is what I have been trying to do with the people I introduce rationality to and who are somewhat receptive. It is not only a victory that they began to accept rationality. It was also an opportunity to observe how best to foster more of such conversions.
It is ...
I've heard Candidate 1 expressed as "A one-in-a-million shot happens a thousand times a day in China."
Candidate 2 could be "I like to be less wrong."
Candidate 3 maybe "If it affects reality, it is real."?
That Candidate 2 (admitting that one is wrong is a win for an argument) is one of my oldest bits of helpful knowledge.
If one admits that one is wrong, one instantly ceases to be wrong (or at least ceases to be wrong in the way that one was wrong; it could still be the case that the other person in the argument is also wrong, but for the purposes of this point we are assuming that they are "correct"), because one is then in possession of more accurate (i.e. "right") information/knowledge.
http://rejectiontherapy.com/ - the 30 day rejection challenge seems to fit here. Try and, for 30 consecutive days, provoke genuine rejections or denials of reasonable requests, as part of your regular activities, at the rate of one per day.
Similarly, with millions of people dying of cancer each year, there are going to be lots of people making highly unlikely miracle recoveries. If they didn't, that would be surprising.
That's like saying it would be surprising if nobody lived to the age of 150. Miracle cancer cures are statistical outliers, and it would be interesting to know the mechanism that allows them to happen.
This is an obvious contradiction: they're claiming a measurable effect on the world and then pretending that it can't possibly be measured.
It's no contradiction if you believe in a clever god that doesn't want the effect to be scientifically measured.
I'm surprised there aren't any comments about reminding people they can't have it both ways. I haven't found a great way to do it quickly, but I have sometimes talked people down from forming a negative opinion (of a person, group, or event) by asking them if they would have gotten the same perception from a counterfactual (and in some sense opposite) event occurring instead.
Candidate: Hold off on proposing solutions.
This article is way more useful than the slogan alone, and it's short enough to read in five minutes.
'Instinct,' 'intuition,' 'gut feeling,' etc. are all close synonyms for 'best guess.' That's why they tend to be the weakest links in an argument-- they're just guesses, and guesses are often wrong. Guessing is useful for brainstorming, but if you really believe something, you should have more concrete evidence than a guess. And the more you base a belief on guesses, the more likely that belief is to be wrong.
Substantiate your guesses with empirical evidence. Start with a guess, but end with a test.
It probably took me a bit more than 5 minutes, but I had a conversation last night that fits this idea.
The idea to convey is "If you don't actually use the information you obtain, it cannot possibly increase your odds of success"
I went through the Monty Hall Problem (the trick to explaining that one is to generalize to the trillion box case where all but two boxes are eliminated prior to asking whether you want to switch) to get this idea across.
From there you can explain the implications. For example, how through commitment/consistency biases, con...
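If it helps, the payoff from switching in the Monty Hall setup above is easy to check by simulation. Here's a minimal sketch (door labels are arbitrary, 100,000 trials is just a convenient number):

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """One round of Monty Hall: returns True if the player ends up with the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that hides a goat and isn't the player's pick.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

trials = 100_000
stay = sum(monty_hall_trial(False) for _ in range(trials)) / trials
swap = sum(monty_hall_trial(True) for _ in range(trials)) / trials
print(f"win rate if you stay:   {stay:.3f}")  # ~0.333
print(f"win rate if you switch: {swap:.3f}")  # ~0.667
```

The switching player is the one who actually uses the information revealed by the host's choice, which is the point of the comment above.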
I think that the key words are "reasonably smart".
Sagan’s Baloney Detection Kit is a good starting point, and it could be said that each of his examples is easily translatable to an oration of less than 5 minutes (as per Candle in the Dark). I have often thought that it would make a good children’s book (Carl and the Baloney Detector)...
A good resource would be the previous attempts at such a work, Aesop's Fables (Platitudinal), I Ching (Esoteric), and Judeo-Christi-Islamic Texts (Dogmatic). If we are to attempt a similar work for the ideas of r...
Less Wrong tends toward long articles with a lot of background material. That's great, but the vast majority of people will never read them. What would be useful for raising the sanity waterline in the general population is a collection of simple-but-useful rationality techniques that you might be able to teach to a reasonably smart person in five minutes or less per technique.
Carl Sagan had a slogan: "Extraordinary claims require extraordinary evidence." He would say this phrase and then explain how, when someone claims something extraordinary (i.e. something for which we have a very low probability estimate), they need correspondingly stronger evidence than if they'd made a higher-likelihood claim, like "I had a sandwich for lunch." We can talk about this very precisely, in terms of Bayesian updating and conditional probability, but Sagan was able to get a lot of this across to random laypeople in about a minute. Maybe two minutes.
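(For readers who want the precise version: here is a minimal sketch of the odds-form update. The priors and likelihood ratios below are arbitrary numbers chosen only to show the shape of the calculation, not estimates of anything real.)

```python
# Odds-form Bayes: posterior odds = prior odds * likelihood ratio of the evidence.
def posterior_probability(prior_prob: float, likelihood_ratio: float) -> float:
    prior_odds = prior_prob / (1.0 - prior_prob)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Ordinary claim ("I had a sandwich for lunch"): a modest prior, modest evidence.
print(posterior_probability(0.5, 10))     # ~0.91

# Extraordinary claim: a one-in-a-million prior needs far stronger evidence.
print(posterior_probability(1e-6, 10))    # ~1e-5 -- still very unlikely
print(posterior_probability(1e-6, 1e7))   # ~0.91 -- now it's believable
```

The same evidence that settles an ordinary claim barely moves an extraordinary one; only a correspondingly extraordinary likelihood ratio does.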
What techniques for rationality can be explained to a normal person in under five minutes? I'm looking for small and simple memes that will make people more rational, on average. Here are some candidates, to get the discussion started:
Candidate 1 (suggested by DuncanS): Unlikely events happen all the time. Someone gets in a car-crash and barely misses being impaled by a metal pole, and people say it's a million-to-one miracle -- but events occur all the time that are just as unlikely. If you look at how many highly unlikely things could happen, and how many chances they have to happen, then it's obvious that we're going to see "miraculous" coincidences, purely by chance. Similarly, with millions of people dying of cancer each year, there are going to be lots of people making highly unlikely miracle recoveries. If they didn't, that would be surprising.
Candidate 2: Admitting that you were wrong is a way of winning an argument. (The other person wins, too.) There's a saying that "It takes a big man to admit he's wrong," and when people say this, they don't seem to realize that it's a huge problem! It shouldn't be hard to admit that you were wrong about something! It shouldn't feel like defeat; it should feel like success. When you lose an argument with someone, it should be time for high fives and mutual jubilation, not shame and anger. The hard part of retraining yourself to think this way is just realizing that feeling good about conceding an argument is even an option.
Candidate 3: Everything that has an effect in the real world is part of the domain of science (and, more broadly, rationality). A lot of people have the truly bizarre idea that some theories are special, immune to whatever standards of evidence they may apply to any other theory. My favorite example is people who believe that prayers for healing actually make people who are prayed for more likely to recover, but that this cannot be scientifically tested. This is an obvious contradiction: they're claiming a measurable effect on the world and then pretending that it can't possibly be measured. I think that if you pointed out a few examples of this kind of special pleading to people, they might start to realize when they're doing it.
Anti-candidate: "Just because something feels good doesn't make it true." I call this an anti-candidate because, while it's true, it's seldom helpful. People trot out this line as an argument against other people's ideas, but rarely apply it to their own. I want memes that will make people actually be more rational, instead of just feeling that way.
This was adapted from an earlier discussion in an Open Thread. One suggestion, based on the comments there: if you're not sure whether something can be explained quickly, just go for it! Write a one-paragraph explanation, and try to keep the inferential distances short. It's good practice, and if we can come up with some really catchy ones, it might be a good addition to the wiki. Or we could use them as rationalist propaganda, somehow. There are a lot of great ideas on Less Wrong that I think can and should spread beyond the usual LW demographic.