Over the past few years, we have discreetly approached colleagues faced with a choice between job offers, and asked them to estimate the probability that they will choose one job over another. The average confidence in the predicted choice was a modest 66%, but only 1 of the 24 respondents chose the option to which he or she initially assigned a lower probability, yielding an overall accuracy rate of 96%.
—Dale Griffin and Amos Tversky1
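As a quick check on the figures in the epigraph, here is a minimal sketch in Python; the counts (24 respondents, 1 reversal, 66% average confidence) come from the quoted study, and the variable names are my own:

```python
# Check the figures from Griffin and Tversky (1992) as quoted above.
n_respondents = 24
n_chose_lower_probability_option = 1  # only one person reversed their predicted choice

accuracy = (n_respondents - n_chose_lower_probability_option) / n_respondents
mean_stated_confidence = 0.66  # average confidence in the predicted choice

print(f"Accuracy of predicted choices: {accuracy:.0%}")               # ~96%
print(f"Average stated confidence:     {mean_stated_confidence:.0%}")  # 66%
```

The point is the gap between those two numbers: people's stated confidence (66%) badly understates how settled their choices already were (96%).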
When I first read the words above—on August 1st, 2003, at around 3 o’clock in the afternoon—it changed the way I thought. I realized that once I could guess what my answer would be—once I could assign a higher probability to deciding one way than another—then I had, in all probability, already decided. We change our minds less often than we think. And most of the time we become able to guess what our answer will be within half a second of hearing the question.
How swiftly that unnoticed moment passes, when we can’t yet guess what our answer will be; the tiny window of opportunity for intelligence to act. In questions of choice, as in questions of fact.
The principle of the bottom line is that only the actual causes of your beliefs determine your effectiveness as a rationalist. Once your belief is fixed, no amount of argument will alter the truth-value; once your decision is fixed, no amount of argument will alter the consequences.
You might think that you could arrive at a belief, or a decision, by non-rational means, and then try to justify it, and if you found you couldn’t justify it, reject it.
But we change our minds less often—much less often—than we think.
I’m sure that you can think of at least one occasion in your life when you’ve changed your mind. We all can. How about all the occasions in your life when you didn’t change your mind? Are they as available, in your heuristic estimate of your competence?
Between hindsight bias, fake causality, positive bias, anchoring/priming, et cetera, et cetera, and above all the dreaded confirmation bias, once an idea gets into your head, it’s probably going to stay there.
1Dale Griffin and Amos Tversky, “The Weighing of Evidence and the Determinants of Confidence,” Cognitive Psychology 24, no. 3 (1992): 411–435.
Kahneman wrote, in Thinking, Fast and Slow, that he had written a book for gossips and critics rather than for movers and shakers, because people are better at identifying other people's biases than their own. I took this to mean that he intended to equip his readers to criticize others' biases correctly, and thus that anyone who wishes to avoid such criticism will need to debias themselves first.
Presumably, part of the reason a commenter would avoid making a dictionary argument on LW is that they know LWers are unlikely to tolerate dictionary arguments. Teaching people about biases may lead them to be less tolerant of biases in others; and if we seek to avoid doing things that are odious to our fellows, we will be forced to check our own biases before someone else checks them for us.
Knowing about biases can hurt you chiefly if you're the only one who's sophisticated about biases and can argue fluently about them. But we should expect that in an environment with a raised sanity waterline, where everyone knows about biases and is prepared to point them out, people will perpetrate less egregious bias than in an environment where they can get away with it socially.
(OTOH, I don't take this to excuse people saying "Nah nah nah, I caught you in a conjunction fallacy, you're a poopy stupid head." We should be intolerant of biased arguments, not of people who make them — so long as they're learning.)
Good point. I normally don't like accusing others of bias, and I will continue to try to refrain from doing so when I'm involved in something that looks like a debate, but I agree that pointing out bias is useful information and should not be discouraged.