Over the past few years, we have discreetly approached colleagues faced with a choice between job offers, and asked them to estimate the probability that they will choose one job over another. The average confidence in the predicted choice was a modest 66%, but only 1 of the 24 respondents chose the option to which he or she initially assigned a lower probability, yielding an overall accuracy rate of 96%.
—Dale Griffin and Amos Tversky1
When I first read the words above—on August 1st, 2003, at around 3 o’clock in the afternoon—it changed the way I thought. I realized that once I could guess what my answer would be—once I could assign a higher probability to deciding one way than the other—then I had, in all probability, already decided. We change our minds less often than we think. And most of the time we become able to guess what our answer will be within half a second of hearing the question.
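To unpack the arithmetic in the epigraph, here is a minimal sketch, assuming only the figures quoted above (24 respondents, an average stated confidence of 66%, and 23 of 24 choosing the option they had rated as more probable), of how far stated confidence lags behind realized accuracy:

```python
# Illustrative sketch only: figures taken from the Griffin and Tversky quote above.
stated_confidence = 0.66      # average probability respondents assigned to their predicted choice
respondents = 24
consistent_choices = 23       # all but one chose the option they had initially rated more probable

realized_accuracy = consistent_choices / respondents
print(f"Stated confidence:   {stated_confidence:.0%}")                      # 66%
print(f"Realized accuracy:   {realized_accuracy:.0%}")                      # 96%
print(f"Underconfidence gap: {realized_accuracy - stated_confidence:.0%}")  # ~30 points
```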
How swiftly that unnoticed moment passes, when we can’t yet guess what our answer will be; the tiny window of opportunity for intelligence to act. In questions of choice, as in questions of fact.
The principle of the bottom line is that only the actual causes of your beliefs determine your effectiveness as a rationalist. Once your belief is fixed, no amount of argument will alter the truth-value; once your decision is fixed, no amount of argument will alter the consequences.
You might think that you could arrive at a belief, or a decision, by non-rational means, and then try to justify it, and if you found you couldn’t justify it, reject it.
But we change our minds less often—much less often—than we think.
I’m sure that you can think of at least one occasion in your life when you’ve changed your mind. We all can. How about all the occasions in your life when you didn’t change your mind? Are they as available, in your heuristic estimate of your competence?
Between hindsight bias, fake causality, positive bias, anchoring/priming, et cetera, et cetera, and above all the dreaded confirmation bias, once an idea gets into your head, it’s probably going to stay there.
1. Dale Griffin and Amos Tversky, “The Weighing of Evidence and the Determinants of Confidence,” Cognitive Psychology 24, no. 3 (1992): 411–435.
Since gwern is, well beyond what I thought was typical of him, refusing to call a horse a horse, I'm going to say it: man, you're so lame.
First, understand that I noticed a disagreement going on between someone I've never seen before, a Mr. Peacewise, and a Mr. Gwern, whom I despise ever so lightly. He's a jerk on IRC, you see. It would have made me feel better for him to be wrong and you to be right, you see. I wanted that, in my gut (though not by Tarski).
But man. Gwern posts an article with a perfectly reasonable conclusion attached, and you take a slice of anecdotal evidence to claim just the opposite, which just happens to match your preconceptions precisely. Then, in the ensuing discussion, instead of recognizing this, you accuse gwern first of being insulting and then of laboring under cognitive biases, all the while offering no evidence, literally none, that you are right and he is wrong.
LAME.
I'm just going to point out that I'm surprised that this comment of mine has not only been voted up, but voted up quite strongly, considering it rests in a nest of 'hidden' comments. I fully expected this comment to rest somewhere in the neighborhood of -2 karma.
I'm not sure whether to call this a pleasant surprise. I'm really just confused.