I talked about a style of reasoning in which not a single contrary argument is allowed, with the result that every non-supporting observation has to be argued away. Here I suggest that when people encounter a contrary argument, they prevent themselves from downshifting their confidence by rehearsing already-known support.
Suppose the country of Freedonia is debating whether its neighbor, Sylvania, is responsible for a recent rash of meteor strikes on its cities. There are several pieces of evidence suggesting this: the meteors struck cities close to the Sylvanian border; there was unusual activity in the Sylvanian stock markets before the strikes; and the Sylvanian ambassador Trentino was heard muttering about “heavenly vengeance.”
Someone comes to you and says: “I don’t think Sylvania is responsible for the meteor strikes. They do billions of dinars in trade with us annually.” “Well,” you reply, “the meteors struck cities close to Sylvania, there was suspicious activity in their stock market, and their ambassador spoke of heavenly vengeance afterward.” Since these three arguments outweigh the one, you keep your belief that Sylvania is responsible—you believe rather than disbelieve, qualitatively. Clearly, the balance of evidence weighs against Sylvania.
Then another comes to you and says: “I don’t think Sylvania is responsible for the meteor strikes. Directing an asteroid strike is really hard. Sylvania doesn’t even have a space program.” You reply, “But the meteors struck cities close to Sylvania, and their investors knew it, and the ambassador came right out and admitted it!” Again, these three arguments outweigh the first (by three arguments against one argument), so you keep your belief that Sylvania is responsible.
Indeed, your convictions are strengthened. On two separate occasions now, you have evaluated the balance of evidence, and both times the balance was tilted against Sylvania by a ratio of 3 to 1.
You encounter further arguments by the pro-Sylvania traitors—again, and again, and a hundred times again—but each time the new argument is handily defeated by 3 to 1. And on every occasion, you feel yourself becoming more confident that Sylvania was indeed responsible, shifting your prior according to the felt balance of evidence.
The problem, of course, is that by rehearsing arguments you already knew, you are double-counting the evidence. This would be a grave sin even if you double-counted all the evidence. (Imagine a scientist who does an experiment with 50 subjects and fails to obtain statistically significant results, so the scientist counts all the data twice.)
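To make the arithmetic of this error concrete, here is a minimal sketch in log-odds form. The likelihood ratios are invented for illustration (each clue favoring guilt 3:1, each counterargument favoring innocence 3:1); the point is only the bookkeeping: counting each piece of evidence once versus re-applying the known support after every counterargument.

```python
import math

def update(log_odds, likelihood_ratio):
    """Apply one piece of evidence as an additive shift in log-odds."""
    return log_odds + math.log(likelihood_ratio)

# Illustrative likelihood ratios (assumed numbers, not from the essay):
# each supporting clue favors guilt 3:1; each counterargument favors innocence 3:1.
support = [3.0, 3.0, 3.0]           # border strikes, stock market, ambassador
counters = [1/3, 1/3, 1/3, 1/3]     # trade ties, no space program, ...

# Correct accounting: every piece of evidence enters exactly once.
correct = 0.0                        # prior log-odds of 0, i.e. 50/50
for lr in support + counters:
    correct = update(correct, lr)

# Double-counting: rehearse the three known clues after each new counterargument.
double = 0.0
for lr in support:
    double = update(double, lr)
for c in counters:
    double = update(double, c)
    for lr in support:               # re-applying evidence already counted
        double = update(double, lr)

print(correct)  # negative: four equal-strength counters outweigh three clues
print(double)   # large and positive: confidence grows with each counterargument
```

With these numbers, honest bookkeeping ends up below the prior (three clues against four counters), while the rehearsing reasoner ends up far more confident than when he started, and gets *more* confident with every contrary argument he hears.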
But to selectively double-count only some evidence is sheer farce. I remember seeing a cartoon as a child, where a villain was dividing up loot using the following algorithm: “One for you, one for me. One for you, one-two for me. One for you, one-two-three for me.”
As I emphasized in the last essay, even if a cherished belief is true, a rationalist may sometimes need to downshift the probability while integrating all the evidence. Yes, the balance of support may still favor your cherished belief. But you still have to shift the probability down—yes, down—from whatever it was before you heard the contrary evidence. It does no good to rehearse supporting arguments, because you have already taken those into account.
And yet it does appear to me that when people are confronted by a new counterargument, they search for a justification not to downshift their confidence, and of course they find supporting arguments they already know. I have to keep constant vigilance not to do this myself! It feels as natural as parrying a sword-strike with a handy shield.
With the right kind of wrong reasoning, a handful of support—or even a single argument—can stand off an army of contradictions.
Constant, virtually any modern hot-button political issue will do.
Actually, I think it would be pretty hard to come up with unambiguous examples of this, because what you're describing is not misbehavior that occurs in any given encounter, but a pattern over time in which an individual changes his own beliefs in the wrong way in response to the evidence. This is hard to demonstrate for at least two reasons. First, since it occurs over time rather than on a single occasion, it is difficult to observe. Second, since what you're really talking about (the revision of one's beliefs) occurs inside a person's head, there is the problem of gaining access to that head.
But if it is difficult to come up with unambiguous examples of it, then by the same token it is hard to observe in the first place. Any supposed observation of it will almost certainly require a large element of speculation about what is going on inside someone else's head.
What can we actually observe? Relevant to what you describe, we can observe two things:
1) We know generally that people's political views often harden over time. And since they harden in different directions, in at least some cases the hardening is unlikely to be occurring for the right (rational, truth-seeking) reasons.
2) People do observably rehearse already-known support.
But (2) is, in itself, perfectly legitimate. Meanwhile, (1) already has many explanations apart from the phenomenon you are speculating exists. It is a much-observed and much-discussed phenomenon, and what you have done here is add one more speculation to the bulging library of explanations for it. You are not necessarily wrong, but as far as I can see there isn't all that much compelling evidence in favor of your speculation.