In “What is Evidence?” I wrote:1
This is why rationalists put such a heavy premium on the paradoxical-seeming claim that a belief is only really worthwhile if you could, in principle, be persuaded to believe otherwise. If your retina ended up in the same state regardless of what light entered it, you would be blind . . . Hence the phrase, “blind faith.” If what you believe doesn’t depend on what you see, you’ve been blinded as effectively as by poking out your eyeballs.
Cihan Baran replied:2
I can not conceive of a situation that would make 2 + 2 = 4 false. Perhaps for that reason, my belief in 2 + 2 = 4 is unconditional.
I admit, I cannot conceive of a “situation” that would make 2 + 2 = 4 false. (There are redefinitions, but those are not “situations,” and then you’re no longer talking about 2, 4, =, or +.) But that doesn’t make my belief unconditional. I find it quite easy to imagine a situation which would convince me that 2 + 2 = 3.
Suppose I got up one morning, and took out two earplugs, and set them down next to two other earplugs on my nighttable, and noticed that there were now three earplugs, without any earplugs having appeared or disappeared—in contrast to my stored memory that 2 + 2 was supposed to equal 4. Moreover, when I visualized the process in my own mind, it seemed that making xx and xx come out to xxxx required an extra x to appear from nowhere, and was, moreover, inconsistent with other arithmetic I visualized, since subtracting xx from xxx left xx, but subtracting xx from xxxx left xxx. This would conflict with my stored memory that 3 - 2 = 1, but memory would be absurd in the face of physical and mental confirmation that xxx - xx = xx.
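The tally arithmetic in this thought experiment can be made concrete with a toy sketch in Python (an illustration of my own, not anything from the original text): representing numbers as strings of x's, laying two marks beside two marks yields four, with no mark appearing or vanishing anywhere. This is exactly the invariant the imagined situation would violate.

```python
# Toy tally-mark arithmetic: a number n is a string of n "x" marks.
def tally(n):
    return "x" * n

def add(a, b):
    # Addition is just laying the two groups of marks side by side.
    return a + b

def sub(a, b):
    # Subtraction removes one "x" from a for each "x" in b.
    assert len(a) >= len(b), "cannot subtract a larger tally"
    return a[:len(a) - len(b)]

print(add(tally(2), tally(2)))  # xxxx: no extra mark appears from nowhere
print(sub(tally(4), tally(2)))  # xx
```

In the hypothetical world of the thought experiment, visualization would instead insist that the first line ought to print xxx, which is precisely what would make the narrator so confused.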
I would also check a pocket calculator, Google, and perhaps my copy of 1984 where Winston writes that “Freedom is the freedom to say two plus two equals three.” All of these would naturally show that the rest of the world agreed with my current visualization, and disagreed with my memory, that 2 + 2 = 3.
How could I possibly have ever been so deluded as to believe that 2 + 2 = 4? Two explanations would come to mind: First, a neurological fault (possibly caused by a sneeze) had made all the additive sums in my stored memory go up by one. Second, someone was messing with me, by hypnosis or by my being a computer simulation. In the second case, I would think it more likely that they had messed with my arithmetic recall than that 2 + 2 actually equalled 4. Neither of these plausible-sounding explanations would prevent me from noticing that I was very, very, very confused.3
What would convince me that 2 + 2 = 3, in other words, is exactly the same kind of evidence that currently convinces me that 2 + 2 = 4: The evidential crossfire of physical observation, mental visualization, and social agreement.
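The "evidential crossfire" admits a toy Bayesian reading: independent lines of evidence each contribute a likelihood ratio, and independent likelihood ratios multiply the odds. The specific numbers below are invented purely for illustration; nothing in the original text assigns them.

```python
# Hedged sketch: combining independent evidence via odds-form Bayes' rule.
def posterior_odds(prior_odds, likelihood_ratios):
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr  # independent lines of evidence multiply the odds
    return odds

prior = 1.0  # even prior odds between "2 + 2 = 4" and "2 + 2 = 3"
# Physical observation, mental visualization, and social agreement,
# each (say) 20 times likelier under the favored hypothesis:
odds = posterior_odds(prior, [20.0, 20.0, 20.0])
prob = odds / (1.0 + odds)
print(prob)  # roughly 0.9999
```

The point is structural, not numerical: whichever answer the crossfire favors, the same machinery that now supports 2 + 2 = 4 could, under different observations, come to support 2 + 2 = 3.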
There was a time when I had no idea that 2 + 2 = 4. I did not arrive at this new belief by random processes—then there would have been no particular reason for my brain to end up storing “2 + 2 = 4” instead of “2 + 2 = 7.” The fact that my brain stores an answer surprisingly similar to what happens when I lay down two earplugs alongside two earplugs calls forth an explanation of what entanglement produces this strange mirroring of mind and reality.
There are really only two possibilities for a belief of fact: either the belief got there via a mind-reality entangling process, or not. If not, the belief can’t be correct except by coincidence. For beliefs with the slightest shred of internal complexity (requiring a computer program of more than 10 bits to simulate), the space of possibilities is large enough that coincidence vanishes.4
Unconditional facts are not the same as unconditional beliefs. If entangled evidence convinces me that a fact is unconditional, this doesn’t mean I always believed in the fact without need of entangled evidence.
I believe that 2 + 2 = 4, and I find it quite easy to conceive of a situation which would convince me that 2 + 2 = 3. Namely, the same sort of situation that currently convinces me that 2 + 2 = 4. Thus I do not fear that I am a victim of blind faith.5
1See Map and Territory.
2Comment: http://lesswrong.com/lw/jl/what_is_evidence/f7h.
3See “Your Strength as a Rationalist” in Map and Territory.
4For more on belief formation and beliefs of fact, see “Feeling Rational” and “What Is Evidence?” in Map and Territory. For more on belief complexity, see “Occam’s Razor” in the same volume.
5If there are any Christians reading this who know Bayes’s Theorem, might I inquire of you what situation would convince you of the truth of Islam? Presumably it would be the same sort of situation causally responsible for producing your current belief in Christianity: We would push you screaming out of the uterus of a Muslim woman, and have you raised by Muslim parents who continually told you that it is good to believe unconditionally in Islam.
Or is there more to it than that? If so, what situation would convince you of Islam, or at least, non-Christianity? And how confident are you that the general kinds of evidence and reasoning you appeal to would have been enough to dissuade you of your religion if you had been raised a Muslim?
It's often poor form to quote oneself, but since this post (deservedly) continues to get visits, it might be good to bring up the line of thought that convinced me that this post made perfect sense:
The space of all possible minds includes some (aliens/mental patients/AIs) which have a notion of number and counting and an intuitive mental arithmetic, but where the last of these is skewed so that 2 and 2 really do seem to make 3 rather than 4. Not just lexically, but actually: in the same way that our brains can instantly subitize four objects as two distinct groups of two, their minds mistakenly "see" the pattern 0 0 0 as composed of two distinct 0 0 groups. Although such a mind would be unlikely to arise through natural selection, there's nothing impossible about engineering a mind with this error, or rewiring a mind within a simulation to have this error.
These minds, of course, would notice empirical contradictions everywhere: they would put two objects together with two more, count them, and then count four instead of three, when it's obvious by visualizing in their heads that two and two ought to make three instead. They would even encounter proofs that 2 + 2 = 4, and be unable to find an error, although it's patently absurd to write SSSS0 = SS0 + SS0. Eventually, a sufficiently reflective and rational mind of this type might entertain the possibility that maybe two and two do actually make four, and that its system of visualization and mental arithmetic are in fact wrong, as obvious as they seem from the inside. We would consider such a mind to be more rational than one that decided that, no matter what it encountered, it could never be convinced that 2 and 2 made 4 rather than 3.
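The Peano-numeral claim in SSSS0 = SS0 + SS0 can be checked mechanically. The sketch below (my own toy encoding, using strings of S's ending in 0 and the standard recursive definition of addition) is the kind of proof the skewed mind would keep re-deriving and be unable to fault.

```python
# Peano numerals as strings: "0" is zero, prefixing "S" takes the successor.
def peano_add(a, b):
    # Standard recursion: a + 0 = a; a + S(b) = S(a + b).
    if b == "0":
        return a
    assert b.startswith("S"), "not a well-formed Peano numeral"
    return "S" + peano_add(a, b[1:])

two = "SS0"
print(peano_add(two, two))  # SSSS0
```

Each recursive step peels one successor off the second argument and wraps one around the result, so two applications turn SS0 + SS0 into SSSS0.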
Now, given all that, why exactly should I refuse to ever update my arithmetical beliefs if given the sort of experiences in Eliezer's thought experiment? Wouldn't the hypothesis that I am such an agent get a lot of confirmation? (Of course, I very strongly don't expect to encounter such experiences, because of all the continuing evidence before me that 2 + 2 = 4; but if I did wake up in that situation, I'd have to accept that some part of my mind is probably broken, and the part that tells me 2 + 2 = 4 is as likely a candidate as any.)
Upon suddenly discovering that the whole world looks different this morning than it did last night, is the rational belief "I guess I was deluded for my whole life up to this point" or "I guess I'm deluded now"?
Considering the fact that you're not waking up in a mental institution, but the world still seems to contain them (and if you get 2 sets of 2 of them, you have 3), the latter is a much more likely situation.