In “What is Evidence?” I wrote:1
This is why rationalists put such a heavy premium on the paradoxical-seeming claim that a belief is only really worthwhile if you could, in principle, be persuaded to believe otherwise. If your retina ended up in the same state regardless of what light entered it, you would be blind . . . Hence the phrase, “blind faith.” If what you believe doesn’t depend on what you see, you’ve been blinded as effectively as by poking out your eyeballs.
Cihan Baran replied:2
I can not conceive of a situation that would make 2 + 2 = 4 false. Perhaps for that reason, my belief in 2 + 2 = 4 is unconditional.
I admit, I cannot conceive of a “situation” that would make 2 + 2 = 4 false. (There are redefinitions, but those are not “situations,” and then you’re no longer talking about 2, 4, =, or +.) But that doesn’t make my belief unconditional. I find it quite easy to imagine a situation which would convince me that 2 + 2 = 3.
Suppose I got up one morning, and took out two earplugs, and set them down next to two other earplugs on my nighttable, and noticed that there were now three earplugs, without any earplugs having appeared or disappeared—in contrast to my stored memory that 2 + 2 was supposed to equal 4. Moreover, when I visualized the process in my own mind, it seemed that making xx and xx come out to xxxx required an extra x to appear from nowhere, and was, moreover, inconsistent with other arithmetic I visualized, since subtracting xx from xxx left xx, but subtracting xx from xxxx left xxx. This would conflict with my stored memory that 3 - 2 = 1, but memory would be absurd in the face of physical and mental confirmation that xxx - xx = xx.
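The unary bookkeeping in that visualization can be mirrored in a few lines of Python (an illustration of my own, not part of the original argument): under ordinary arithmetic, treating numbers as strings of x's makes addition concatenation and subtraction removal, which is exactly why xx and xx coming out to xxx would require an x to vanish from nowhere.

```python
# Unary arithmetic: a number n is a string of n "x" characters.
# Addition is concatenation; subtraction removes one string from the other.
# Under ordinary arithmetic these operations agree with + and -.
def unary(n):
    return "x" * n

def add(a, b):
    return a + b  # concatenation: "xx" + "xx" -> "xxxx"

def subtract(a, b):
    assert len(a) >= len(b), "cannot remove more x's than are present"
    return a[: len(a) - len(b)]

assert add(unary(2), unary(2)) == unary(4)       # xx + xx = xxxx, not xxx
assert subtract(unary(4), unary(2)) == unary(2)  # xxxx - xx = xx
assert subtract(unary(3), unary(2)) == unary(1)  # xxx - xx = x, not xx
```

The last assertion is the one my stored memory endorses; in the hypothetical morning described above, laying out the earplugs would physically contradict it.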
I would also check a pocket calculator, Google, and perhaps my copy of 1984 where Winston writes that “Freedom is the freedom to say two plus two equals three.” All of these would naturally show that the rest of the world agreed with my current visualization, and disagreed with my memory, that 2 + 2 = 3.
How could I possibly have ever been so deluded as to believe that 2 + 2 = 4? Two explanations would come to mind: First, a neurological fault (possibly caused by a sneeze) had made all the additive sums in my stored memory go up by one. Second, someone was messing with me, by hypnosis or by my being a computer simulation. In the second case, I would think it more likely that they had messed with my arithmetic recall than that 2 + 2 actually equalled 4. Neither of these plausible-sounding explanations would prevent me from noticing that I was very, very, very confused.3
What would convince me that 2 + 2 = 3, in other words, is exactly the same kind of evidence that currently convinces me that 2 + 2 = 4: The evidential crossfire of physical observation, mental visualization, and social agreement.
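That "evidential crossfire" can be sketched as a toy Bayesian update. All the numbers below are invented for illustration: an enormous prior in favor of 2 + 2 = 4, overwhelmed by a few independent observations that are each far likelier if 2 + 2 = 3.

```python
# Toy Bayesian update with made-up numbers (illustrative only).
# Hypotheses: H4 = "2 + 2 = 4", H3 = "2 + 2 = 3".
prior_odds = 1e9          # odds favoring H4 before the strange morning
likelihood_ratio = 1e-4   # each observation is 10,000x likelier under H3
observations = 3          # earplugs, mental visualization, calculator/Google

# Independent evidence multiplies odds by each likelihood ratio in turn.
posterior_odds = prior_odds * likelihood_ratio ** observations
print(posterior_odds)  # roughly 1e-3: H3 now favored about 1,000 to 1
```

The structure, not the particular numbers, is the point: a prior of any finite strength yields to enough entangled evidence on the other side.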
There was a time when I had no idea that 2 + 2 = 4. I did not arrive at this new belief by random processes—then there would have been no particular reason for my brain to end up storing “2 + 2 = 4” instead of “2 + 2 = 7.” The fact that my brain stores an answer surprisingly similar to what happens when I lay down two earplugs alongside two earplugs calls forth an explanation of what entanglement produces this strange mirroring of mind and reality.
There are really only two possibilities, for a belief of fact—either the belief got there via a mind-reality entangling process, or not. If not, the belief can’t be correct except by coincidence. For beliefs with the slightest shred of internal complexity (requiring a computer program of more than 10 bits to simulate), the space of possibilities is large enough that coincidence vanishes.4
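The coincidence point can be made quantitative with a back-of-the-envelope calculation, using the essay's own 10-bit figure: a belief whose content takes n bits to specify is one point in a space of 2^n alternatives, so an unentangled process lands on the correct one with probability at most 2^-n.

```python
# A belief requiring an n-bit program to specify is one point in a
# space of 2**n possibilities; a process not entangled with reality
# picks the correct one with probability at most 2**-n.
n_bits = 10
possibilities = 2 ** n_bits
p_coincidence = 1 / possibilities
print(possibilities, p_coincidence)  # 1024 0.0009765625
```

At 10 bits the odds of lucking into the truth are already below one in a thousand, and they halve with every additional bit of complexity.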
Unconditional facts are not the same as unconditional beliefs. If entangled evidence convinces me that a fact is unconditional, this doesn’t mean I always believed in the fact without need of entangled evidence.
I believe that 2 + 2 = 4, and I find it quite easy to conceive of a situation which would convince me that 2 + 2 = 3. Namely, the same sort of situation that currently convinces me that 2 + 2 = 4. Thus I do not fear that I am a victim of blind faith.5
1See Map and Territory.
2Comment: http://lesswrong.com/lw/jl/what_is_evidence/f7h.
3See “Your Strength as a Rationalist” in Map and Territory.
4For more on belief formation and beliefs of fact, see “Feeling Rational” and “What Is Evidence?” in Map and Territory. For more on belief complexity, see “Occam’s Razor” in the same volume.
5If there are any Christians reading this who know Bayes’s Theorem, might I inquire of you what situation would convince you of the truth of Islam? Presumably it would be the same sort of situation causally responsible for producing your current belief in Christianity: We would push you screaming out of the uterus of a Muslim woman, and have you raised by Muslim parents who continually told you that it is good to believe unconditionally in Islam.
Or is there more to it than that? If so, what situation would convince you of Islam, or at least, non-Christianity? And how confident are you that the general kinds of evidence and reasoning you appeal to would have been enough to dissuade you of your religion if you had been raised a Muslim?