In “What is Evidence?” I wrote:1
This is why rationalists put such a heavy premium on the paradoxical-seeming claim that a belief is only really worthwhile if you could, in principle, be persuaded to believe otherwise. If your retina ended up in the same state regardless of what light entered it, you would be blind . . . Hence the phrase, “blind faith.” If what you believe doesn’t depend on what you see, you’ve been blinded as effectively as by poking out your eyeballs.
Cihan Baran replied:2
I can not conceive of a situation that would make 2 + 2 = 4 false. Perhaps for that reason, my belief in 2 + 2 = 4 is unconditional.
I admit, I cannot conceive of a “situation” that would make 2 + 2 = 4 false. (There are redefinitions, but those are not “situations,” and then you’re no longer talking about 2, 4, =, or +.) But that doesn’t make my belief unconditional. I find it quite easy to imagine a situation which would convince me that 2 + 2 = 3.
Suppose I got up one morning, and took out two earplugs, and set them down next to two other earplugs on my nighttable, and noticed that there were now three earplugs, without any earplugs having appeared or disappeared—in contrast to my stored memory that 2 + 2 was supposed to equal 4. Moreover, when I visualized the process in my own mind, it seemed that making xx and xx come out to xxxx required an extra x to appear from nowhere, and was, moreover, inconsistent with other arithmetic I visualized, since subtracting xx from xxx left xx, but subtracting xx from xxxx left xxx. This would conflict with my stored memory that 3 - 2 = 1, but memory would be absurd in the face of physical and mental confirmation that xxx - xx = xx.
I would also check a pocket calculator, Google, and perhaps my copy of 1984 where Winston writes that “Freedom is the freedom to say two plus two equals three.” All of these would naturally show that the rest of the world agreed with my current visualization, and disagreed with my memory, that 2 + 2 = 3.
How could I possibly have ever been so deluded as to believe that 2 + 2 = 4? Two explanations would come to mind: First, a neurological fault (possibly caused by a sneeze) had made all the additive sums in my stored memory go up by one. Second, someone was messing with me, by hypnosis or by my being a computer simulation. In the second case, I would think it more likely that they had messed with my arithmetic recall than that 2 + 2 actually equalled 4. Neither of these plausible-sounding explanations would prevent me from noticing that I was very, very, very confused.3
What would convince me that 2 + 2 = 3, in other words, is exactly the same kind of evidence that currently convinces me that 2 + 2 = 4: The evidential crossfire of physical observation, mental visualization, and social agreement.
There was a time when I had no idea that 2 + 2 = 4. I did not arrive at this new belief by random processes—then there would have been no particular reason for my brain to end up storing “2 + 2 = 4” instead of “2 + 2 = 7.” The fact that my brain stores an answer surprisingly similar to what happens when I lay down two earplugs alongside two earplugs, calls forth an explanation of what entanglement produces this strange mirroring of mind and reality.
There are really only two possibilities for a belief of fact—either the belief got there via a mind-reality entangling process, or not. If not, the belief can’t be correct except by coincidence. For beliefs with the slightest shred of internal complexity (requiring a computer program of more than 10 bits to simulate), the space of possibilities is large enough that coincidence vanishes.4
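The arithmetic behind that claim can be made concrete with a minimal sketch—the 10-bit figure comes from the text above; everything else is illustrative:

```python
# A belief whose minimal description needs n bits picks out one point in a
# space of 2**n possible beliefs. If the belief landed there by a process
# uncorrelated with reality, its chance of matching the one true state is
# at most 1 / 2**n.
n_bits = 10
possibilities = 2 ** n_bits
chance_of_coincidental_truth = 1 / possibilities

print(possibilities)                 # 1024
print(chance_of_coincidental_truth)  # 0.0009765625
```

Even at a mere 10 bits the odds of being right by accident are about one in a thousand, and they halve with every additional bit.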
Unconditional facts are not the same as unconditional beliefs. If entangled evidence convinces me that a fact is unconditional, this doesn’t mean I always believed in the fact without need of entangled evidence.
I believe that 2 + 2 = 4, and I find it quite easy to conceive of a situation which would convince me that 2 + 2 = 3. Namely, the same sort of situation that currently convinces me that 2 + 2 = 4. Thus I do not fear that I am a victim of blind faith.5
1See Map and Territory.
2Comment: http://lesswrong.com/lw/jl/what_is_evidence/f7h.
3See “Your Strength as a Rationalist” in Map and Territory.
4For more on belief formation and beliefs of fact, see “Feeling Rational” and “What Is Evidence?” in Map and Territory. For more on belief complexity, see “Occam’s Razor” in the same volume.
5If there are any Christians reading this who know Bayes’s Theorem, might I inquire of you what situation would convince you of the truth of Islam? Presumably it would be the same sort of situation causally responsible for producing your current belief in Christianity: We would push you screaming out of the uterus of a Muslim woman, and have you raised by Muslim parents who continually told you that it is good to believe unconditionally in Islam.
Or is there more to it than that? If so, what situation would convince you of Islam, or at least, non-Christianity? And how confident are you that the general kinds of evidence and reasoning you appeal to would have been enough to dissuade you of your religion if you had been raised a Muslim?
Let possible states of the world be represented by A, B, C, etc. Let's say A is true.
An agent that decides to believe that the world is represented by the theory that comes earliest alphabetically will be fortunate, since it will believe true things, but it isn't discerning at all.
An agent that believes the contents of books when it reads the book's chapters in sequential order and disbelieves the contents of books when it chooses to read the chapters in reverse order is not an agent designed to discern truth, however lucky it gets deciding how to read each book it reads.
I'm just trying to ask to what extent you fall short of an optimal thinker in this particular way, a way no human totally succeeds at; one possibility would be for you to deny that this human tendency is a flaw at all. Some people may be disproportionately influenced by the last book they read, others by the first, others by the ones with nice covers, etc. All I'm trying to get at is whether you agree it's bad to be a decider that is influenced by the order in which it receives information (except to the extent that the order itself constitutes information, though this isn't really an exception).
Someone could claim that truth of a proposition is commensurate with the age of the oldest book containing it, and such a person would not mean what anyone else means by "truth", and would be wrong to the extent they are trying to communicate.
Likewise, truth isn't usually bound to the order of evidence. If I read a pamphlet advocating Islam, and then one advocating Mormonism, I ought to reach the exact same conclusions as if I had read them in the other order. If I don't, I may still happen to believe the correct thing, but that is true of any decision process, even the alphabetical one.
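That order-invariance claim is just the commutativity of Bayesian updating in odds form: each independent piece of evidence multiplies the prior odds by a likelihood ratio, and multiplication doesn't care about order. A toy sketch, with likelihood ratios invented purely for illustration:

```python
def update(prior_odds, likelihood_ratio):
    """One Bayesian update in odds form: posterior odds = prior odds * LR."""
    return prior_odds * likelihood_ratio

# Hypothetical likelihood ratios for the two pamphlets (made-up numbers).
lr_islam_pamphlet = 3.0
lr_mormon_pamphlet = 0.5

prior = 1.0  # even odds, purely illustrative

# Read the Islam pamphlet first, then the Mormon one...
posterior_ab = update(update(prior, lr_islam_pamphlet), lr_mormon_pamphlet)
# ...or read them in the other order.
posterior_ba = update(update(prior, lr_mormon_pamphlet), lr_islam_pamphlet)

print(posterior_ab == posterior_ba)  # True: the product is 1.5 either way
```

An agent whose conclusions depend on reading order is doing something other than this multiplication—which is exactly the flaw being pointed at.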
In the first two quotes above, you seem to disagree with what I say, in the latter two, you seem to agree.
The confusion, I reckon, comes from my inability to step outside myself. I am not a perfect rationalist; I am trapped to an extent by the concepts taught to me since birth, just as I find myself uncomfortable with my gender identity due to growing up in an abusive household. It is difficult to step outside one's own biases. So yes, my bias may be irreparable. As for "unfortunate", the odds of it being an unfortunate bias are exactly the odds of Mormonism being true. If I believe the truth, then I am fortunate. It is the chance that my bias is unf...