In “What is Evidence?” I wrote:1
This is why rationalists put such a heavy premium on the paradoxical-seeming claim that a belief is only really worthwhile if you could, in principle, be persuaded to believe otherwise. If your retina ended up in the same state regardless of what light entered it, you would be blind . . . Hence the phrase, “blind faith.” If what you believe doesn’t depend on what you see, you’ve been blinded as effectively as by poking out your eyeballs.
Cihan Baran replied:2
I can not conceive of a situation that would make 2 + 2 = 4 false. Perhaps for that reason, my belief in 2 + 2 = 4 is unconditional.
I admit, I cannot conceive of a “situation” that would make 2 + 2 = 4 false. (There are redefinitions, but those are not “situations,” and then you’re no longer talking about 2, 4, =, or +.) But that doesn’t make my belief unconditional. I find it quite easy to imagine a situation which would convince me that 2 + 2 = 3.
Suppose I got up one morning, and took out two earplugs, and set them down next to two other earplugs on my nighttable, and noticed that there were now three earplugs, without any earplugs having appeared or disappeared—in contrast to my stored memory that 2 + 2 was supposed to equal 4. Moreover, when I visualized the process in my own mind, it seemed that making xx and xx come out to xxxx required an extra x to appear from nowhere, and was, moreover, inconsistent with other arithmetic I visualized, since subtracting xx from xxx left xx, but subtracting xx from xxxx left xxx. This would conflict with my stored memory that 3 - 2 = 1, but memory would be absurd in the face of physical and mental confirmation that xxx - xx = xx.
I would also check a pocket calculator, Google, and perhaps my copy of 1984 where Winston writes that “Freedom is the freedom to say two plus two equals three.” All of these would naturally show that the rest of the world agreed with my current visualization, and disagreed with my memory, that 2 + 2 = 3.
How could I possibly have ever been so deluded as to believe that 2 + 2 = 4? Two explanations would come to mind: First, a neurological fault (possibly caused by a sneeze) had made all the additive sums in my stored memory go up by one. Second, someone was messing with me, by hypnosis or by my being a computer simulation. In the second case, I would think it more likely that they had messed with my arithmetic recall than that 2 + 2 actually equalled 4. Neither of these plausible-sounding explanations would prevent me from noticing that I was very, very, very confused.3
What would convince me that 2 + 2 = 3, in other words, is exactly the same kind of evidence that currently convinces me that 2 + 2 = 4: The evidential crossfire of physical observation, mental visualization, and social agreement.
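A minimal sketch of that crossfire (a toy model invented for illustration, not part of the essay; the function names and the particular checks are my own): a stored sum survives only when physical counting, mental arithmetic, and outside sources all agree with it.

```python
# Toy sketch of the "evidential crossfire" (illustration only; the function
# names and checks are invented, not from the essay). A stored sum is kept
# only if physical counting, mental arithmetic, and outside sources agree.

def physical_count(left_pile, right_pile):
    """Count what is actually on the night table once the piles sit together."""
    return len(left_pile + right_pile)

def crossfire_check(stored_sum, observations):
    """Compare a remembered sum against several independent observations."""
    if all(obs == stored_sum for obs in observations):
        return "belief retained"
    if all(obs != stored_sum for obs in observations):
        return "notice that you are very, very confused; revise the memory"
    return "mixed evidence; investigate further"

earplugs = ["x", "x"]
observations = [
    physical_count(earplugs, earplugs),  # physical observation: 4
    2 + 2,                               # stand-in for mental visualization: 4
    4,                                   # calculator / Google / social agreement
]
print(crossfire_check(stored_sum=4, observations=observations))  # "belief retained"
```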
There was a time when I had no idea that 2 + 2 = 4. I did not arrive at this new belief by random processes—then there would have been no particular reason for my brain to end up storing “2 + 2 = 4” instead of “2 + 2 = 7.” The fact that my brain stores an answer surprisingly similar to what happens when I lay down two earplugs alongside two earplugs calls forth an explanation of what entanglement produces this strange mirroring of mind and reality.
There are really only two possibilities for a belief of fact: either the belief got there via a mind-reality entangling process, or it didn’t. If not, the belief can’t be correct except by coincidence. For beliefs with the slightest shred of internal complexity (requiring a computer program of more than 10 bits to simulate), the space of possibilities is large enough that coincidence vanishes.4
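A rough back-of-the-envelope illustration of why coincidence vanishes, using the 10-bit figure from the paragraph above (treating the unentangled process as a uniform lucky guess is a simplifying assumption, not a claim from the text):

```python
# If a belief takes more than 10 bits to specify, a process with no
# entanglement with reality lands on the right one by luck with probability
# at most 2**-10 (assuming, for simplicity, a uniform guess over all
# 10-bit programs).
complexity_bits = 10
p_lucky = 2 ** -complexity_bits
print(f"chance of being right by coincidence: {p_lucky:.4%}")  # 0.0977%
```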
Unconditional facts are not the same as unconditional beliefs. If entangled evidence convinces me that a fact is unconditional, this doesn’t mean I always believed in the fact without need of entangled evidence.
I believe that 2 + 2 = 4, and I find it quite easy to conceive of a situation which would convince me that 2 + 2 = 3. Namely, the same sort of situation that currently convinces me that 2 + 2 = 4. Thus I do not fear that I am a victim of blind faith.5
1See Map and Territory.
2Comment: http://lesswrong.com/lw/jl/what_is_evidence/f7h.
3See “Your Strength as a Rationalist” in Map and Territory.
4For more on belief formation and beliefs of fact, see “Feeling Rational” and “What Is Evidence?” in Map and Territory. For more on belief complexity, see “Occam’s Razor” in the same volume.
5If there are any Christians reading this who know Bayes’s Theorem, might I inquire of you what situation would convince you of the truth of Islam? Presumably it would be the same sort of situation causally responsible for producing your current belief in Christianity: We would push you screaming out of the uterus of a Muslim woman, and have you raised by Muslim parents who continually told you that it is good to believe unconditionally in Islam.
Or is there more to it than that? If so, what situation would convince you of Islam, or at least, non-Christianity? And how confident are you that the general kinds of evidence and reasoning you appeal to would have been enough to dissuade you of your religion if you had been raised a Muslim?
Edit: I meant to cover this point first, but I left it out before.
This really isn't how it works. Absence of evidence is evidence, and its strength is proportional to how strongly you would expect to see evidence if the proposition were true. If, for example, you propose that there is an elephant in a room, and then you investigate the room and find no sign of an elephant, that is very strong evidence that there is no elephant in the room. But if you propose that there is a mouse in the room, and you investigate and see no sign of the mouse, that is only weak evidence that there is no mouse. You will still have to update your confidence that there is a mouse in the room downwards, but much, much less than in the case of the elephant.
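To make the asymmetry concrete, here is a small Bayes'-rule sketch. The probabilities are invented purely for illustration; only the structure of the update is the point.

```python
# Posterior probability that the animal is in the room, given that we
# searched and saw no sign of it. All numbers are hypothetical.

def posterior_given_no_sign(prior, p_no_sign_if_present, p_no_sign_if_absent=1.0):
    """P(present | no sign observed), by Bayes' rule."""
    numerator = p_no_sign_if_present * prior
    denominator = numerator + p_no_sign_if_absent * (1.0 - prior)
    return numerator / denominator

prior = 0.5  # assumed even prior in both cases

# An elephant in the room would almost certainly leave visible signs,
# so seeing none is a drastic update downward.
print(posterior_given_no_sign(prior, p_no_sign_if_present=0.01))  # ~0.01

# A mouse could easily be present and leave no visible sign,
# so seeing none barely moves the estimate.
print(posterior_given_no_sign(prior, p_no_sign_if_present=0.80))  # ~0.44
```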
In both the case of the elephant and the mouse, actually observing the elephant or mouse would be extremely strong evidence; you could still be wrong if you were hallucinating or someone had contrived an extremely clever illusion of either, but the observation would still force you to greatly strengthen your probability estimate that an elephant or mouse is in the room. It's psychologically compelling to try to generalize this into a broad principle, that positive evidence is always stronger, but in fact, as with the case of the elephant, negative evidence can reach arbitrarily high strengths depending on how strong the expectation of evidence is. Likewise, positive evidence can reach arbitrarily low strengths depending on how likely it is that the observation would be forthcoming without the proposition being true. For instance, if an alleged psychic describes a crime scene, and the police confirm that the description is accurate, this is not strong evidence that the psychic had any sort of vision of the scene if their description is statistically likely to apply to any crime scene of that type.
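A complementary sketch for the positive-evidence side, again with invented numbers: the strength of a confirmation is its likelihood ratio, which can be nearly 1 for the psychic's "accurate" description and enormous for directly seeing the elephant.

```python
# Bayes factor: how much an observation shifts the odds toward the hypothesis.
# The probabilities below are hypothetical placeholders.

def likelihood_ratio(p_obs_if_true, p_obs_if_false):
    return p_obs_if_true / p_obs_if_false

# A generic crime-scene description "confirmed" by police: it would very
# likely fit the scene whether or not the psychic had a vision.
print(likelihood_ratio(p_obs_if_true=0.95, p_obs_if_false=0.90))    # ~1.06, nearly worthless

# Directly observing an elephant in the room: hallucination or an elaborate
# illusion is possible but rare, so the ratio is huge.
print(likelihood_ratio(p_obs_if_true=0.99, p_obs_if_false=0.0001))  # 9900, very strong
```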
The defenses you've linked are extremely weak. Apologists for any religion can rationalize this degree of agreement with the evidence, but the fact remains that, given what we know about Mesoamerican civilization, the Book of Mormon does not remotely resemble what we would expect a legitimate text from that time and place to look like; the most we can say is that it is not strictly impossible.
If you're already strongly invested in a religious narrative being true, then something like
may seem like an adequate defense, but a person who is merely impartial to the religion will simply ask "How likely is that?" When animals such as cattle are raised domestically for food, archaeologists consistently find concentrations of their remains in human settlements along with food refuse, and there is no evidence whatsoever of bison being domesticated in Mesoamerica, or anywhere else in premodern America. Besides which, this was over twenty years after the Lewis and Clark expedition, so Joseph Smith should have been quite aware of the existence of buffalo. The answer, then, seems to be "very unlikely." Other defenses given on that page are similarly uncompelling.
I recommend checking out this article. It's about martial arts, but it generalizes extremely well. Once you become personally invested in a set of beliefs, the standards you demand of arguments in its defense will be much lower than those of a person without the same investment. Works of apologetics such as the ones you've linked may satisfy a believer and let them keep their package of beliefs, but that is very different from giving an impartial reader reason to adopt them.
Having read a considerable number of works of apologetics for various religions, I cannot say that Mormonism stands out for having an atypical degree of support. It is at best typical, and the evidential standards among religions are already extremely low.
Your point is well taken, and I will meditate upon it. Thank you.