It ain’t a true crisis of faith unless things could just as easily go either way.
—Thor Shenkel
Many in this world retain beliefs whose flaws a ten-year-old could point out, if that ten-year-old were hearing the beliefs for the first time. These are not subtle errors we’re talking about. They would be child's play for an unattached mind to relinquish, if the skepticism of a ten-year-old were applied without evasion. As Premise Checker put it, "Had the idea of god not come along until the scientific age, only an exceptionally weird person would invent such an idea and pretend that it explained anything."
And yet skillful scientific specialists, even the major innovators of a field, even in this very day and age, do not apply that skepticism successfully. Nobel laureate Robert Aumann, of Aumann’s Agreement Theorem, is an Orthodox Jew: I feel reasonably confident in venturing that Aumann must, at one point or another, have questioned his faith. And yet he did not doubt successfully. We change our minds less often than we think.
This should scare you down to the marrow of your bones. It means you can be a world-class scientist and conversant with Bayesian mathematics and still fail to reject a belief whose absurdity a fresh-eyed ten-year-old could see. It shows the invincible defensive position which a belief can create for itself, if it has long festered in your mind.
What does it take to defeat an error that has built itself a fortress?
But by the time you know it is an error, it is already defeated. The dilemma is not “How can I reject long-held false belief X?” but “How do I know if long-held belief X is false?” Self-honesty is at its most fragile when we’re not sure which path is the righteous one. And so the question becomes:
How can we create in ourselves a true crisis of faith, that could just as easily go either way?
Religion is the trial case we can all imagine.2 But if you have cut off all sympathy and now think of theists as evil mutants, then you won’t be able to imagine the real internal trials they face. You won’t be able to ask the question:
What general strategy would a religious person have to follow in order to escape their religion?
I’m sure that some, looking at this challenge, are already rattling off a list of standard atheist talking points—“They would have to admit that there wasn’t any Bayesian evidence for God’s existence,” “They would have to see the moral evasions they were carrying out to excuse God’s behavior in the Bible,” “They need to learn how to use Occam’s Razor—”1
Wrong! Wrong wrong wrong! This kind of rehearsal, where you just cough up points you already thought of long before, is exactly the style of thinking that keeps people within their current religions. If you stay with your cached thoughts, if your brain fills in the obvious answer so fast that you can't see originally, you surely will not be able to conduct a crisis of faith.
Maybe it’s just a question of not enough people reading Gödel, Escher, Bach at a sufficiently young age, but I’ve noticed that a large fraction of the population—even technical folk—have trouble following arguments that go this meta.3 On my more pessimistic days I wonder if the camel has two humps.
Even when it’s explicitly pointed out, some people seemingly cannot follow the leap from the object-level “Use Occam’s Razor! You have to see that your God is an unnecessary belief!” to the meta-level “Try to stop your mind from completing the pattern the usual way!” Because in the same way that all your rationalist friends talk about Occam’s Razor like it’s a good thing, and in the same way that Occam’s Razor leaps right up into your mind, so too, the obvious friend-approved religious response is “God’s ways are mysterious and it is presumptuous to suppose that we can understand them.” So for you to think that the general strategy to follow is “Use Occam’s Razor,” would be like a theist saying that the general strategy is to have faith.
“But—but Occam’s Razor really is better than faith! That’s not like preferring a different flavor of ice cream! Anyone can see, looking at history, that Occamian reasoning has been far more productive than faith—”
Which is all true. But beside the point. The point is that you, saying this, are rattling off a standard justification that’s already in your mind. The challenge of a crisis of faith is to handle the case where, possibly, our standard conclusions are wrong and our standard justifications are wrong. So if the standard justification for X is “Occam’s Razor!” and you want to hold a crisis of faith around X, you should be questioning if Occam’s Razor really endorses X, if your understanding of Occam’s Razor is correct, and—if you want to have sufficiently deep doubts—whether simplicity is the sort of criterion that has worked well historically in this case, or could reasonably be expected to work, et cetera. If you would advise a religionist to question their belief that “faith” is a good justification for X, then you should advise yourself to put forth an equally strong effort to question your belief that “Occam’s Razor” is a good justification for X.4
If “Occam’s Razor!” is your usual reply, your standard reply, the reply that all your friends give—then you’d better block your brain from instantly completing that pattern, if you’re trying to instigate a true crisis of faith.
Better to think of such rules as, “Imagine what a skeptic would say—and then imagine what they would say to your response—and then imagine what else they might say, that would be harder to answer.”
Or, “Try to think the thought that hurts the most.”
And above all, the rule:
Put forth the same level of desperate effort that it would take for a theist to reject their religion.
Because if you aren’t trying that hard, then—for all you know—your head could be stuffed full of nonsense as bad as religion.
Without a convulsive, wrenching effort to be rational, the kind of effort it would take to throw off a religion—then how dare you believe anything, when Robert Aumann believes in God?
Someone (I forget who) once observed that people had only until a certain age to reject their religious faith. Afterward they would have answers to all the objections, and it would be too late. That is the kind of existence you must surpass. This is a test of your strength as a rationalist, and it is very severe; but if you cannot pass it, you will be weaker than a ten-year-old.
But again, by the time you know a belief is an error, it is already defeated. So we’re not talking about a desperate, convulsive effort to undo the effects of a religious upbringing, after you’ve come to the conclusion that your religion is wrong. We’re talking about a desperate effort to figure out if you should be throwing off the chains, or keeping them. Self-honesty is at its most fragile when we don’t know which path we’re supposed to take—that’s when rationalizations are not obviously sins.
Not every doubt calls for staging an all-out Crisis of Faith. But you should consider it when:
- A belief has long remained in your mind;
- It is surrounded by a cloud of known arguments and refutations;
- You have sunk costs in it (time, money, public declarations);
- The belief has emotional consequences (note this does not make it wrong);
- It has gotten mixed up in your personality generally.
None of these warning signs are immediate disproofs. These attributes place a belief at risk for all sorts of dangers, and make it very hard to reject when it is wrong. And they hold for Richard Dawkins’s belief in evolutionary biology, not just the Pope’s Catholicism.
Nor does this mean that we’re only talking about different flavors of ice cream. Two beliefs can inspire equally deep emotional attachments without having equal evidential support. The point is not to have shallow beliefs, but to have a map that reflects the territory.
I emphasize this, of course, so that you can admit to yourself, “My belief has these warning signs,” without having to say to yourself, “My belief is false.”
But what these warning signs do mark is a belief that will take more than an ordinary effort to doubt effectively. It will take more than an ordinary effort to doubt in such a way that if the belief is in fact false, you will in fact reject it. And where you cannot doubt in this way, you are blind, because your brain will hold the belief unconditionally. When a retina sends the same signal regardless of the photons entering it, we call that eye blind.
When should you stage a Crisis of Faith?
Again, think of the advice you would give to a theist: If you find yourself feeling a little unstable inwardly, but trying to rationalize reasons the belief is still solid, then you should probably stage a Crisis of Faith. If the belief is as solidly supported as gravity, you needn’t bother—but think of all the theists who would desperately want to conclude that God is as solid as gravity. So try to imagine what the skeptics out there would say to your “solid as gravity” argument. Certainly, one reason you might fail at a crisis of faith is that you never really sit down and question in the first place—that you never say, “Here is something I need to put effort into doubting properly.”
If your thoughts get that complicated, you should go ahead and stage a Crisis of Faith. Don’t try to do it haphazardly; don’t try it in an ad-hoc spare moment. Don’t rush to get it done with quickly, so that you can say, “I have doubted, as I was obliged to do.” That wouldn’t work for a theist, and it won’t work for you either. Rest up the previous day, so you’re in good mental condition. Allocate some uninterrupted hours. Find somewhere quiet to sit down. Clear your mind of all standard arguments; try to see from scratch. And make a desperate effort to put forth a true doubt that would destroy a false—and only a false—deeply held belief.
Elements of the Crisis of Faith technique have been scattered over many essays:
- Avoiding Your Belief’s Real Weak Points—One of the first temptations in a crisis of faith is to doubt the strongest points of your belief, so that you can rehearse your good answers. You need to seek out the most painful spots, not the arguments that are most reassuring to consider.
- The Meditation on Curiosity—Roger Zelazny once distinguished between “wanting to be an author” versus “wanting to write,” and there is likewise a distinction between wanting to have investigated and wanting to investigate. It is not enough to say, “It is my duty to criticize my own beliefs”; you must be curious, and only uncertainty can create curiosity. Keeping in mind conservation of expected evidence (spelled out just after this list) may help you update yourself incrementally: for every single point that you consider, and each element of new argument and new evidence, you should not expect your beliefs to shift more (on average) in one direction than another. Thus you can be truly curious each time about how it will go.
- Original Seeing—To prevent standard cached thoughts from rushing in and completing the pattern.
- The Litany of Gendlin and the Litany of Tarski—People can stand what is true, for they are already enduring it. If a belief is true, you will be better off believing it, and if it is false, you will be better off rejecting it. You would advise a religious person to try to visualize fully and deeply the world in which there is no God, and to, without excuses, come to the full understanding that if there is no God then they will be better off believing there is no God. If one cannot come to accept this on a deep emotional level, one will not be able to have a crisis of faith. So you should put in a sincere effort to visualize the alternative to your belief, the way that the best and highest skeptic would want you to visualize it. Think of the effort a religionist would have to put forth to imagine, without corrupting it for their own comfort, an atheist’s view of the universe.
- Tsuyoku Naritai!—The drive to become stronger.
- The Genetic Heuristic—You should be extremely suspicious if you have many ideas suggested by a source that you now know to be untrustworthy, but by golly, it seems that all the ideas still ended up being right.
- The Importance of Saying “Oops”—It really is less painful to swallow the entire bitter pill in one terrible gulp.
- Singlethink—The opposite of doublethink. See the thoughts you flinch away from, that appear in the corner of your mind for just a moment before you refuse to think them. If you become aware of what you are not thinking, you can think it.
- Affective Death Spirals and Resist the Happy Death Spiral—Affective death spirals are prime generators of false beliefs that it will take a Crisis of Faith to shake loose. But since affective death spirals can also get started around real things that are genuinely nice, you don’t have to admit that your belief is a lie, to try and resist the halo effect at every point—refuse false praise even of genuinely nice things. Policy debates should not appear one-sided.
- Hold Off On Proposing Solutions—Don’t propose any solutions until the problem has been discussed as thoroughly as possible. Make your mind hold off on knowing what its answer will be; and try for five minutes before giving up—both generally, and especially when pursuing the devil’s point of view.
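(To spell out the conservation-of-expected-evidence point from above: it is just the law of total probability, nothing new. If H is the belief in question and E is the evidence you are about to examine, then P(H) = P(E) P(H|E) + P(¬E) P(H|¬E). Your current credence is already the probability-weighted average of the posteriors you might end up with, so any expected shift upward on observing E is exactly balanced by an expected shift downward on observing ¬E. That is why each individual look at the evidence can be approached with genuine curiosity about which way it will move you.)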
And these standard techniques, discussed in How to Actually Change Your Mind and Map and Territory, are particularly relevant:
- The sequence on the bottom line and rationalization, which explains why it is always wrong to selectively argue one side of a debate.
- Positive bias, motivated skepticism, and motivated stopping, lest you selectively look for support, selectively look for counter-counterarguments, and selectively stop the argument before it gets dangerous. Missing alternatives are a special case of stopping. A special case of motivated skepticism is fake humility, where you bashfully confess that no one can know something you would rather not know. Don’t selectively demand too much authority of counterarguments.
- Beware of semantic stopsigns, applause lights, and the choice between explaining, worshiping, and ignoring something.
- Feel the weight of burdensome details—each detail a separate burden, a point of crisis.
But really, there’s rather a lot of relevant material, here and on Overcoming Bias. There are ideas I have yet to properly introduce. There is the concept of isshokenmei—the desperate, extraordinary, convulsive effort to be rational. The effort that it would take to surpass the level of Robert Aumann and all the great scientists throughout history who never broke free of their faiths.
The Crisis of Faith is only the critical point and sudden clash of the longer isshokenmei—the lifelong uncompromising effort to be so incredibly rational that you rise above the level of stupid damn mistakes. It’s when you get a chance to use the skills that you’ve been practicing for so long, all-out against yourself.
I wish you the best of luck against your opponent. Have a wonderful crisis!
1See “Occam’s Razor” (in Map and Territory).
2Readers born to atheist parents have missed out on a fundamental life trial, and must make do with the poor substitute of thinking of their religious friends.
3See “Archimedes’s Chronophone” (http://lesswrong.com/lw/h5/archimedess_chronophone) and “Chronophone Motivations” (http://lesswrong.com/lw/h6/chronophone_motivations).
4Think of all the people out there who don’t understand the Minimum Description Length or Solomonoff induction formulations of Occam’s Razor, who think that Occam’s Razor outlaws many-worlds or the simulation hypothesis. They would need to question their formulations of Occam’s Razor and their notions of why simplicity is a good thing. Whatever X in contention you just justified by saying “Occam’s Razor!” is, I bet, not the same level of Occamian slam dunk as gravity.
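(For concreteness, and as a reminder rather than anything new here: the Solomonoff formulation assigns a hypothesis a prior on the order of 2^-K, where K is the length in bits of the shortest program that reproduces your observations, and Minimum Description Length likewise charges a hypothesis for the bits needed to state it plus the bits needed to encode the data given it. On these formulations, complexity is measured in the laws you must postulate, not in how much stuff those laws generate, which is why neither one automatically outlaws many-worlds or large simulated universes.)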