Since there's been much questioning of late over "What good is advanced rationality in the real world?", I'd like to remind everyone that it isn't all about post-doctoral-level reductionism.

In particular, as a technique that seems like it ought to be useful in the real world, I exhibit the highly advanced, difficult, multi-component Crisis of Faith aka Reacting To The Damn Evidence aka Actually Changing Your Mind.

A scan through this post and the list of sub-posts at the bottom (EDIT: copied below the fold) should certainly qualify it as "extreme rationality" or "advanced rationality" or "x-rationality" or "Bayescraft" or whatever you want to distinguish from "traditional rationality as passed down from Richard Feynman".

An actual sit-down-for-an-hour Crisis of Faith might be something you'd use only once or twice every year or two, but on important occasions.  And the components are often things that you could practice day in and day out, also to positive effect.

I think this is the strongest foot that I could put forward for "real-world" uses of my essays.  (Anyone care to nominate an alternative?)

Below the fold, I copy and paste the list of components from the original post, so that we have them at hand:

  • Avoiding Your Belief's Real Weak Points - One of the first temptations in a crisis of faith is to doubt the strongest points of your belief, so that you can rehearse your good answers.  You need to seek out the most painful spots, not the arguments that are most reassuring to consider.
  • The Meditation on Curiosity - Roger Zelazny once distinguished between "wanting to be an author" versus "wanting to write", and there is likewise a distinction between wanting to have investigated and wanting to investigate.  It is not enough to say "It is my duty to criticize my own beliefs"; you must be curious, and only uncertainty can create curiosity.  Keeping in mind Conservation of Expected Evidence may help you Update Yourself Incrementally:  For every single point that you consider, and each element of new argument and new evidence, you should not expect your beliefs to shift more (on average) in one direction than another - thus you can be truly curious each time about how it will go.  (A small numerical sketch of this last point appears below the list.)
  • Cached Thoughts and Pirsig's Original Seeing, to prevent standard thoughts from rushing in and completing the pattern.
  • The Litany of Gendlin and the Litany of Tarski:  People can stand what is true, for they are already enduring it.  If a belief is true you will be better off believing it, and if it is false you will be better off rejecting it.  You would advise a religious person to try to visualize fully and deeply the world in which there is no God, and to, without excuses, come to the full understanding that if there is no God then they will be better off believing there is no God.  If they cannot come to accept this on a deep emotional level, they will not be able to have a crisis of faith.  So you should put in a sincere effort to visualize the alternative to your belief, the way that the best and highest skeptic would want you to visualize it.  Think of the effort a religionist would have to put forth to imagine, without corrupting it for their own comfort, an atheist's view of the universe.
  • Make an Extraordinary Effort, for the concept of isshokenmei, the desperate convulsive effort to be rational that it would take to surpass the level of Robert Aumann and all the great scientists throughout history who never let go of their religions.
  • The Genetic Heuristic:  You should be extremely suspicious if you have many ideas suggested by a source that you now know to be untrustworthy, but by golly, it seems that all the ideas still ended up being right.  (E.g., the one concedes that the Bible was written by human hands, but still clings to the idea that it contains indispensable ethical wisdom.)
  • The Importance of Saying "Oops" - it really is less painful to swallow the entire bitter pill in one terrible gulp.
  • Singlethink, the opposite of doublethink.  See the thoughts you flinch away from, that appear in the corner of your mind for just a moment before you refuse to think them.  If you become aware of what you are not thinking, you can think it.
  • Affective Death Spirals and Resist the Happy Death Spiral.  Affective death spirals are prime generators of false beliefs that it will take a Crisis of Faith to shake loose.  But since affective death spirals can also get started around real things that are genuinely nice, you don't have to admit that your belief is a lie, to try and resist the halo effect at every point - refuse false praise even of genuinely nice things.  Policy debates should not appear one-sided.
  • Hold Off On Proposing Solutions until the problem has been discussed as thoroughly as possible without proposing any; make your mind hold off from knowing what its answer will be; and try for five minutes before giving up, both generally, and especially when pursuing the devil's point of view.
  • The sequence on The Bottom Line and Rationalization, which explains why it is always wrong to selectively argue one side of a debate.
  • Positive Bias and motivated skepticism and motivated stopping, lest you selectively look for support, selectively look for counter-counterarguments, and selectively stop the argument before it gets dangerous.  Missing alternatives are a special case of stopping.  A special case of motivated skepticism is fake humility where you bashfully confess that no one can know something you would rather not know.  Don't selectively demand too much authority of counterarguments.
  • Beware of Semantic Stopsigns, Applause Lights, and the choice to Explain/Worship/Ignore.
  • Feel the weight of Burdensome Details; each detail a separate burden, a point of crisis.
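One point in the list above invites a worked example: the Conservation of Expected Evidence claim, that on average you should not expect new evidence to shift your belief in either direction.  Here is a minimal numerical sketch; the hypothesis, likelihoods, and numbers are arbitrary illustrations, not anything from the original posts:

    # Illustrative check of Conservation of Expected Evidence (arbitrary numbers).
    prior = 0.3              # P(H): current degree of belief in some hypothesis H
    p_e_given_h = 0.8        # P(E | H): chance of seeing evidence E if H is true
    p_e_given_not_h = 0.2    # P(E | not-H): chance of seeing E if H is false

    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)    # P(E)
    posterior_if_e = p_e_given_h * prior / p_e                   # P(H | E), by Bayes' rule
    posterior_if_not_e = (1 - p_e_given_h) * prior / (1 - p_e)   # P(H | not-E)

    # The probability-weighted average of the possible posteriors equals the prior,
    # so before looking you cannot expect the evidence to favor either direction.
    expected_posterior = p_e * posterior_if_e + (1 - p_e) * posterior_if_not_e
    assert abs(expected_posterior - prior) < 1e-9
    print(posterior_if_e, posterior_if_not_e, expected_posterior)   # ~0.63, ~0.10, 0.30

Seeing the actual evidence still moves the belief (up to about 0.63 or down to about 0.10 here); what is conserved is only the expectation, which is why you can be genuinely curious about which way it will go.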

I hope you didn't interpret my post Wednesday as saying that nothing you wrote was useful. My only gripe was that people seemed to be talking in terms of "this is absolutely certain to change the world and transform us all into ubermenschen!" and that we should start off more sober. Or maybe no one was really talking that way, and I was misinterpreting people's deliberate hyperbole in terms of my own Happy Death Spiral. But that was all I was arguing against. Thus the admission that rationality should have a .1 correlation with success, and the comment that "Good use of rationality will look more like three percent productivity gain than Napoleon conquering Europe".

I think Crisis of Faith can be good for certain situations, but I am skeptical about it being completely game-changing for a few reasons.

Most smart people already have a naive version of this technique: that when all the evidence is going against them, they need to stop and think about whether their beliefs are right or wrong. For Crisis of Faith to be practically valuable, you need lots of cases where:

(1) EITHER people don't apply the naive technique often enough in situations where it could give practical real-world benefits, and formalizing it will convince them to do it more often,
(2) OR the specific advice you give in the Crisis of Faith post makes a full-blown Crisis of Faith more likely to return the correct answer than the naive technique.
(3) AND once they finish Crisis of Faith, they go through with their decision.

I give (1) low probability. People don't change their religious or political views often enough, but they're often good at changing their practical situations. I've heard many people tell stories of how they stayed up all night agonizing over whether or not to break up with a girlfriend. In many cases I think the difficulty is in reaching the point where you admit "I need to seriously reconsider this." I doubt many people reach a point where they feel uncomfortable about their position on a practical issue but don't take any time to think it over. And the people who would be interested in rationalism are probably exactly the sort of people who currently use the naive technique most often already. I used a naive version of Crisis of Faith for a very important decision in my life before reading OB.

I give (2) high but not overwhelming probability. Yes, all of this ought to work. But I was reading up on evidence-based medicine last night in response to your comment, and one thing that struck me the most was that "ought to work" is a very suspicious phrase. Doctors possessing mountains of accurate information about liver function can say "From what we know about the liver, it would be absolutely absurd for this chemical not to cure liver disease" and then they do a study and it doesn't help at all. With our current state of knowledge and the complexity of the subject, it's easier to make mistakes about rationality than about the liver. Yes, my completely non-empirical gut feeling is that the specific components of Crisis of Faith should work better than just sitting and thinking, but maybe anyone unbiased enough to succeed at Crisis of Faith is unbiased enough to succeed at the naive method. Maybe Crisis of Faith creates too much pressure to reject your old belief in order to signal rationality, even if the old belief was correct.

I give (3) medium probability.
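To make the structure of this estimate concrete, here is a minimal sketch with made-up numbers standing in for "low", "high but not overwhelming", and "medium"; the specific values and the independence assumption are illustrative only, not the commenter's actual figures:

    # Hypothetical stand-ins for the verbal estimates above.
    p1 = 0.1   # "low": naive technique is underused and formalizing it helps
    p2 = 0.7   # "high but not overwhelming": the post's advice beats the naive technique
    p3 = 0.5   # "medium": people go through with the resulting decision

    # (1) OR (2), then AND (3); clauses treated as independent purely for illustration.
    p_practically_valuable = (p1 + p2 - p1 * p2) * p3
    print(round(p_practically_valuable, 2))   # about 0.36 with these made-up numbers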

And all this is an upper bound anyway. The next question is whether some specific training program will teach people to use Crisis of Faith regularly and correctly. I predict that the "training program" of reading about it on Overcoming Bias in most cases does not, but this is easy to test:

Everyone please comment below if (how many times?) you've actually used the formal Crisis of Faith technique in a situation where you wouldn't have questioned something if you hadn't read the Crisis of Faith article. Please also mention whether it was about a practical real-world matter, and whether you ended up changing your mind.

Again, not saying this to prove Crisis of Faith is worthless, just to show that there are factors to be considered beyond its raw value as a technique.

I talked to someone at an OB meetup who did change his mind about something important, using the full-fledged crisis of faith, prompted by Eliezer's post.  It impacted his practical actions in significant ways.  If each of us does one of these every three years, with something equally important, the technique will be paying off hugely.

For myself: I went to a coffee shop shortly after Eliezer's post, with a free afternoon and intent to try the technique. No particular subject matter. I started by making a list of "topics potentially warranting crises of faith", which itself probably made me more aware of existing gaps in my thinking. Then I picked the most emotionally difficult topic from my list, got a bit of the way in... and changed my mind about whether I wanted to think that one through. Then I picked another topic (also a practical, real-world matter) and... got partway through, farther than above, with some but not huge change to my beliefs and practices. I should try the procedure again.

Like Yvain, I'd love to hear others' experiences with the Crisis of Faith technique, positive or negative.

Just to add more data to my experience: I broke the technique into steps when I tried it. The steps did seem helpful. But maybe someone who used the technique more successfully could give us better lines of approach. (A hypothetical script version of these steps is sketched after the list.) The steps I used (I wrote out answers to each step, since writing or speaking aloud to myself makes it easier to follow out difficult lines of thought):

Step 1: Pick a question.

Step 2: Find a reason to care about actually having an accurate answer. Find something to protect that hinges on believing whatever it is that’s accurate about this question, whether or not that accurate answer turns out to be my current belief.

Step 3: Notice any reasons I might want to stick to my current belief, even if that belief turns out to be untrue. See if they in fact outweigh the reasons to want the actual answer.

Step 4: Create doubt or curiosity: Find my current belief’s weakest points, or the points I am most afraid to consider. Go through a list of outside people I respect, and ask what points they might balk at in my beliefs. Ask if my belief has anything in common with past errors I’ve made, or if an uncharitable stranger might think so. Ask if anything I'm saying to myself makes me feel squicky. Brainstorm. Ask what the space of alternatives might look like.

Step 5: Actually do the Crisis of Faith technique. [Except that I didn't get to this step.]
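Purely as a hypothetical sketch (the commenter only describes writing answers out by hand; the script and its filename are illustrative assumptions), the same sequence could be run as a small prompt-and-record loop, with each prompt restating the corresponding step above:

    # Hypothetical worksheet script: prompts for a written answer to each step
    # and appends the answers to a notes file.
    STEPS = [
        "Pick a question.",
        "Find a reason to care about actually having an accurate answer, "
        "whatever it turns out to be.",
        "Notice reasons you might want to stick to your current belief even if it is untrue, "
        "and ask whether they outweigh wanting the actual answer.",
        "Create doubt or curiosity: find the belief's weakest points, ask what respected "
        "outsiders would balk at, look for resemblance to past errors, brainstorm alternatives.",
        "Actually do the Crisis of Faith technique.",
    ]

    def run_crisis_worksheet(path="crisis_of_faith_notes.txt"):
        with open(path, "a") as notes:
            for i, step in enumerate(STEPS, start=1):
                print(f"\nStep {i}: {step}")
                answer = input("Written answer: ")
                notes.write(f"Step {i}: {step}\nAnswer: {answer}\n\n")

    if __name__ == "__main__":
        run_crisis_worksheet()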

Personally, needing a crisis of faith seems to me like a symptom of bad thinking. If you hold an idea so tightly that you need a crisis of faith, then maybe the real problem is that you have any belief held so strongly that it takes that much effort to overcome it. Aside from human-level, in-the-moment stupidity during a discussion (e.g. supporting a position despite contrary evidence because it's the side you picked at the start), a master rationalist shouldn't need to have a crisis of faith.

The last thing that felt like a crisis of faith to me happened when I was 18. Since then I haven't held onto any belief so tightly that a normal amount of rationality effort couldn't change my mind.

That would be all very well for master rationalists so expert that they have no beliefs that might require a crisis of faith. I don't happen to know any of those; do you? I would be skeptical about anyone (yourself included; I hope you aren't offended) who claims to have none: how do you know you aren't merely failing to notice some tightly-held beliefs?

Fair enough. First, "master rationalist" probably overstates what's required; rather, you just need to work towards a "mind like water" state, flowing in whichever way the evidence directs it, and once you get close enough to that state the need for a true crisis of faith should disappear.

As for myself and others who might claim to have no crises of faith, it's a fair question to ask whether we're simply not seeing them. It's entirely possible that I hold some beliefs so tightly that I don't even see them as beliefs, but as truths about the world. However, I have had many experiences which you might identify as crises of faith (although I didn't handle any of them with much in the way of rationality, and it was luck as much as anything else that I ended up in the state of mind I did), and I have not since encountered anything that has led me to a crisis of faith. Given the huge amount I have learned since that last crisis of faith, I consider the odds of my having another one low.

There is one caveat I should mention, though: this whole issue may come down to a matter of perspective. To me, it's not a real crisis of faith unless you have to change your entire world view. Up until my last crisis of faith, I spent a lot of time thinking about how everything fit together in the universe. But then I had the realization, which came over me like a wave but soaked into me very slowly, that all that really mattered was the evidence. So compared to that experience, nothing else has felt worthy of being called a crisis of faith.

That said, the method that Eliezer outlines seems to be a good one to follow. I have applied several of the techniques described with good success. So now that I think of it, maybe it is all just a matter of differences in what we really consider to be a crisis.

I didn't say you didn't have good rationalist techniques. I said I don't know whether good rationalist techniques lead to practical benefits.

A more formal training program might do better than just reading about the technique, but this would still be the most serious bottleneck.