Since there's been much questioning of late over "What good is advanced rationality in the real world?", I'd like to remind everyone that it isn't all about post-doctoral-level reductionism.
In particular, as a technique that seems like it ought to be useful in the real world, I exhibit the highly advanced, difficult, multi-component Crisis of Faith aka Reacting To The Damn Evidence aka Actually Changing Your Mind.
Scanning through this post and the list of sub-posts at the bottom (EDIT: copied to below the fold) should certainly qualify it as "extreme rationality" or "advanced rationality" or "x-rationality" or "Bayescraft" or whatever you want to distinguish from "traditional rationality as passed down from Richard Feynman".
An actual sit-down-for-an-hour Crisis of Faith might be something you'd only use once or twice every year or two, but on important occasions. And the components are often things that you could practice day in and day out, also to positive effect.
I think this is the strongest foot that I could put forward for "real-world" uses of my essays. (Anyone care to nominate an alternative?)
Below the fold, I copy and paste the list of components from the original post, so that we have them at hand:
- Avoiding Your Belief's Real Weak Points - One of the first temptations in a crisis of faith is to doubt the strongest points of your belief, so that you can rehearse your good answers. You need to seek out the most painful spots, not the arguments that are most reassuring to consider.
- The Meditation on Curiosity - Roger Zelazny once distinguished between "wanting to be an author" versus "wanting to write", and there is likewise a distinction between wanting to have investigated and wanting to investigate. It is not enough to say "It is my duty to criticize my own beliefs"; you must be curious, and only uncertainty can create curiosity. Keeping in mind Conservation of Expected Evidence may help you Update Yourself Incrementally: for every single point that you consider, and each element of new argument and new evidence, you should not expect your beliefs to shift more (on average) in one direction than another (a short derivation of this identity appears after the list) - thus you can be truly curious each time about how it will go.
- Cached Thoughts and Pirsig's Original Seeing, to prevent standard thoughts from rushing in and completing the pattern.
- The Litany of Gendlin and the Litany of Tarski: People can stand what is true, for they are already enduring it. If a belief is true you will be better off believing it, and if it is false you will be better off rejecting it. You would advise a religious person to try to visualize fully and deeply the world in which there is no God, and to come, without excuses, to the full understanding that if there is no God then they will be better off believing there is no God. If they cannot accept this on a deep emotional level, they will not be able to have a crisis of faith. So you should put in a sincere effort to visualize the alternative to your belief, the way that the best and highest skeptic would want you to visualize it. Think of the effort a religionist would have to put forth to imagine, without corrupting it for their own comfort, an atheist's view of the universe.
- Make an Extraordinary Effort, for the concept of isshokenmei, the desperate convulsive effort to be rational that it would take to surpass the level of Robert Aumann and all the great scientists throughout history who never let go of their religions.
- The Genetic Heuristic: You should be extremely suspicious if you have many ideas suggested by a source that you now know to be untrustworthy, but by golly, it seems that all the ideas still ended up being right. (E.g., someone concedes that the Bible was written by human hands, but still clings to the idea that it contains indispensable ethical wisdom.)
- The Importance of Saying "Oops" - it really is less painful to swallow the entire bitter pill in one terrible gulp.
- Singlethink, the opposite of doublethink. See the thoughts you flinch away from, that appear in the corner of your mind for just a moment before you refuse to think them. If you become aware of what you are not thinking, you can think it.
- Affective Death Spirals and Resist the Happy Death Spiral. Affective death spirals are prime generators of false beliefs that it will take a Crisis of Faith to shake loose. But since affective death spirals can also get started around real things that are genuinely nice, you don't have to admit that your belief is a lie in order to resist the halo effect at every point - refuse false praise even of genuinely nice things. Policy debates should not appear one-sided.
- Hold Off On Proposing Solutions until the problem has been discussed as thoroughly as possible without proposing any; make your mind hold off from knowing what its answer will be; and try for five minutes before giving up, both generally, and especially when pursuing the devil's point of view.
- The sequence on The Bottom Line and Rationalization, which explains why it is always wrong to selectively argue one side of a debate.
- Positive Bias and motivated skepticism and motivated stopping, lest you selectively look for support, selectively look for counter-counterarguments, and selectively stop the argument before it gets dangerous. Missing alternatives are a special case of stopping. A special case of motivated skepticism is fake humility, where you bashfully confess that no one can know something you would rather not know. Don't selectively demand too much authority of counterarguments.
- Beware of Semantic Stopsigns, Applause Lights, and the choice to Explain/Worship/Ignore.
- Feel the weight of Burdensome Details; each detail a separate burden, a point of crisis.
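As a small aside on the Conservation of Expected Evidence point above, here is a minimal sketch of the underlying identity in standard probability notation (this derivation is mine, not copied from the original post): the prior equals the expectation of the posterior, so anticipated updates must balance out.

```latex
% Conservation of Expected Evidence: the expected posterior equals the prior.
% H is the hypothesis, E the anticipated evidence, with 0 < P(E) < 1.
\begin{align}
\mathbb{E}\!\left[P(H \mid E)\right]
  &= P(E)\,P(H \mid E) + P(\neg E)\,P(H \mid \neg E) \\
  &= P(H \wedge E) + P(H \wedge \neg E) \\
  &= P(H).
\end{align}
```

So if you can already predict which direction a line of inquiry will push your belief, you have information you should have updated on already; the genuinely curious stance is not knowing which way it will go.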
I didn't say you didn't have good rationalist techniques. I said I don't know whether good rationalist techniques lead to practical benefits.
Most smart people already have a naive version of this technique: that when all the evidence is going against them, they need to stop and think about whether their beliefs are right or wrong. To show that Crisis of Faith is practically valuable, you need evidence that:
(1) EITHER people don't apply the naive technique often enough in situations where it could give practical real-world benefits, and formalizing it will convince them to do it more often,
(2) OR the specific advice you give in the Crisis of Faith post makes a full-blown Crisis of Faith more likely to return the correct answer than the naive technique,
(3) AND, once they finish a Crisis of Faith, they go through with their decision.
I give (1) low probability. People don't change their religious or political views often enough, but they're often good at changing their practical situations. I've heard many people tell stories of how they stayed up all night agonizing over whether or not to break up with a girlfriend. In many cases I think the difficulty is in reaching the point where you admit "I need to seriously reconsider this." I doubt many people reach a point where they feel uncomfortable about their position on a practical issue but don't take any time to think it over. And the people who would be interested in rationalism are probably exactly the sort of people who currently use the naive technique most often already. I used a naive version of Crisis of Faith for a very important decision in my life before reading OB.
I give (2) high but not overwhelming probability. Yes, all of this ought to work. But I was reading up on evidence-based medicine last night in response to your comment, and one thing that struck me the most was that "ought to work" is a very suspicious phrase. Doctors possessing mountains of accurate information about liver function can say "From what we know about the liver, it would be absolutely absurd for this chemical not to cure liver disease" and then they do a study and it doesn't help at all. With our current state of knowledge and the complexity of the subject, it's easier to make mistakes about rationality than about the liver. Yes, my completely non-empirical gut feeling is that the specific components of Crisis of Faith should work better than just sitting and thinking, but maybe anyone unbiased enough to succeed at Crisis of Faith is unbiased enough to succeed at the naive method. Maybe Crisis of Faith creates too much pressure to reject your old belief in order to signal rationality, even if the old belief was correct.
I give (3) medium probability.
But all this is an upper bound. The question not considered is whether some specific training program will teach people to use Crisis of Faith regularly and correctly. I predict that the "training program" of reading about it on Overcoming Bias will not, in most cases, but this is easy to test:
Everyone, please comment below if (and how many times) you've actually used the full-blown formal Crisis of Faith technique in a situation where you wouldn't have questioned something if you hadn't read the Crisis of Faith article. Please also mention whether it was about a practical real-world matter, and whether you ended up changing your mind.
A more formal training program might do better, but I predict this would still be the most serious bottleneck.