After thinking on this for a while, here are my thoughts. This should probably be a new post but I don't want to start another whole chain of discussions on this issue.
I once believed that many people on Less Wrong thought our currently existing Art of Rationality was sufficient, or close to sufficient, to guarantee practical success, or even to transform its practitioner into an ubermensch like John Galt. I'm no longer sure anyone believes this. If they do, they are wrong. If anyone right now claims they participate in Less Wrong solely out of a calculated program to maximize practical benefits and not because they like rationality, I think they are deluded.
Where x-rationality is defined as "formal, math-based rationality", there are many cases of x-rationality being used for good practical effect. I missed these because they look more like three percent annual gains in productivity than like Brennan discovering quantum gravity or Napoleon conquering Europe. For example, doctors can use evidence-based medicine to increase their cure rate.
The doctors who invented evidence-based medicine deserve our praise. Eliezer is willing to consider them x-rationalists. But there is no evidence that they took a particularly philosophical view towards rationality, as opposed to just thinking "Hey, if we apply these tests, it will improve medicine a bit." Depending on your view of socialism, the information that one of these inventors ran for parliament on a socialist platform may be an interesting data point.
These doctors probably had mastery of statistics, good understanding of the power of the experimental method, and a belief that formalizing things could do better than normal human expertise. All of these are rationalist virtues. Any new doctor who starts their career with these virtues will be in a better position to profit from and maybe expand upon evidence-based medicine than a less virtuous doctor, and will reap great benefits from their virtues. Insofar as Less Wrong's goal is to teach people to become such doctors, this is great...
...except that epidemiology and statistics classes teach the same thing with a lot less fuss. Less Wrong's goal seems to be much higher. Less Wrong wants a doctor who can do all that, who understands her own mental processes in detail, and who can think rationally about politics and religion, turning the whole thing into a unified rationalist outlook.
Or maybe it doesn't. Eliezer has already explained that a lot of his OB writing was just stuff that he came across trying to solve AI problems. Maybe this has turned us into a community of people who like talking about philosophy, and that really doesn't matter much and shouldn't be taught at rationality dojos. Maybe a rationality dojo should be an extra-well-taught applied statistics class and some discussion of important cognitive biases and how to avoid them. It seems to me that a statistics class plus some discussion of cognitive biases would be enough to transform an average doctor into the kind of doctor who could invent or at least use evidence-based medicine and whatever other x-rationality techniques might be useful in medicine. With a few modifications, the same goes for business, science, and any other practical field.
I predict the marginal utility of this sort of rationality training will decline quickly. The first year will probably do wonders. The second will be less impressive. I doubt a doctor who studies this rationality for ten years will be noticeably better off than one who studies it for five, although this may be my pessimism speaking. The doctor would probably be better off spending the second five years studying some other area of medicine. In the end, I predict these kinds of classes could improve performance in some fields by 10-20% for people who really understood them.
This would be a useful service, but it wouldn't have the same kind of awesomeness as Overcoming Bias did. There seems to be a second movement afoot here, one to use rationality to radically transform our lives and thought processes, moving so far beyond mere domain-specific reasoning ability that even in areas like religion, politics, morality, and philosophy we hold only rational beliefs and are completely inhospitable to any irrational thoughts. This is a very different sort of task.
This new level of rationality has benefits, but they are less practical. There are mental clarity benefits, benefits to society when we stop encouraging harmful political and social movements, and benefits to the world when we give charity more efficiently. But once people finish the statistics-and-biases course described above and start on this deeper project, it seems less honest to keep telling them about the vast practical benefits they will attain.
This deeper rationality might have certain social benefits, but conscious-level social reasoning would have to be impressive indeed to beat the dedicated unconscious modules we already use for that task.
I have a hard time judging opinion here, but it does seem like some people think sufficient study of z-rationality (the deeper, transformative kind described above) can turn someone into an ubermensch. But the practical benefits beyond those offered by y-rationality (the statistics-class kind) seem low. I really like z-rationality, but only because I think it's philosophically interesting and can improve society, not because I think it can help me personally.
In the original post, I was using x-rationality in a confused way, but I think to some degree I was thinking of the deeper, transformative project rather than the statistics-class one.
Related to: Individual Rationality is a Matter of Life and Death, The Benefits of Rationality, Rationality is Systematized Winning
But I finally snapped after reading: Mandatory Secret Identities
Okay, the title was for shock value. Rationality is pretty great. Just not quite as great as everyone here seems to think it is.
For this post, I will be using "extreme rationality" or "x-rationality" in the sense of "techniques and theories from Overcoming Bias, Less Wrong, or similar deliberate formal rationality study programs, above and beyond the standard level of rationality possessed by an intelligent science-literate person without formal rationalist training." It seems pretty uncontroversial that there are massive benefits from going from a completely irrational moron to the average intelligent person's level. I'm coining this new term so there's no temptation to confuse x-rationality with normal, lower-level rationality.
And for this post, I use "benefits" or "practical benefits" to mean anything not relating to philosophy, truth, winning debates, or a sense of personal satisfaction from understanding things better. Money, status, popularity, and scientific discovery all count.
So, what are these "benefits" of "x-rationality"?
A while back, Vladimir Nesov asked exactly that, starting a thread for people to list the positive effects x-rationality had had on their lives. Only a handful responded, and most responses weren't very practical. Anna Salamon, one of the few people to give a really impressive list of benefits, herself flagged the possibility that some of those benefits were placebo effects.
There have since been a few more people claiming practical benefits from x-rationality, but we should generally expect more people to claim benefits than to actually experience them. Anna mentions the placebo effect, and to that I would add cognitive dissonance - people spent all this time learning x-rationality, so it MUST have helped them! - and the same sort of confirmation bias that makes Christians swear that their prayers really work.
I find my personal experience in accord with the evidence from Vladimir's thread. I've gotten countless clarity-of-mind benefits from Overcoming Bias' x-rationality, but practical benefits? Aside from some peripheral disciplines1, I can't think of any.
Looking over history, I do not find any tendency for successful people to have made a formal study of x-rationality. This isn't entirely fair, because the discipline has expanded vastly over the past fifty years, but the basics - syllogisms, fallacies, and the like - have been around much longer. The few groups who made a concerted effort to study x-rationality didn't produce an unusual number of geniuses - the Korzybskians are a good example. In fact, as far as I know, the only follower of Korzybski to turn his ideas into a vast personal empire of fame and fortune was (ironically!) L. Ron Hubbard, who took the basic concept of techniques to purge confusions from the mind, replaced the substance with a bunch of attractive flim-flam, and founded Scientology. And like Hubbard's superstar followers, many of this century's most successful people have been notably irrational.
There seems to me to be approximately zero empirical evidence that x-rationality has a large effect on your practical success, and some anecdotal empirical evidence against it. The evidence in favor of the proposition right now seems to be its sheer obviousness. Rationality is the study of knowing the truth and making good decisions. How the heck could knowing more than everyone else and making better decisions than them not make you more successful?!?
This is a difficult question, but I think it has an answer. A complex, multifactorial answer, but an answer.
One factor we have to once again come back to is akrasia2. I find akrasia in myself and others to be the most important limiting factor to our success. Think of that phrase "limiting factor" formally, the way you'd think of the limiting reagent in chemistry. When there's a limiting reagent, it doesn't matter how much more of the other reagents you add, the reaction's not going to make any more product. Rational decisions are practically useless without the willpower to carry them out. If our limiting reagent is willpower and not rationality, throwing truckloads of rationality into our brains isn't going to increase success very much.
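The limiting-reagent point can be made concrete with a toy model (entirely my own sketch, not anything from the post): treat success as capped by whichever resource is scarcer.

```python
# A toy "limiting reagent" model of practical success (my illustration):
# the output is bottlenecked by the scarcest input, just as a chemical
# reaction is bottlenecked by the limiting reagent.

def practical_output(rationality: float, willpower: float) -> float:
    """Success is capped by whichever resource runs out first."""
    return min(rationality, willpower)

# With willpower stuck at 2, piling on rationality changes nothing:
for r in (2, 5, 50):
    print(r, practical_output(r, willpower=2))  # prints 2 each time
```

With willpower fixed at 2, raising rationality from 2 to 50 leaves the output unchanged: the bottleneck, not the abundant input, sets the result.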
This is a very large part of the story, but not the whole story. If I were rational enough to pick only stocks that would go up, I'd become successful regardless of how little willpower I had, as long as it was enough to pick up the phone and call my broker.
So the second factor is that most people are rational enough for their own purposes. Oh, they go on wild flights of fancy when discussing politics or religion or philosophy, but when it comes to business they suddenly become cold and calculating. This relates to Robin Hanson on Near and Far modes of thinking. Near Mode thinking is actually pretty good at a lot of things, and Near Mode thinking is the thinking whose accuracy gives us practical benefits.
And - when I was young, I used to watch The Journey of Allen Strange on Nickelodeon. It was a children's show about this alien who came to Earth and lived with these kids. I remember one scene where Allen the Alien was watching the kids play pool. "That's amazing," Allen told them. "I could never calculate differential equations in my head that quickly." The kids had to convince him that "it's in the arm, not the head" - that even though the movement of the balls is governed by differential equations, humans don't actually calculate the equations each time they play. They just move their arm in a way that feels right. If Allen had been smarter, he could have explained that the kids were doing some very impressive mathematics on a subconscious level that produced their arm's perception of "feeling right". But the kids' point still stands; even though in theory explicit mathematics will produce better results than eyeballing it, in practice you can't become a good pool player just by studying calculus.
A lot of human rationality follows the same pattern. Isaac Newton is frequently named as a guy who knew no formal theories of science or rationality, who was hopelessly irrational in his philosophical beliefs and his personal life, but who is still widely and justifiably considered the greatest scientist who ever lived. Would Newton have gone even further if he'd known Bayes' theorem? Probably it would've been like telling the world pool champion to try using more calculus in his shots: not a pretty sight.
Yes, yes, beisutsukai should be able to develop quantum gravity in a month and so on. But until someone on Less Wrong actually goes and does it, that story sounds a lot like when Alfred Korzybski claimed that World War Two could have been prevented if everyone had just used more General Semantics.
And then there's just plain noise. Your success in the world depends on things ranging from your hairstyle to your height to your social skills to your IQ score to cognitive constructs psychologists don't even have names for yet. X-Rationality can help you succeed. But so can excellent fashion sense. It's not clear in real-world terms that x-rationality has more of an effect than fashion. And don't dismiss that with "A good x-rationalist will know if fashion is important, and study fashion." A good normal rationalist could do that too; it's not a specific advantage of x-rationality, just of having a general rational outlook. And having a general rational outlook, as I mentioned before, is limited in its effectiveness by poor application and akrasia.
I no longer believe mastering all these Overcoming Bias and Less Wrong techniques will turn me into Anasûrimbor Kellhus or John Galt. I no longer even believe mastering all these Overcoming Bias techniques will turn me into Eliezer Yudkowsky (who, as his writings from 2001 indicate, had developed his characteristic level of awesomeness before he became interested in x-rationality at all)3. I think it may help me succeed in life a little, but I think the correlation between x-rationality and success is probably closer to 0.1 than to 1. Maybe 0.2 in some businesses like finance, but people in finance tend to know this and use specially developed x-rationalist techniques on the job already without making it a lifestyle commitment. I think it was primarily a Happy Death Spiral around how wonderfully super-awesome x-rationality was that made me once think otherwise.
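To put those guessed numbers in perspective (the r values are from the paragraph above; the variance-explained framing is my own gloss): the share of variance in an outcome that a correlate accounts for is the square of the correlation, so r = 0.1 would leave x-rationality explaining about 1% of the variance in success.

```python
# Variance explained is the square of the correlation coefficient,
# so the post's guessed correlations translate to:
for r in (0.1, 0.2, 1.0):
    print(f"r = {r}: explains about {r * r:.0%} of the variance in success")
# r = 0.1 -> about 1%; r = 0.2 -> about 4%; r = 1.0 -> 100%
```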
And this is why I am not so impressed by Eliezer's claim that an x-rationality instructor should be successful in their non-rationality life. Yes, there probably are some x-rationalists who will also be successful people. But again, correlation 0.1. Stop saying only practically successful people could be good x-rationality teachers! Stop saying we need to start having huge real-life victories or our art is useless! Stop calling x-rationality the Art of Winning! Stop saying I must be engaged in some sort of weird signalling effort for saying I'm here because I like mental clarity instead of because I want to be the next Bill Gates! It trivializes the very virtues that brought most of us to Overcoming Bias, and replaces them with what sounds a lot like a pitch for some weird self-help cult...
...
...
...but you will disagree with me. And we are both aspiring rationalists, and therefore we resolve disagreements by experiments. I propose one.
For the next time period - a week, a month, whatever - take special note of every decision you make. By "decision", I don't mean the decision to get up in the morning, I mean the sort that's made on a conscious level and requires at least a few seconds' serious thought. Make a tick mark, literal or mental, so you can count how many of these there are.
Then note whether you make that decision rationally. If yes, also record whether you made that decision x-rationally. I don't just mean you spent a brief second thinking about whether any biases might have affected your choice. I mean one where you think there's a serious (let's arbitrarily say 33%) chance that using x-rationality instead of normal rationality actually changed the result of your decision.
Finally, note whether, once you came to the rational conclusion, you actually followed it. This is not a trivial matter. For example, before writing this blog post I wondered briefly whether I should use the time studying instead, used normal (but not x-) rationality to determine that yes, I should, and then proceeded to write this anyway. And if you get that far, note whether your x-rational decisions tend to turn out particularly well.
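The tallying the experiment calls for can be sketched as a simple funnel counter (the stage names are my own; the stages follow the steps described above):

```python
# A minimal tally sheet for the proposed experiment: each conscious
# decision ticks whichever funnel stages it reached.
from collections import Counter

log = Counter()

def record(rational: bool = False, x_rational: bool = False,
           followed: bool = False) -> None:
    """Tick off the funnel stages for one conscious decision."""
    log["conscious decisions"] += 1
    if rational:
        log["decided rationally"] += 1
    if x_rational:
        log["x-rationality changed the result"] += 1
    if followed:
        log["conclusion actually followed"] += 1

# The blog-post example above: decided rationally, then not followed.
record(rational=True, x_rational=False, followed=False)
```

At the end of the week or month, comparing the counts at each stage shows where the funnel narrows - how rarely rationality, let alone x-rationality, actually governs a decision.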
This experiment seems easy to rig4; merely doing it should increase your level of conscious rational decisions quite a bit. And yet I have been trying it for the past few days, and the results have not been pretty. Not pretty at all. Not only do I make fewer conscious decisions than I thought, but I rarely apply even the slightest modicum of rationality to the ones I do make; when I do apply rationality, it's practically never x-rationality; and when I apply everything I've got, I don't seem to follow the resulting decisions too consistently.
I'm not so great a rationalist anyway, and I may be especially bad at this. So I'm interested in hearing how different your results are. Just don't rig it. If you find yourself using x-rationality twenty times more often than you were when you weren't performing the experiment, you're rigging it, consciously or otherwise5.
Yet one way to fail your Art is to expect more of it than it can deliver. No matter how good a swimmer you are, you will not be able to cross the Pacific. This is not to say crossing the Pacific is impossible. It just means it will require a different sort of thinking than the one you've been using thus far. Perhaps there are developments of the Art of Rationality or its associated Arts that can turn us into a Kellhus or a Galt, but they will not be reached by trying to overcome biases really really hard.
Footnotes:
1: Specifically, reading Overcoming Bias convinced me to study evolutionary psychology in some depth, which has been useful in social situations. As far as I know. I'd probably be biased into thinking it had been even if it hadn't, because I like evo psych and it's very hard to measure.
2: Eliezer considers fighting akrasia to be part of the art of rationality; he compares it to "kicking" to our "punching". I'm not sure why he considers them to be the same Art rather than two related Arts.
3: This is actually an important point. I think there are probably quite a few smart, successful people who develop an interest in x-rationality, but I can't think of any people who started out merely above-average, developed an interest in x-rationality, and then became smart and successful because of that x-rationality.
4: This is a terribly controlled experiment, and the only way its data can be meaningfully interpreted at all is through what one of my professors called the "ocular trauma test" - when the data hits you between the eyes. If people claim they always follow their rational decisions, I think I will be more likely to interpret it as a lack of enough cognitive self-consciousness to notice when they're doing something irrational than as an honest lack of irrationality.
5: In which case it will have ceased to be an experiment and become a technique instead. I've noticed this happening a lot over the past few days, and I may continue doing it.