I often use the metaphor that rationality is the martial art of mind. You don’t need huge, bulging muscles to learn martial arts—there’s a tendency toward more athletic people being more likely to learn martial arts, but that may be a matter of enjoyment as much as anything else. If you have a hand, with tendons and muscles in the appropriate places, then you can learn to make a fist.

Similarly, if you have a brain, with cortical and subcortical areas in the appropriate places, you might be able to learn to use it properly. If you’re a fast learner, you might learn faster—but the art of rationality isn’t about that; it’s about training brain machinery we all have in common. And where there are systematic errors human brains tend to make—like an insensitivity to scope—rationality is about fixing those mistakes, or finding work-arounds.

Alas, our minds respond less readily to our will than our hands. Our ability to control our muscles is evolutionarily ancient; our ability to reason about our own reasoning processes is a much more recent innovation. We shouldn’t be surprised, then, that muscles are easier to use than brains. But it is not wise to neglect the latter training because it is more difficult. It is not by bigger muscles that the human species rose to prominence upon Earth.

If you live in an urban area, you probably don’t need to walk very far to find a martial arts dojo. Why aren’t there dojos that teach rationality?

One reason, perhaps, is that it’s harder to verify skill. To rise a level in Tae Kwon Do, you might need to break a board of a certain width. If you succeed, all the onlookers can see and applaud. If you fail, your teacher can watch how you shape a fist, and check if you shape it correctly. If not, the teacher holds out a hand and makes a fist correctly, so that you can observe how to do so.

Within martial arts schools, techniques of muscle have been refined and elaborated over generations. Techniques of rationality are harder to pass on, even to the most willing student.

Very recently—in just the last few decades—the human species has acquired a great deal of new knowledge about human rationality. The most salient example would be the heuristics and biases program in experimental psychology. There is also the Bayesian systematization of probability theory and statistics; evolutionary psychology; social psychology. Experimental investigations of empirical human psychology; and theoretical probability theory to interpret what our experiments tell us; and evolutionary theory to explain the conclusions. These fields give us new focusing lenses through which to view the landscape of our own minds. With their aid, we may be able to see more clearly the muscles of our brains, the fingers of thought as they move. We have a shared vocabulary in which to describe problems and solutions. Humanity may finally be ready to synthesize the martial art of mind: to refine, share, systematize, and pass on techniques of personal rationality.

Such understanding as I have of rationality, I acquired in the course of wrestling with the challenge of artificial general intelligence (an endeavor which, to actually succeed, would require sufficient mastery of rationality to build a complete working rationalist out of toothpicks and rubber bands). In most ways the AI problem is enormously more demanding than the personal art of rationality, but in some ways it is actually easier. In the martial art of mind, we need to acquire the realtime procedural skill of pulling the right levers at the right time on a large, pre-existing thinking machine whose innards are not end-user-modifiable. Some of the machinery is optimized for evolutionary selection pressures that run directly counter to our declared goals in using it. Deliberately we decide that we want to seek only the truth; but our brains have hardwired support for rationalizing falsehoods. We can try to compensate for what we choose to regard as flaws of the machinery; but we can’t actually rewire the neural circuitry. Nor may martial artists plate titanium over their bones—not today, at any rate.

Trying to synthesize a personal art of rationality, using the science of rationality, may prove awkward: One imagines trying to invent a martial art using an abstract theory of physics, game theory, and human anatomy.

But humans aren't reflectively blind. We do have a native instinct for introspection. The inner eye isn't sightless, though it sees blurrily, with systematic distortions. We need, then, to apply the science to our intuitions, to use the abstract knowledge to correct our mental movements and augment our metacognitive skills.

We aren't writing a computer program to make a string puppet execute martial arts forms; it is our own mental limbs that we must move. Therefore we must connect theory to practice. We must come to see what the science means, for ourselves, for our daily inner life.


To continue with this metaphor, it seems what we need is a good set of problems to test our rationality (i.e., our ability to resist common biases). As with any cognitive test, the details of the test can't be known in advance, or people would just memorize answers instead of developing skills. So we need to collect a rather large set of good test questions, or create generators that produce new sets of them automatically. Not easy, but perhaps worth doing.
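One way such a generator might work is to randomize the numbers inside a question template, so that only the underlying skill transfers between tests. A minimal sketch in Python; the template and scoring here are invented for illustration, not an existing test bank:

    import random

    # Hypothetical sketch: a template-based generator for calibration questions.
    # Randomized numbers mean a test-taker can't memorize answers, only develop
    # the underlying skill of producing honest confidence intervals.

    def make_question(rng):
        a = rng.randint(100, 999)
        b = rng.randint(100, 999)
        # The test-taker states a 90% confidence interval without computing.
        return f"Give a 90% confidence interval for {a} * {b}.", a * b

    def score_interval(low, high, truth):
        # 1 if the truth falls inside the stated interval, else 0. Over many
        # questions, well-calibrated 90% intervals should score about 0.9.
        return int(low <= truth <= high)

    rng = random.Random()
    question, truth = make_question(rng)
    print(question)
    print(score_interval(50_000, 500_000, truth))

A battery of such templates (scope estimates, base-rate problems, and so on) could presumably be scored the same way.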

It seems that calibrating personal judgements using decision markets could in principle be a way to measure rationality. You make a guess, and if your guess is close to what the market predicts, you can be slightly more confident about yourself. Of course, for this to work, you'd have to avoid prior knowledge of the market, and skip participating in betting on topics that you want to consider unassisted first.

Great post btw.

Michael, you can't have a prediction market without a way to pay off the bets, and if you have that you can measure personal accuracy directly, if you just wait.

Robin, I agree that the main difficulty is figuring out how to pay off the bets, but it seems to me that - given such a measure - playing a prediction market around the measure makes the game more complex, and hopefully more of a lesson, and more socially involving and personally intriguing. In other words, it's the difference between "Guess whether it will rain tomorrow?" and "Bob is smiling evilly; are you willing to bet $50 that his probability estimate of 36.3% is too low?" Or to look at it another way, fewer people would play poker if the whole theme was just "Estimate the probability that you can fill an inside straight." I think Anissimov has a valid fun-amplifying suggestion here.
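For concreteness, here is a minimal sketch (in Python, with invented names and stakes) of how the bet against Bob's 36.3% might be settled using a logarithmic scoring rule, so the payout rewards whichever probability estimate turned out better:

    import math

    # Each player states a probability; the event resolves true or false; the
    # stake moves according to the difference in log scores. The tanh cap is
    # an arbitrary choice here, just to bound the payout at +/- the stake.

    def log_score(p, outcome):
        # Higher is better; rewards probabilities close to what happened.
        return math.log(p if outcome else 1.0 - p)

    def settle_bet(p_bob, p_you, outcome, stake=50.0):
        # Positive result: you win that amount from Bob; negative: you pay.
        edge = log_score(p_you, outcome) - log_score(p_bob, outcome)
        return stake * math.tanh(edge)

    print(settle_bet(p_bob=0.363, p_you=0.60, outcome=True))   # you profit
    print(settle_bet(p_bob=0.363, p_you=0.60, outcome=False))  # you pay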

Psychologists have developed a self-deception test, which includes questions like "Did you ever hate your mother?", on the assumption that most everyone did but few want to admit it. See:

Paulhus, Delroy L. "Self-Deception and Impression Management in Test Responses." In Angleitner, A. & Wiggins, J. S. (eds.), Personality Assessment via Questionnaires. New York, NY: Springer, 1986, 143-165.

Perhaps questions like those could be a good part of a rationality test and practice regime.

I'm not so sure. Such "probabilistic" tests are good for aggregate testing, but not for personal testing. We want to minimise false positives and false negatives.
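The aggregate-versus-personal distinction can be made concrete with invented numbers. Suppose, hypothetically, that self-deceivers deny a "did you ever..." item 60% of the time and honest responders 40% of the time: the group averages separate cleanly, but labelling individuals from a 20-item test still produces substantial error rates. A minimal sketch:

    from math import comb

    def binom_tail(n, p, k):
        # P(X >= k) for X ~ Binomial(n, p).
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    n, threshold = 20, 11  # label anyone with 11+ denials a self-deceiver
    false_negative = 1 - binom_tail(n, 0.6, threshold)  # self-deceiver slips through
    false_positive = binom_tail(n, 0.4, threshold)      # honest person mislabeled
    print(f"false negative rate: {false_negative:.2f}")  # roughly 0.24
    print(f"false positive rate: {false_positive:.2f}")  # roughly 0.13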

Why aren't there dojos that teach rationality?

They're called universities. My PhD adviser taught me to think rationally and caught many of the errors in my thinking, including those caused by biases. This eventually allowed me to catch many of my own errors.

No, it wasn't perfect. It was limited to one academic area and was not explicitly aimed at teaching rationality and overcoming biases. (Then again, martial-arts senseis do not teach their skills "explicitly", but by repetition and training.) Moreover, gaining subject-matter expertise took as much of the effort as learning rational thinking processes. But a PhD in a rigorous field from a good university is as close as you'll find to a "color belt" (dare I say "black belt"?) in rationality.

Joshua, the thought had occurred to me, but with all due respect to universities, that's the same sort of training-in-passing that you get from reading "Surely You're Joking, Mr. Feynman" as a kid. It's not systematic, and it's not grounded in the recent advances in cognitive psychology or probability theory. If we continue with the muscle metaphor, then I would say that - judged by the sole criterion of improving personal skills of rationality - studying physics is the equivalent of playing tennis. If you actually want to do physics, of course, that's a whole separate issue. But if you want to study rationality, you really have to study rationality; just as, if you wanted to study physics, it wouldn't do to go off and study psychology instead.

I think you are accepting the advertising hooey of martial arts dojos too much at face value. Most people don't take martial arts for all the abstracted reasons that appear in the brochures, but because they want to feel they can win fights with other people.

Similarly, people don't want to learn how to be rational for the sake of being rational. Instead, you have to sell rationality for more human ends, such as being able to win arguments, or make money, or understand a particular field such as baseball statistics. You can learn a lot of general lessons about rationality from reading Bill James on baseball statistics, but it's not very exciting to study rationality for the sake of being rational.

I did judo as a child. At first it was because my parents thought I'd develop physical self-confidence that way. It worked (and judo's something I'd recommend to any boy), but it doesn't take much training before you can win fights against most people. After that you stay interested because you want to beat other people at judo, just like any other sport.

So you might sell your rationality dojos as being about winning arguments, but pretty soon I imagine that the practitioners might get more interested in being right.

Most people will get very competitive the minute they've got something they can measure. Is there any form of transport which isn't raced? And most non-competitive people will get very competitive the moment they're in a fight they can win.

Suddenly I'm imagining a room where the sensei is describing the Amanda Knox case, and the students are asking him questions and debating with each other, and after all the talking is done, people place and take bets on the result at various odds, and then the sensei reveals the actual answer and the correct reasoning. And ranking points are transferred accordingly, and there is a ladder on the wall where names are listed in order with the scores.

I already reckon I could seriously enjoy such a game. Who wants to play?

I'm in as long as we don't discuss the Amanda Knox case.

I mentioned it only because it seems to have been a unique triumph for Less Wrong. I'd read about the case, and thought nothing of it particularly. And then people here started saying "Look at it from a probabilistic point of view", and so I did, and after a few hours' head-scratching and diagram-drawing I realized that it was almost certainly a miscarriage of justice.

I mentioned this to a few people I know, and they reacted pretty well as you'd expect to a middle-aged man suddenly getting a bee in his bonnet about a high-profile sex murder case involving pretty girls.

When she was eventually acquitted, various people said "How did you do that?". And the mathematically minded types were quite impressed with the answer, while the muggles thought I've got some sort of incomprehensible maths-witchcraft thing that I can do to find out the truth.

Which is exactly the sort of thing you might want to sell, if you can find a way to teach it.

Er, you have some sort of incomprehensible maths-witchcraft thing that you can do to find out the truth.

Fair enough.

I mentioned it only because it seems to have been a unique triumph for Less Wrong.

Why has it been so unique? Surely there are plenty of high-profile predictions one can make using the same Bayesian techniques? (Or one can simply ask gwern, who is apparently well calibrated after a thousand or so recorded predictions.)

The key here was in applying Bayes, not in being especially calibrated.

Well actually I was just wondering about that.

What other claims like 'Amanda Knox is innocent' can we make, in the sense that (a) they run counter to common thinking, (b) we're pretty sure we're right, and (c) there's likely to be a resolution in our favour soon?

The Amanda Knox thing was a surprising prediction that came true. More of those would be neat.

GJP?

The Good Judgment Project is basically a prediction tournament. If The Sequences are like kata and PredictionBook is like randori, then prediction tournaments are like competitive Judo.
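The PredictionBook-style check is easy to sketch: bucket your predictions by stated confidence and compare against how often they came true. A minimal Python illustration, assuming you keep a log of (stated probability, outcome) pairs; the data below is fabricated:

    from collections import defaultdict

    def calibration_table(records):
        # records: (stated probability, outcome) pairs, outcome 1 or 0.
        buckets = defaultdict(list)
        for p, outcome in records:
            # Bucket by tens of percent; round first to dodge float artifacts.
            level = min(int(round(p * 100)) // 10, 9) * 10
            buckets[level].append(outcome)
        for level in sorted(buckets):
            outcomes = buckets[level]
            freq = sum(outcomes) / len(outcomes)
            print(f"said {level}-{level + 10}%: happened "
                  f"{freq:.0%} of the time (n={len(outcomes)})")

    calibration_table([(0.9, 1), (0.9, 1), (0.9, 0), (0.95, 1),
                       (0.6, 1), (0.6, 0), (0.55, 1), (0.7, 0)])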

Steve: Wasn't that the claim of the sophists? "We'll teach you how to win arguments so you can prevail in politics." The problem is that the skills for winning arguments aren't necessarily the skills for rationality in general. Probably the easiest way to learn the skills for winning arguments is to go to law school and "learn to think like a lawyer."

Steve, you're probably right, but I doubt that's really a big obstacle. Rationality has plenty of uses, and if anyone wants to teach it for money, they shouldn't have problems finding ways to promote it, in the long run. Besides, if people want to learn karate to beat people up, why don't dojos advertise based on that?

But I think that the analogy to martial arts is ill-advised for other reasons. Martial arts is a mostly anachronistic practice when evaluated against its original purpose. If you're really serious about self-defense nowadays, you get weapons. (Weapons have always been an advantage, but it's only been for a few hundred years now that they've become such a decisive advantage.)

Where are the tools for giving us a rational advantage? Sure, we have plenty of them for bringing knowledge to us quickly. The internet has been pretty successful at that. But what about other prostheses that specifically target our rational acuity?

Well, I'll go with the martial arts metaphor; it has at least limited application, and it's already the reference point for the discussion. As for there being no rationality dojos, what about Buddhism, or any other mystic or contemplative path within a religious lineage?

As for measuring rationality or rational skills, the main problem is that there is no generally agreed-upon Truth, or even agreement as to whether Truth exists at all. This discussion is not about the actual content of truth per se, but about the integrity of the approach to truth. That may be soundly based on the idea that we are only approaching knowledge of truth, but truth itself is still not known. So we cannot measure soundness as a function of measurable results, since the results are either non-existent, unknown, or not agreed upon.

It seems that if this is the case then any form of measurement cannot be objective or externally made, but must be subjective and internally assessed. So the question would become not what is true as fact, but what is true as honest. My vote would be for rational integrity as the context within which skill or understanding would be viewed.

Welcome to Less Wrong, Dylan! Check out the welcome thread and introduce yourself.

I agree that one can develop a pretty impressive Art based on destroying self-deception rather than on seeking truth, particularly when 'seeking truth' sounds like an exceptionally problematic phrase.

But it might not have to be that problematic. Read The Simple Truth if you want to know what Eliezer means by truth; it's much less naive than I'd expected when I was new around here.

I do very much like the criteria of honesty here.

Just as one style of martial arts benefits mainly one area, say the feet, another is totally dedicated to the fists. To measure the growth in each of these forms, there are different 'belts' or levels of mastery based on what you have learned. So perhaps rationality cannot be measured in just one way; the different applications would have to have different levels, just as schools have letter grades on different subjects, and then an overall GPA. Practical usage, such as in the workplace, would have to be 'graded' differently than the same practical usage in a social setting.

Re-reading this brings to mind the history of the Logicians in ancient China. Apparently, they fell out of favor because all that their skills seemed to be good for was winning arguments, and others already had much more effective ways of winning arguments. Advertising rationality as a way to win arguments, as suggested by some of the original commenters, could cause rationality to fall into a similar trap.

I think this problem is avoided by your (later) idea of a Bayesian Conspiracy; if practitioners of beisudo were demonstrably better at the various instrumental tasks they undertook, then that might be enough advertising to attract people to rationality.

I also wonder if cargo-cult rationality would be good or bad. It seems like it could go either way.

I wonder how much of a correlation there is between people who put effort into self-training in rationality (or communal training, a la Less Wrong) and those who actually train a martial art. And I don't mean the "Now I'll be able to beat up people Hoo-AH" three-week-course training - I mean real, long-term, long-rewards-curve training. I've done aikido on and off for years (my life's been too hectic to settle down to a single dojo, sadly), and it takes a similar sort of dedication, determination, and self-reflection as a serious foray into training your mind to rationality. And, I'd go so far as to say, a similar 'predilection of mind and preference' (and I'll let you LWers go to town on that one).

What are you meaning by correlation? Do you mean how similar the thinking of those who mindfully approach rationality training is to the thinking of people who seriously dedicate themselves to practicing a martial art? Or do you mean something else?

Why aren't there dojos that teach rationality?

There are: Buddhist temples in the Himalayas (Bhutan and the neighbouring countries) and remote China. I lived in Bhutan as a child. All Buddhist monks lived their lives in monasteries: meditating, contemplating, doing the daily chores, practicing introspection, and following the teachings of Buddha. Isn't that a "dojo of rationality"?

How to communicate procedural skills of rationality, or measure them, is probably the single largest open issue that stands between humanity and rationality dojos - at least it's the part of the problem that most baffles me.

Discussion, as with every other field.

Isn't that a "dojo of rationality"?

How is that "rationality"? They spent a lot of time thinking, yes, but not all thought is rational. Not even all introspection is rational.

Well, if we have a Bayesian Conspiracy in place, as it were, then I reckon the test would be derivation of hidden knowledge from available priors (a sketch of what one such test item might look like follows below). This would require that the Conspiracy have access to answers reasoned out ahead of time, and also that the actual testing be kept secret. Perhaps an aspiring rationalist may only be tested once. If we have that strict a measure, though, we might consider allowing a certain amount of leniency... but then again, perhaps only those who strictly adhere to the Art deserve to move further into its ranks.

I am quite aware that my idea is based largely on a post you made about 16 months after this one; in that sense, I'm simply perpetuating a feedback loop, and may be acting in redundancy. But the more I think on this problem, the more convinced I am that it is a Good Idea: test, ceremony, Conspiracy, and all.
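As promised above, a minimal sketch of a "derive hidden knowledge from available priors" test item, in Python. The scenario, numbers, and grading tolerance are all invented; the point is only that the examiners can compute the true posterior ahead of time and grade against it:

    # Item: 1% of packages are tampered with; the seal test flags 95% of
    # tampered packages and 10% of clean ones. A package is flagged.
    # What is the probability it was tampered with?

    def posterior(prior, p_e_given_h, p_e_given_not_h):
        # Bayes' theorem: P(H|E) = P(E|H) P(H) / P(E).
        p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
        return p_e_given_h * prior / p_e

    truth = posterior(prior=0.01, p_e_given_h=0.95, p_e_given_not_h=0.10)
    print(f"{truth:.3f}")  # about 0.088 -- far below the naive 0.95

    def grade(aspirant_answer, tolerance=0.01):
        # The Conspiracy computes `truth` ahead of time and grades against it.
        return abs(aspirant_answer - truth) <= tolerance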

Computer games, done properly, could be a good teacher of rationality. If you fail to observe the laws of nature that happen to apply in the game and act accordingly, you die / can't get to the goal. Debugging programs has also been good for me: plenty of times I could not see an error in my algorithm, but knew there had to be one (modulo system bugs), and so I learned what does and does not constitute a valid argument that way.

Put these ideas together (games where you have to program things that work in the game universe) and it'd be interesting. Some AI competitions work that way, but we could probably build an iPad app for kids with the right ideas. For example, I'm not talking about teaching BASIC or some terrible "easy" language. Why not base the game's programming language on some other Turing-complete system, like, say, cellular automata? The automata rules could change from time to time to keep players guessing. The game could be a sequence of problems: fill in these squares correctly to get the desired result and move to the next level. Actually, maybe Markov algorithms would be better. Or a version of the untyped lambda calculus (http://worrydream.com/AlligatorEggs/). Having a good baseline programming "world" is an important part of getting kids to think about the right kinds of abstractions.
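To make the cellular-automaton idea concrete, here is a minimal Python sketch (all details invented) in which the game's "physics" is an elementary CA rule, and the rule number can be swapped between levels so players must infer the law rather than memorize it:

    def step(cells, rule):
        # Advance one row of an elementary cellular automaton
        # (Wolfram rule numbering, wrapping at the edges).
        n = len(cells)
        return [
            (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    def run(rule, width=31, steps=12):
        cells = [0] * width
        cells[width // 2] = 1  # a single live cell in the middle
        for _ in range(steps):
            print("".join("#" if c else "." for c in cells))
            cells = step(cells, rule)

    run(rule=110)   # level one's physics...
    # run(rule=30)  # ...and a later level quietly changes the law of nature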

Computer games, done properly, could be a good teacher of rationality. If you fail to observe the laws of nature that happen to apply in the game and act accordingly, you die / can't get to the goal.

The laws may carry their own message.

Space Invaders: However high you build your score, the alien horde (i.e. Death) crushes you in the end.

First Person Shooters: Killing your enemies is fun!

Tetris, Kingdom of Loathing, Evony, etc.: Clicky-clicky-clicky! Better than life!

That's not quite what I had in mind, but, well, if we make our own game it can have its own laws. For example, you gain power by understanding the rules of the game and acting accordingly, not just by reflexes or level grinding. If you understand the rules keenly, you act in subtly different ways that are vastly more effective, perhaps. Maybe you can collaborate to achieve more than you can individually, etc.

Here's a game that at least makes you figure out the game mechanics a bit in order to survive: http://www.ludumdare.com/compo/ludum-dare-23/?action=preview&uid=7288 Variations on this requiring more clever tool-building, say, would be interesting. But I think it is possible to build a game around a Turing machine if we are clever about it.

We've had at least one discussion on video games as a rationality teaching tool in the past.

Thanks. I also noticed Learn to code as related. I'm not sure whether I should continue here or there. I'll finish here with a link to the Blockly maze, which is a fairly vanilla example of making coding into a game using a simplified traditional visual programming language, rendered in JavaScript.

An AI psychologist would stand the best chance of learning human interactions and rationality. An AI in that position could quickly learn how we work and understand things. The programmers would also have a great view of how the AI responds.

As for the dojo idea, we may now have that with luminosity.com, possibly the world's first online mental dojo. A local dojo would lack the visibility needed to ensure the correct training was being applied. If it were just local, it would be more like theology than rational learning.

I try to explore ideas and concepts that put people outside the realm of normality on my site, thinkonyourown.com. I've found that challenging people's limits is a fantastic way of exercising the mental muscles.

I was arguing with someone online the other day, and I said to him, "I think you're trying to throw me off your scent here... You are cornered, and trying to confuse the issue by claiming I'm arguing for this idea, and you've been arguing against it."

The truth is that I had not cornered him. The reason I couldn't corner him is that, in order to corner someone in a rational debate, they have to allow themselves to become cornered... That is, they have to explain their thoughts and ideas so well that they are "nailed down".

If their ideas are valid, from a scientific perspective, then this position of being cornered and nailed down, is actually the position of ultimate strength. It is exactly the opposite of where you would want to be in a physical fight.

What often happens, in an argument with a rationalist who is trying to be cornered and nailed down so that he can argue from his position of greatest strength, is that the attacker, knowing he cannot actually attack the argument being defended, resorts to a strawman argument.

While this does nothing to the beliefs of the person defending the argument, it does confuse other participants or bystanders in the argument, and may succeed in impugning the integrity of a person defending a true idea... e.g., put him in an illusory box that gives others an excuse not to listen to him.

So, as a rule, one can be more successful in seeking truth, by looking for the participant in the debate who is defending and explaining their own ideas, with an honest desire to be cornered and nailed down. They will not feint, or dodge, but will stand where they are, defending their ground, until such time as they are exposed to a better idea.

What empirical evidence do we have that rationality is trainable like martial arts? How do we measure (change of) rationality skills?

-"Deliberately we decide that we want to seek only the truth; but our brains have hardwired support for rationalizing falsehoods."

Deciding that you want to seek only the truth will not give you the truth. This is because, as you say, our brains have hardwired support for rationalizing falsehoods. What I have found to be a better strategy is self-cooperation. Your mind makes its existence known on several different levels like the vocal, the subvocal but still audible, the purely silent but still internally audible, and eventually laughter and similar responses. Each of these levels has different information about the world, and may be willing to share it with the higher levels as long as it does not end up feeling betrayed. But a naive attitude of "I want the truth!" is exactly what is likely to lead to such a betrayal. To put it another way, if a friend told you a secret and then you immediately went to tell that secret to someone else, would they be likely to continue to trust you after that? Even if you said that you were sharing the secret in the name of "truth"? Self-trust works the same way.

Sounds great. Loving this so far.

I'd recommend that others keep philosophy of science in mind. Philosophy of biology doesn't have the nicest things to say about evolutionary psychology (at least relative to other scientific disciplines). It's not about throwing evopsych out; it's about understanding its limitations in informing us about human nature.

Also, keep in mind an interesting truth I've noticed: you might feel in some cases that you're "in the truth." But that itself is qualitatively a culture like any other. If you justify a hierarchy based on, say, evopsych, you're not "in the truth"; you're in yet another culture that justifies its inequalities through a story (even if that story is scientific and truthful).

Edit: Adding the Stanford Encyclopedia of Philosophy entry on evolutionary psychology: https://plato.stanford.edu/entries/evolutionary-psychology/

Some of the machinery is optimized for evolutionary selection pressures that run directly counter to our declared goals in using it. Deliberately we decide that we want to seek only the truth; but our brains have hardwired support for rationalizing falsehoods. We can try to compensate for what we choose to regard as flaws of the machinery; but we can’t actually rewire the neural circuitry. Nor may martial artists plate titanium over their bones—not today, at any rate.

While the human brain has evolved for survival, its structure can limit our abstract reasoning abilities. One approach to overcoming these limitations is to differentiate between the learning and practicing phases when acquiring a new skill. Contrary to the common belief that "practice makes perfect," practice actually makes skills permanent, not necessarily perfect. By acknowledging this distinction, individuals can focus on perfecting their skills through learning processes, such as the Kolb cycle, before committing to practice for long-term retention and improvement.

The Kolb cycle, a well-established model in experiential learning, offers a four-stage process that includes concrete experience, reflective observation, abstract conceptualization, and active experimentation. By applying this process in the context of reasoning, we can iteratively refine our thought processes and become more adept at bypassing the evolutionary constraints of the brain.

I think that you can have a school for rationality, but the people who stick around should be those who achieve new and difficult things; the default is that you do not advance or continue without accomplishing something difficult and great.

I suspect it is easier to teach rationality than to do rationality because it is easier to conceive of an ideal than it is to live it. A great athlete works many hours a day with focus and interest. A coach merely watches and pokes them when they're going off-course, or says some motivating words.

I would start a school by forming a set of teachers, but we would be the certifiers, not the certified. Coaches. Rationality coaches who help you on your journey, not rationality teachers whom you aspire to become.