Today's post, The Martial Art of Rationality, was originally published on November 22, 2006. A summary (taken from the LW wiki):

Rationality is the martial art of the mind, building on universally human machinery. But developing rationality is more difficult than developing physical martial arts. One reason is that rationality skill is harder to verify. In recent decades, scientific fields like heuristics and biases, Bayesian probability theory, evolutionary psychology, and social psychology have given us a theoretical body of work on which to build the martial art of rationality. It remains to develop and especially to communicate techniques that apply this theoretical work introspectively to our own minds.


Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them.  It is the first post in the series; the introductory post was here, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.

Sequence reruns are a community-driven effort.  You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.


I wonder how much the lack of rationality professionals (defined here as people who are paid to teach rationality) contributes to the lack of rationality dojos, given that money is a powerful motivator for actually getting things done.

Learning to be more rational is important to me, so I would gladly pay more than a token fee to join a rationality dojo, but only if it were planned, operated, and vetted by Less Wrongers with reasonably high karma or other credentials. I assume that Less Wrongers in general have similarly high standards for spending their money, especially those who have read Money: The Unit of Caring. High-quality planning, operation, and vetting not only deserve pay but may also require pay as sufficient incentive.

Of course, voiced willingness and even promises to pay are probably not enough. Perhaps we need a prepaid Rationality Dojo Prize to motivate would-be rationality professionals?

What does Less Wrong think about the value of rationality professionals for rationality dojos?

Personally, if faced with someone who was motivated by money and was prepared to teach rationality as a way of obtaining it, I would be very interested in knowing why they'd chosen that route rather than some other, more lucrative route.

Lack of money threatens survival, so nearly everyone is motivated by money to some extent. You make a good point, but I'm not thinking of people looking to get rich when I refer to would-be rationality professionals: I mean people who are qualified for and passionate about teaching rationality but who also have bills to pay. It may be unrealistic to expect rationality dojos to happen on a purely amateur basis.

Re-reading this post reminded me of Burning Wheel, a tabletop role-playing game whose reward system actively encourages questioning, testing, and acting upon the goals and beliefs of a fictional character created by the player, while simultaneously and subversively placing the character in complex situations that force the player to change those beliefs over time as a result of the conflicts they cause (and somewhat according to chance). The player has to accept that his character may, over the course of play, become something completely alien to how it started, yet continue to empathize with it in order to be rewarded for acting out its actions in the fiction.

Would (re)designing such a game around further encouraging elements of rationality be too close to Dark Arts? (Luke Crane, the game's creator, sometimes speaks about game design as a form of mind control at the gaming conventions he frequents.)

I don't really see how this would be Dark Artsy, but even if it is it sounds crazy fun to play.

How does it make you update your beliefs, though? As far as I can tell, it just lets you try things and, upon failure, allows you to "succeed" at what you said you wanted to do in a way that makes other things worse.

Oops. I realize now that I was confusing the definition of belief used here with the definition used in the game (a principled to-do list), so the idea isn't as applicable as I originally thought, but I'll try to answer you anyway.

As a player you can change your character's beliefs almost as often as you like, and the game rewards you for tailoring them to the context of each scene you enact, with different rewards depending on whether you act in accordance with them or undermine them (this encourages you to hold conflicting beliefs, which increases the drama of the shared story). Then, between game sessions, all players involved nominate those beliefs you appear never to undermine for promotion to trait-hood (indicating you've fulfilled your character's goals and they no longer need testing), and those you appear always to undermine for changing. Traits often give game-mechanical bonuses and penalties, but can take almost a full story arc of deliberate undermining before being nominated for change.

Conflict in the game is handled in a very specific way. You describe your intent (what you want your character to achieve in the story) and how it is achieved; the GM declares the skill rolls or other game mechanics required and sets the stakes (the consequences of failure). If neither the GM nor any of the players can think of an interesting direction a failed roll could take the story, then no roll is made: you get what you wanted, and the group moves on to the next, more interesting conflict. Otherwise, the stakes are negotiated and you choose whether to roll or change your mind. Once the roll is made, its results are irreversible within the fiction.

To a large degree it is up to the GM to create interesting and painful stakes with which to challenge your beliefs, so your mileage will vary.
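For readers who skipped the link, here is a minimal sketch of that intent-and-stakes flow in Python. The function, parameter names, and single-die roll are simplified stand-ins of mine, not Burning Wheel's actual rules:

```python
import random
from typing import Optional

def resolve_conflict(intent: str, difficulty: int,
                     failure_stakes: Optional[str],
                     player_accepts_stakes: bool) -> str:
    """One conflict: state intent, set stakes, then roll (or don't)."""
    # If nobody can name an interesting consequence for failure,
    # no roll is made: the character simply gets what the player wanted.
    if failure_stakes is None:
        return intent

    # Stakes are negotiated before dice hit the table;
    # the player may still change their mind and withdraw.
    if not player_accepts_stakes:
        return "conflict withdrawn; no roll is made"

    # Once the dice are rolled, the result is irreversible in the fiction.
    roll = random.randint(1, 6)  # stand-in for the game's real dice pools
    return intent if roll >= difficulty else failure_stakes

random.seed(1)
print(resolve_conflict("pick the lock", 4, None, True))
print(resolve_conflict("pick the lock", 4, "the guards hear you", True))
```

The point survives the simplification: dice only come out when failure would be interesting, and their verdict sticks.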

Ah.

Would it be fair to summarize that as "it makes you update your beliefs insofar as it's made explicit that your character has new goals, and helps you practice changing your mind?"

Yes, that would be fair. Are you aware of any good methods for learning and practicing to be more concise?

Are you aware of any good methods for learning and practicing to be more concise?

Get a Twitter account?


The most powerful method I know for improving clarity and brevity is to write, scrap, and rewrite. An even more powerful version of this is to write, scrap, sleep on it, and rewrite.

Writing concisely is often a time-consuming process, and I don't always take the time.

Hmm, interesting question.

You weren't too bad; you did a lot to explain the game mechanics to people who didn't click through and read the link, even when said explanation didn't directly contribute to the point you were trying to make.

Being concise is difficult in that you're trying to efficiently explain something, while also not leaving out important details necessary for you to be understood (i.e. keeping the inferential distance short).

Knowing what the audience knows often helps in terms of what can be left out.

What I generally do is write something, then revise it to make it shorter, which typically involves:

  • trimming out unneeded grammatical constructions (like "you did a lot towards the goal of" -> "you")

  • rearranging syntax to make the statement clearer (in general, the subject of the sentence should come first, as should the inferentially closer parts of the statement)

  • general futzing about with the words until you think they're arranged better

Knowing a specific term for something also makes it easier to express without needing to reexplain all of it every time you use it.

"Trying to synthesize a personal art of rationality, using the science of rationality, may prove awkward: One imagines trying to invent a martial art using an abstract theory of physics, game theory, and human anatomy. [...] Therefore we must connect theory to practice."

This is a key realization that strikes me now. Merely reading the sequences or LW posts and absorbing the wisdom of others is insufficient. Replacing old cached knowledge with new doesn't enable a person to advance their art.

To get the most out of a desire to change one's thinking, one has to actively put the theorems to use. The body of accumulated knowledge has to be tested and extended. Feynman's shock at the fragile knowledge of others "learned by rote" appears to me to be similar to Yudkowsky's observation here.

I find this to be the difficult part of re-reading the Sequences. The first time through is full of hard revelations and new ideas. Many of these become adopted, but not fully understood. Successive readings are opportunities to slowly break things down and build a functioning understanding of the principles, but you also have to go out and actually apply the knowledge to internalize it. (I have been passive in that regard and it bothers me.)

And we must, above all, figure out how to communicate the skill; which may not be a matter for declarative statements alone. Martial artists spar with each other, execute standard forms, and are watched throughout by their teachers. Calculus students do homework, and check their answers. Olympic runners continually try to beat their best previous time, as measured by a stopwatch.

Hmm, I haven't seen much progress on that front here. There are a few small calibration tests on the web that have been linked now and then, but not many real exercises.

People are working on exercises, but they don't directly compare rationality skills.

Do people in meetup groups have something where they do stuff like this?


I don't know a lot about martial arts, so I looked them up. Wikipedia, line one:

Martial arts are extensive systems of codified practices and traditions of combat.

So far so good for the analogy. Rationality is a system of practices and traditions of thinking. Wikipedia, line two:

Martial arts all have similar objectives: to physically defeat other persons or defend oneself or others from physical threat.

Now I'm stumped. Does rationality have an objective?

My understanding from movies is that a trained martial artist can defeat an untrained opponent, even if that opponent is larger or has better weapons. Or, as in the post, break a thick board with his fist. What opponents can you defeat or what cool tricks can you do with rationality training, that you couldn't without?

I would say that the analogous objective of rationality is to protect oneself from mental threats: dark arts, misleading questions, tempting but wrong arguments... where specific biases would constitute specific types of attacks.

A couple of interesting corollaries from that line of thought: 1) as in a physical confrontation, mere awareness of the form an attack may take doesn't always help; 2) as with martial arts, in mental defense you have the option of developing a large number of highly specific defenses, or a smaller number of more generic ones.

It does seem a little limiting to consider rationality nothing more than a mental form of self-defense, but I would argue that the higher levels of martial arts offer far more than that and, like rationality, aim (among other things) for holistic life improvement.

An anecdote from my martial arts background:

A student asked, "Sensei, what would you say if I came into the dojo tomorrow and told you I had been attacked in a dark alley, and that I had protected my child who was with me, and defeated my attackers, and escaped unharmed?"

The teacher responded, "I would say that I had failed you as a teacher, because the ultimate goal of our art is not to defeat attackers, but simply not to be present when the attack comes."

Isn't the objective of rationality to correctly align our beliefs with reality, so that they may pay rent when we try to achieve our goals?

Protecting oneself against manipulation, learning to argue correctly and getting used to being defeated are all byproducts of the fact that there is only one reality, independent of the mind.


I like this formulation

The objective of rationality is to correctly align beliefs with reality.

by itself.

I think that we can take something clear and simple from the posts below: rationality should not only help you accomplish your goals, but also help you define goals clearly and identify easy and (more importantly) useful goals that are likely to induce a prolonged (preferably indefinite) period of well-being.


Can we at least agree that these three imperatives

  1. Believe true things
  2. Achieve goals
  3. Induce well-being

are not identical? There seems to be a "rationality thesis" here that the best way to go about 2 and 3 is to sort out 1 first. I would like to see this thesis stated more clearly.

This may very well be the case today, or in our society, but it's not really difficult to imagine a society in which you have to 'hold' really crazy ideas in order to win. Also, believing true things is an endeavour that is never complete per se: it surely is not possible to have it sorted out simpliciter before attaining 2 (the third imperative I really see as a subgoal of the second).

After all, the thesis conflicts with basically the entire history of humanity: Homo sapiens has won more and more without attaining perfect accuracy. However, it seems to me that it has won more where it has accumulated more truths.

So I wouldn't really say that in order to win you have to be accurate, but I think a strong case can be made that accuracy enhances the probability of winning.

What, then, is the real purpose of rationality? I'm perfectly fine if we accept the conjunction "truth /\ winning", with the provision that P(winning | high degree of truth) > P(winning | low degree of truth). However, if Omega were to pop up and ask:

You must choose between two alternatives. I can give you the real TOE and remove your cognitive biases if you agree to live a miserable life, or you can live a very comfortable and satisfying existence, provided that you let me implant the belief in the Flying Spaghetti Monster.

I confess I would guiltily choose the second.
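As a toy illustration of that provision (the model and the numbers are mine, purely illustrative): an agent betting on a biased coin wins more often when its believed bias matches the true one, i.e. P(winning | high degree of truth) > P(winning | low degree of truth).

```python
import random

def win_rate(true_p: float, believed_p: float, trials: int = 100_000) -> float:
    """Fraction of flips the agent calls correctly, always betting on
    the side it believes is likelier."""
    wins = 0
    for _ in range(trials):
        flip_heads = random.random() < true_p
        guess_heads = believed_p >= 0.5
        wins += (guess_heads == flip_heads)
    return wins / trials

random.seed(0)
print("accurate belief:", win_rate(true_p=0.7, believed_p=0.7))  # ~0.70
print("crazy belief:   ", win_rate(true_p=0.7, believed_p=0.2))  # ~0.30
```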

Sure. The objective of rationality is to achieve your goals as well as possible. Rationality doesn't tell you what your goals are, and martial arts don't tell you which people to defeat.

Rationality doesn't tell you what your goals are, and martial arts don't tell you which people to defeat.

It does, surprisingly. If you don't know what your goals are, there are worse and better ways of figuring that out, with errors on this level having pronounced if subtly hard-to-notice consequences. There is probably even a sense in which it's impossible to know your goals (or their definition, etc.) exactly, to reach a point where you are allowed to stop creatively working on the question.

I agree that rationality should help you figure out your instrumental goals, but it's easy to view this as 'a way to better achieve your higher-level goals'.

Not just instrumental goals. If you believe that you should achieve something, it doesn't automatically mean that you really should. Your belief is a fact about your brain, which is not always in good alignment with your values (even though it really tries).

When you notice that you want something (as a terminal goal), you are reflecting on the fact that your brain, probably the best value-estimating apparatus you've got, has calculated that pursuing this goal is good. It could be wrong; it's your job now to figure out whether it made an error in that judgment. Maybe you can find a way to improve on its reasoning process, compensating for a specific flaw and thus gaining access to a superior conclusion produced by the improved procedure (which is often ultimately the point of knowing how things work). (Or maybe you'll even find an argument that makes taking into account what your own brain tells you in a given instance a bad idea.)

Your belief is a fact about your brain, which is not always in good alignment with your values (even though it really tries).

But where do values reside? How do you know that your belief did not correspond to your values?

Where does truth about arithmetic reside? How can you ever find out that you've miscalculated something? Apply similar principles to moral questions.


The objective of rationality is to achieve your goals as well as possible.

Too general, and maybe false. Many people, rational and not, are interested in and successful at achieving their goals well. And: Less Wrong is sometimes a seminar on how to achieve your goals, but it is not always and only that (I hope!).

They are rational to the extent that they are interested and successful at achieving their goals.

They are rational to the extent that they are interested and successful at achieving their goals.

Imagine two people, Alice and Bob, share the goal of deadlifting X lbs. Alice and Bob are equally "interested and successful at achieving" all their other goals besides deadlifting X lbs. Bob is stronger than Alice. Therefore, he is more likely to be able to deadlift X lbs. Can we thereby conclude that Bob is more rational than Alice?

You say "all else equal" here. But all else clearly isn't equal - they have different genders.

All else being equal, yes I'd expect deadlift weight to be somewhat correlated with rationality.

You say "all else equal" here. But all else clearly isn't equal - they have different genders.

You assumed that Alice was a girl (normally a good guess), but I never specified a gender in my thought experiment. Then again, they have different names, etc. But this misses the point of my "all else equal" clause, which refers to their interestedness and successfulness (besides their (probable) success at deadlifting), not a myriad of accidental features.

Imagine two people, Alice and Bob, share the goal of deadlifting X lbs. Alice and Bob are equally "interested and successful at achieving" all their other goals besides deadlifting X lbs. Bob is stronger than Alice. Therefore, he is more likely to be able to deadlift X lbs. Can we thereby conclude that Bob is more rational than Alice?

No. It is incredibly weak evidence that Bob is more rational than Alice.
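A rough way to see why it is so weak, with made-up numbers: by Bayes' theorem the posterior odds equal the prior odds times the likelihood ratio, and since strength depends only faintly on rationality (if at all), that ratio is barely above 1:

```latex
\underbrace{\frac{P(R \mid \text{stronger})}{P(\neg R \mid \text{stronger})}}_{\text{posterior odds}}
= \underbrace{\frac{P(\text{stronger} \mid R)}{P(\text{stronger} \mid \neg R)}}_{\approx\ 0.51/0.49\ \approx\ 1.04}
\times \underbrace{\frac{P(R)}{P(\neg R)}}_{\text{prior odds}}
```

where R is "Bob is more rational than Alice". A shift in odds of a few percent leaves any reasonable prior essentially untouched.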

Many people, martial artists and not, are interested in defending themselves and others from physical threat. That doesn't mean that Wikipedia's definition of martial arts is too general or false.

(Although, actually, it's too specific in this case, since a lot of martial artists are not interested in the defense aspects, but more in physical fitness or enlightenment or whatever.)


By "many people" I might have meant "every creature that can be said to have goals at all."

I could quibble with "successful at", but I think the analogy still holds in any case. Virtually everyone is interested in defending themselves, at least, from physical threat.

Martial arts are one approach to being more effective at defense, and rationality is a similar approach to being more effective at reaching goals in general.


We should absolutely be quibbling about "successful." Someone comes to me with advice for achieving my goals: "I know just the ticket, all you have to do is swallow this giant pack of lies." Well, couldn't they be right?

I think it's a rare individual who would actually be in less physical danger if they were better at martial arts. The scope of rationality is similarly limited -- it's not useful for everyone, or for every goal.

I think it's a rare individual who would actually be in less physical danger if they were better at martial arts.

Do you think that because you believe most people don't experience physical danger? Or because you think that martial arts is ineffective in dealing with the most common types of danger? Or some other reason?


I think martial arts are unnecessary for dealing with the most common types of danger.

The most valuable lesson I ever learned from martial arts was how to fall down without hurting myself, and I'd say this is a skill that would help most people significantly reduce the number and severity of physical injuries they experience over their lifetime.

Tangential point: breakfall is the exact wrong thing to do if you've lost your balance while jumping on a trampoline -- found that one out the hard way. But really this comment should be filed under Cached Thoughts.


That's interesting. Is that a consequence of your holistic knowledge of martial arts or a single technique that could be taught on its own? Can the technique be taught e.g. to elderly people who are not in good shape?

It's actually a corpus of techniques that can be taught separately from the rest of the martial arts syllabus. Collectively they are called "breakfall".

ETA:

Can the technique be taught e.g. to elderly people who are not in good shape?

There are very gentle intro exercises which involve starting from a seated position; however, it's conceivable that a sufficiently frail person might not be able to manage even those.

Intro judo classes emphasize safe falling quite a bit. I have no idea if anyone teaches judo to elderly people, though.

The knowledge is basically muscle memory: we didn't spend a lot of time learning the formal breakfall techniques, so much as every class involved falling or being knocked over from a variety of awkward positions, on the order of 100 times per class. So although it might be possible to teach the elderly the techniques (Cyan sounds like ey knows more about this than I do), the way I learned them probably wouldn't be a good way to do it.

I have found the experience transferable, though, to situations like skiing, slipping on icy ground, crashing my bike, etc.