Rationality is a very useful thing to learn, but while there is good curated reading material, it's not always easy reading. More than once I've wanted to introduce someone to the topics, but couldn't reasonably ask them to dig through a pile of essays.
This is an attempt to trim some of the most important texts down into something very concise and approachable. All of the original texts are well worth reading, or I wouldn't be summarizing them. I make no claim to be able to do them justice, but I can try to optimize them for different readers.
The Lens That Sees Its Flaws
Full Text by Eliezer Yudkowsky

When you look at an optical illusion, you're aware that what you're seeing doesn't match reality. As a human, you have the exceptional ability to understand this: your mental model of the world is not the same as the actual world around you. You are seeing a warped image through a flawed lens. Because you know this, you can manually correct yourself - "no, it's not actually moving" - and have a more accurate model than you would have on autopilot.
Our brains are riddled with systematic errors, mistakes people make so often you could bet money on it. But brains are not magical. The systems making those errors can be understood, anticipated, and corrected for. The human brain is a flawed lens that can see its own flaws. By learning, noticing, and correcting for distortions, the lens can become far more powerful.
What Do We Mean By "Rationality"?
Full Text by Eliezer Yudkowsky

Rationality is about being right and succeeding.

You think you have milk in the fridge when you don't, and when you come home milkless from shopping you're disappointed. You had a false belief. Your mental map of the world didn't match reality, and so it steered you into an obstacle you couldn't see. Epistemic rationality is about making your map more accurate - to seek the truth - which makes it much easier to navigate.
Instrumental rationality is using that mental map to steer towards a future you want. To win.
A rational belief is one that is more likely to be true. There are other reasons to believe something, like the belief feeling nice, but you can't draw an accurate map based on what feels nice. Expecting it to be sunny because of the forecast is rational, while expecting sun because you really want sun today is not.
A rational action is one that is more likely to lead to a future you prefer. Getting good sleep before a test is rational, and touching a fire is not (supposing you like grades and not burns).
These are such basic ideas that "rational" often doesn't need to be said. "The sky is blue" instead of "It's rational to believe the sky is blue." "People should eat well" instead of "It is rational for people to eat well."
If we were impossibly powerful calculating machines, we could statistically update our beliefs to perfectly optimize them with each new piece of evidence, and find a plan for the best expected future. People can learn to estimate this math, but no one can do it perfectly or all the time. So rationality is not a solved, perfect algorithm. It's an art of learning and correcting for our flaws, of seeing past our biases, of finding truth and winning - from within a human mind.
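As a rough sketch of the idealized updating described above - Bayes' rule, with invented numbers for the forecast example from earlier:

```python
# Idealized belief updating via Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E).
# The numbers below are invented for illustration, not from the text.
prior = 0.20                # P(rain) before checking the forecast
p_says_rain_if_rain = 0.90  # P(forecast says rain | it will rain)
p_says_rain_if_dry = 0.10   # P(forecast says rain | it stays dry)

# Total probability of seeing a "rain" forecast, across both possibilities.
p_says_rain = p_says_rain_if_rain * prior + p_says_rain_if_dry * (1 - prior)

posterior = p_says_rain_if_rain * prior / p_says_rain
print(round(posterior, 3))  # -> 0.692: belief in rain after the forecast
```

A perfect reasoner would apply this with every observation; the point here is that humans can only ever approximate it.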
People are not Automatically Strategic
Full Text by AnnaSalamon

- A student crams and highlights and remembers nothing a week later.
- A job interviewee practices by talking to themself, and blunders when asked an unfamiliar question.
- A new parent yearns to do a good job, but ends up mostly repeating what their parents did.
Why do people usually pursue their goals ineffectively, when they could find much better ways if they tried? Why would a random child fail a calculus test? Because most possible answers are wrong, and there is nothing to guide them to the correct answers. Like the calculus test, most methods of goal pursuit are bad, and there is no strong force guiding people towards effective methods.
Fairly automatically, we can tell ourselves and others that we're setting a goal, and do things that look on the surface like achieving the goal. But there are many helpful strategies that we do not use automatically.

- Ask ourselves what we really want to achieve, and how much the goal is worth to us
- Ask ourselves how we can tell if we met our goal, and how we can track progress
- Eagerly seek out helpful information for the goal
- Try different approaches and track which work
- When not exploring options, focus on what works best
- Change our environment to be motivating and free of distractions and frustrations
- Many other useful techniques...
Instead we mostly just do things, floating along on autopilot. We do what is habitual, convenient, or feels associated with the goal. But we don't intentionally search for the best path. It may not even occur to us as an option. As with exercise, knowing the benefits of acting strategically doesn't easily translate into changed behavior. This sort of behavior change doesn't happen automatically - it takes effortful retraining.
The retraining is worth doing. With better chosen and defined goals you can avoid spending time moving in the wrong direction. With better achievement strategy you can accomplish much more than someone floating on autopilot.
Use the Try Harder, Luke
Full Text by Eliezer Yudkowsky

Luke: "All right, I'll give it a try."

Yoda: "No! Try not. Do. Or do not. There is no try."

Luke raises his hand, and slowly, the X-wing begins to rise out of the water—Yoda's eyes widen—but then the ship sinks again.
An outtake in which George Lucas argues with Mark Hamill, who played Luke Skywalker:
Mark: "So next I say, 'I can't. It's too big'. Shouldn't Luke give it another shot? This is the hero who's going to take down the Empire? Luke should be showing a little backbone."
George: "No. You give up, and you say, 'You want the impossible'."
Mark: "Impossible? The X-wing was already starting to rise out of the swamp! Luke loses it for a second—and now he says it's impossible? Yoda, who's got eight hundred years of seniority in the field, just told him it should be doable. Let's cut to the next scene with the words 'one month later' and Luke is still raggedly standing in front of the swamp, trying to raise his ship for the thousandth time—"
George: "No."

Mark: "Five goddamned minutes! Five goddamned minutes before he gives up!"

George: "I am not halting the story for five minutes while the X-wing bobs in the swamp like a bathtub toy."
Mark: "If a pathetic loser like this could master the Force, everyone in the galaxy would be using it! The audience isn't going to buy it."
George: "They're not going to notice anything out of the ordinary. Look, you don't understand human nature. People wouldn't try for five minutes before giving up if the fate of humanity were at stake."
Your Strength as a Rationalist
Full Text by Eliezer Yudkowsky

Someone said that their friend had chest pains, and called an ambulance. The paramedics said it was nothing, and left.
I was confused. I'd read that homeless people would call ambulances to be somewhere warm. The paramedics had to take them, even on the 27th time, to avoid being sued. Anyone reporting chest pains should have been taken by an ambulance.
This is when I failed as a rationalist. I remembered my doctor not worrying about symptoms I found alarming. The doctor was always right. Once I had chest pains, and the doctor explained it was just chest muscle pain. So I said, "If they said it was nothing, it must be nothing. They'd take him if there was any chance of trouble."
I explained the story with my existing model, though it felt a little forced . . .
Later the storyteller found out that his friend had made it all up. I should have realized a stranger might be less reliable than a journal article. But belief is easy and instinctive, while disbelief requires effort.
I forced my model of reality to explain an anomaly that never happened. I knew how embarrassing this was. The usefulness of a model is not what it can explain, but what it can’t. A model that can allow anything tells you nothing about the world.
Your strength as a rationalist is your ability to be more confused by fiction than by reality. If you can explain any outcome, you have zero knowledge.
I had the information I needed to find the correct answer. I even noticed the confusion, and then ignored it. The confusion was a Clue, and I threw it away.
I should have paid attention to the feeling "it feels a little forced" - one of the most important feelings a truth-seeker can have. It feels like a quiet strain in the back of your mind, but should feel like an alarm siren: Either Your Model Is False Or This Story Is Wrong.
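The point that a model which can explain anything tells you nothing has a numeric form: in Bayesian terms, a model that "explains" every outcome equally well generates no update at all. A toy sketch with invented numbers (the helper function is mine, not from the essay):

```python
# How much should an observation E shift belief in a model?
# Posterior P(model | E) via Bayes' rule, from illustrative numbers.
def posterior(p_e_if_model, p_e_otherwise, prior):
    p_e = p_e_if_model * prior + p_e_otherwise * (1 - prior)
    return p_e_if_model * prior / p_e

# A model that can explain anything: the outcome was equally likely
# either way, so observing it teaches us nothing.
print(posterior(0.5, 0.5, 0.5))   # -> 0.5, unchanged from the prior

# A model that forbids most outcomes: surviving the test is real evidence.
print(posterior(0.9, 0.2, 0.5))   # -> ~0.82
```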
The Meditation on Curiosity
Full Text by Eliezer Yudkowsky

A rationalist must question their beliefs - but there's a problem with doing something because we must. It's common to want to have done hard work, but not want to do it, like wanting to be an author but not wanting to write. Feeling that you must question your beliefs leads to wanting to have investigated, rather than wanting to investigate.
This leads to motivated stopping: doing the minimum amount of work to feel as if we've questioned our beliefs. Our questioning is fast and easy and familiar. We rehearse evidence we already know. We criticize our argument's strong points rather than its weak points. Our goal is to have investigated, to keep our old plans and beliefs without guilt.
When we're actually curious, we'll look for the questions most likely to change our beliefs - to make us learn. Our mental model should change, and if we're genuinely curious, we have no preference about which way it changes.
A burning need to learn is far better than just a promise. But we can't force ourselves to need. When you only have a promise, look for sparks of interest, gaps of ignorance, and any possibilities you flinch away from. Remember that, before you look, any new piece of evidence could raise your confidence or lower it.
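That last sentence has a formal counterpart, conservation of expected evidence: before you look, your probability-weighted average posterior must equal your prior. A quick check with invented numbers:

```python
# Conservation of expected evidence: evidence can only raise belief on
# some outcomes by lowering it on others. Illustrative numbers.
prior = 0.3          # P(H) before observing
p_e_if_h = 0.8       # P(E | H)
p_e_if_not_h = 0.4   # P(E | not H)

p_e = p_e_if_h * prior + p_e_if_not_h * (1 - prior)   # P(E)
post_if_e = p_e_if_h * prior / p_e                    # P(H | E)
post_if_not_e = (1 - p_e_if_h) * prior / (1 - p_e)    # P(H | not E)

# Weight each possible posterior by how likely you are to see it.
expected_posterior = post_if_e * p_e + post_if_not_e * (1 - p_e)
print(round(expected_posterior, 6))  # equals the prior, up to float error
```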
If the box contains a diamond,
I desire to believe that the box contains a diamond;
If the box does not contain a diamond,
I desire to believe that the box does not contain a diamond;
Let me not become attached to beliefs I may not want.
Imagine your belief is false. See the value in understanding that it is false, in that world.
If you think you already know something, if you think you can't learn, then you can't learn. Find any shred of uncertainty, guard it and nurse it. Make it blaze into a flame of curiosity, and give yourself direction and purpose.
The Importance of Saying "Oops"
Full Text by Eliezer Yudkowsky

When Enron was collapsing, the executives never admitted to mistakes. When catastrophe #247 required a policy change, they would say, “Too bad—it was such a good idea,” instead of, “It's obvious now that it was a mistake from the beginning,” or, “I’ve been stupid.” There was no humbling realization. After the bankruptcy, the former COO testified that Enron had been a great company.
Not every change is an improvement, but every improvement is a change. If we only allow small errors, we can only make small changes. The motivation for a big change comes from acknowledging a big mistake.
When I was younger, I made a very large mistake. After I had finally admitted it, I looked back at the path that led me to that admission, and saw a painfully long series of small concessions, changing my mind as little as possible each time, in small tolerable amounts. I could have moved so much faster, I realized, if I had simply screamed “OOPS!”
There is a powerful advantage to admitting you have made a large mistake. It’s painful. It can also change your whole life.
It is important to have the humbling realization all at once, not to divide it into palatable bite-size mistakes, stretching the battle out over years - being Enron.
Avoid thinking "I was right in principle", or "It could have worked." Defending your pride in this moment ensures you will again make the same mistake, and again need to defend your pride. Better to swallow the entire bitter pill in one terrible gulp.
The Martial Art of Rationality
Full Text by Eliezer Yudkowsky

Rationality is the martial art of mind. A martial art doesn't require powerful muscles. If you have a hand, you can learn to make a fist. If you have a brain, you can learn to use it well. It’s about training the brain-machinery we all have, fixing the errors human brains make. Minds are harder to train than hands, but it is not by bigger muscles that humans rose to control the Earth.
It's easy to find a martial arts dojo. Why aren’t there dojos for rationality? Rationality is harder to verify. There are no boards to break. A teacher cannot easily check your form or demonstrate correctness. It is much harder to pass on techniques to students.
But in the last few decades we have learned much more about rationality. Heuristics and biases, Bayesian probability, evolutionary and social psychology. We may now be able to clearly see the muscles of our brains and develop the martial art of mind: to refine, share, and systematize techniques of rationality.
We must pull the right levers at the right time on a huge thinking machine that is mostly beyond our reach. Some of the machinery is optimized by evolution directly against our goals. Our brains have built-in support for rationalizing falsehoods. We can try to compensate for flaws, but we can’t rewire the machine. But we *are* able to introspect. The inner eye sees a blurry and warped view, but it does see. We must apply the science to our intuitions, to correct our mental movements and improve our skills.
We aren't controlling a puppet - it is our own mental limbs that we must move. We must connect theory to practice. We must come to see what the science means, for ourselves, for our daily inner life.
Twelve Virtues of Rationality
Full Text by Eliezer Yudkowsky

Curiosity

A burning need to know is better than a promise to learn. You must recognize your ignorance and desire to solve it. If you think you already know, then you cannot learn. Mysteries are for solving. Ignorance is not a virtue - it is better to relinquish it.
Relinquishment
"That which can be destroyed by the truth, should be.”[1] Do not avoid evidence that could destroy your beliefs. The thoughts you avoid control you most. Relinquish your attachment to your beliefs, and feel what fits the facts. Evaluate your beliefs first, and then arrive at your emotions. Beware, lest you become attached to beliefs you may not want.
Lightness
Let the winds of evidence blow you as though you are a leaf, with no direction of your own. Do not defend against the evidence, learning only when forced, as if each concession were a loss. Surrender to the truth as quickly as you can, as soon as you realize you're resisting the wind. Betray your belief to a stronger enemy. You cannot map a city by sitting with your eyes shut.
Evenness
One who wants to believe asks, “Can I believe?” One who doesn't asks, “Must I believe?” Beware of being skeptical only of ideas you dislike. If you pick and choose your evidence, then the more you research, the less you know. If you're critical of only ideas you dislike, then the better you get at finding flaws, the stupider you become. You are the judge of your theories. Do not argue for one side or another. If you knew the destination, you would already be there.
Argument
Avoiding disagreement prevents learning from those who know better. Strive to be exactly honest in argument, to help yourself and others. It is not a kindness to accept an argument you think is false. It is not "fairness" to accept all positions equally. Find a test that lets reality judge between you.
Empiricism
Observation -> knowledge -> prediction. If a tree falls in a forest and no one hears it, does it make a sound? People may disagree because they use different definitions, but no one expects an observable difference in the world. Do not ask which beliefs to hold, but which experiences to anticipate. Know which difference-in-experience you are arguing about, and keep focused on it, or you risk arguing about nothing.
Simplicity
"Perfection is achieved not when there is nothing left to add, but when there is nothing left to take away."[2] When you profess a complex belief, each detail is another chance for the belief to be wrong. Each specification adds to your burden. Lighten your load when you can. A chain of a thousand links breaks at a single point of weakness. Be careful on every step.
Humility
To be humble is to compensate for your expected errors. To confess your fallibility and then do nothing about it is not humble - it is boasting of modesty. Prepare for the errors in your beliefs and plans. It is easy enough to win arguments and become overconfident, but that others are doing worse does not excuse you. To be human is to make errors.
Perfectionism
The more errors you correct in yourself, the more you notice. Noticing an error in yourself is your chance to improve. If you tolerate the error, you won't gain the skill to notice new errors. Perfection is impossible - but we should try for it anyway. Appreciate how far you've come, but strive always to improve.
Precision
42 is between 1 and 100, and between 40 and 50. Both statements are correct, but the second is more useful and more easily falsified. The narrowest statements say the most. The truth is precise. The power of evidence to update your beliefs is precise. While you can't calculate the exact values, know that they are there, and reach for them.
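The precise "power of evidence" mentioned here can be sketched in odds form: each independent observation multiplies your odds by its likelihood ratio. The helper and the numbers below are invented for illustration:

```python
# Updating in odds form: odds(H) *= P(E | H) / P(E | not H).
# Three independent observations, each 4x likelier under H, multiply
# the odds by 4^3 = 64 - a precise, calculable weight of evidence.
def apply_evidence(belief, likelihood_ratio):
    odds = belief / (1 - belief)
    odds *= likelihood_ratio
    return odds / (1 + odds)

belief = 0.5                       # even odds to start
for _ in range(3):
    belief = apply_evidence(belief, 4.0)
print(round(belief, 4))            # -> 0.9846, i.e. odds of 64:1
```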
Scholarship
Study many sciences and absorb their power. If you swallow enough sciences, your knowledge will become a unified whole. Learn what you need to improve your rationality, and what you need to achieve your goals.
The Void
In The Book of Five Rings:
The primary thing when you take a sword in your hands is your intention to cut the enemy, whatever the means. Whenever you parry, hit, spring, strike or touch the enemy’s cutting sword, you must cut the enemy in the same movement. It is essential to attain this. If you think only of hitting, springing, striking or touching the enemy, you will not be able actually to cut him. More than anything, you must be thinking of carrying your movement through to cutting him.[3]
Every step of your reasoning must bring you toward the truth. All that matters is finding the correct answer.
A list of techniques is not the heart of truth-seeking. True rationality is not about "the Way," it is about the truth. If you become very skilled, you may see how all techniques are one technique.
"When you appreciate the power of nature, knowing the rhythm of any situation, you will be able to hit the enemy naturally and strike naturally. All this is the Way of the Void."[3]
Curiosity, relinquishment, lightness, evenness, argument, empiricism, simplicity, humility, perfectionism, precision, scholarship, and the void.
In future, should I post summaries individually, or grouped together like this? Individual posts are more linkable and discoverable, but having a post for a full sequence of summaries might be more ergonomic to read and discuss.
Akash has written a much shorter summary of all highlights.
Images generated by Midjourney, prompted by the post title.
The Lens That Sees Its Flaws
Full Text by Eliezer Yudkowsky
When you look at an optical illusion, you're aware that what you're seeing doesn't match reality. As a human, you have the exceptional ability to be able to understand this, that your mental model of the world is not the same as the actual world around you. You are seeing a warped image through a flawed lens. Because you know this, you can manually correct yourself - "no, it's not actually moving" - and have a more accurate model than you would have on autopilot.
Our brains are riddled with systematic errors, mistakes people make so often you could bet money on it. But brains are not magical. The systems making those errors can be understood, anticipated, and corrected for.
The human brain is a flawed lens that can see its own flaws. By learning, noticing, and correcting for distortions, the lens can become far more powerful.
What Do We Mean By "Rationality"?
Full Text by Eliezer Yudkowsky
Rationality is about being right and succeeding.
You think you have milk in the fridge when you don't, and when you come home milkless from shopping you're disappointed. You had a false belief. Your mental map of the world didn't match reality, and so you're steered into an obstacle you couldn't see. Epistemic rationality is about making your map more accurate - to seek the truth - which makes it much easier to navigate.
Instrumental rationality is using that mental map to steer towards a future you want. To win.
A rational belief is one that is more likely to be true. There are other reasons to believe something, like the belief feeling nice, but you can't draw an accurate map based on what feels nice. Expecting it to be sunny because of the forecast is rational, while expecting sun because you really want sun today is not.
A rational action is one that is more likely to lead to a future you prefer. Getting good sleep before a test is rational, and touching a fire is not (supposing you like grades and not burns).
These are such basic ideas that "rational" often doesn't need to be said. "The sky is blue" instead of "It's rational to believe the sky is blue." "People should eat well" instead of "It is rational for people to eat well."
If we were impossibly powerful calculating machines, we could statistically update our beliefs to perfectly optimize them with each new piece of evidence, and find a plan for the best expected future. People can learn to estimate this math, but no one can do it perfectly or all the time.
So rationality is not a solved, perfect algorithm. It's an art, to learning and correcting for our flaws, to seeing past our biases, to finding truth and winning - from within a human mind.
People are not Automatically Strategic
Full Text by AnnaSalamon
- A student crams and highlights and remembers nothing a week later.
- A job interviewee practices by talking to themself, and blunders when asked an unfamiliar question.
- A new parent yearns to do a good job, but ends up mostly repeating what their parents did.
Why do people usually pursue their goals ineffectively, when they could find much better ways if they tried?
Why would a random child fail a calculus test? Because most possible answers are wrong, and there is nothing to guide them to the correct answers.
Like the calculus test, most methods of goal pursuit are bad, and there is no strong force guiding people towards effective methods.
Fairly automatically, we can tell ourselves and others that we're setting a goal, and do things that look on the surface like achieving the goal.
But there are many helpful strategies that we do not do automatically.
- Ask ourselves what we really want to achieve, and how much the goal is worth to us
- Ask ourselves how we can tell if we met our goal, and how we can track progress
- Eagerly seek out helpful information for the goal
- Try different approaches and track which work
- When not exploring options, focusing on what works best
- Change our environment to be motivating and free of distractions and frustrations
- Many other useful techniques...
Instead we mostly just do things, floating along in autopilot. We do what is habitual, convenient, or feels associated with the goal. But we don't intentionally search for the best path. It may not even occur as an option.
Like exercise, knowing the benefits of acting strategically doesn't easily transition into changed behavior. This sort of behavior change doesn't happen automatically - it takes effortful retraining.
The retraining is worth doing. With better chosen and defined goals you can avoid spending time moving in the wrong direction. With better achievement strategy you can accomplish much more than someone floating on autopilot.
Use the Try Harder, Luke - Summary
Full Text by Eliezer Yudkowsky
An outtake in which George Lucas argues with Mark Hamill, who played Luke Skywalker:
Mark: "So next I say, 'I can't. It's too big'. Shouldn't Luke give it another shot? This is the hero who's going to take down the Empire? Luke should be showing a little backbone."
George: "No. You give up, and you say, 'You want the impossible'."
Mark: "Impossible? The X-wing was already starting to rise out of the swamp! Luke loses it for a second—and now he says it's impossible? Yoda, who's got eight hundred years of seniority in the field, just told him it should be doable. Let's cut to the next scene with the words 'one month later' and Luke is still raggedly standing in front of the swamp, trying to raise his ship for the thousandth time—"
George: "No."
Mark: "Five goddamned minutes! Five goddamned minutes before he gives up!"
George: "I am not halting the story for five minutes while the X-wing bobs in the swamp like a bathtub toy."
Mark: "If a pathetic loser like this could master the Force, everyone in the galaxy would be using it! The audience isn't going to buy it."
George: "They're not going notice anything out of the ordinary. Look, you don't understand human nature. People wouldn't try for five minutes before giving up if the fate of humanity were at stake."
Your Strength as a Rationalist
Full Text by Eliezer Yudkowsky
Someone said that their friend had chest pains, and called an ambulance. The paramedics said it was nothing, and left.
I was confused. I'd read that homeless people would call ambulances to be somewhere warm. The paramedics had to take them, even on the 27th time, to avoid being sued. Anyone reporting chest pains should have been taken by an ambulance.
This is when I failed as a rationalist. I remembered my doctor not worrying at symptoms I found alarming. The doctor was always right. Once I had chest pains, and the doctor explained it was just chest muscle pain. So I said "If they said it was nothing, it must be nothing. They'd take him if there was any chance of trouble."
I explained the story with my existing model, though it felt a little forced . . .
Later he found his friend had made it up. I should have realized a stranger might be less reliable than a journal article. But belief is easy and instinctive, while disbelief requires effort.
I forced my model of reality to explain an anomaly that never happened. I knew how embarrassing this was. The usefulness of a model is not what it can explain, but what it can’t. A model that can allow anything tells you nothing about the world.
Your strength as a rationalist is your ability to be more confused by fiction than by reality. If you can explain any outcome, you have zero knowledge.
I had the information I needed to find the correct answer. I even noticed the confusion, and then ignored it. The confusion was a Clue, and I threw it away.
I should have paid attention to the feeling "it feels a little forced" - one of the most important feelings a truth-seeker can have. It feels like a quiet strain in the back of your mind, but should feel like an alarm siren: Either Your Model Is False Or This Story Is Wrong.
The Meditation on Curiosity
Full Text by Eliezer Yudkowsky
A rationalist must question their beliefs - but there's a problem with doing something because we must. It's common to want to have done hard work, but not want to do it, like wanting to be an author, but not wanting to write. Feeling that you must question your beliefs leads us wanting to have investigated, rather than wanting to investigate.
This leads to motivated stopping, doing the minimum amount of work to feel as if we've questioned our beliefs. Our questioning fast and easy and familiar. We rehearse evidence we already know. We criticize our argument's strong points rather than weak points. Our goal is to have investigated, to keep our old plans and beliefs without guilt.
When we’re actually curious, we'll look for questions likely to change our beliefs - to make us learn. Our mental model should change, and if we're genuinely curious, we should have no preference.
A burning need to learn is far better than just a promise. But we can't force ourselves to need. When you only have a promise, look for sparks of interest, gaps of ignorance, and any possibilities you flinch away from. Remember that each new piece of evidence could equally raise or lower your confidence.
Imagine your belief is false. See the value in understanding that it is false, in that world.
If you think you already know something, if you think you can't learn, then you can't learn. Find any shred of uncertainty, guard it and nurse it. Make it blaze into a flame of curiosity, and give yourself direction and purpose.
The Importance of Saying "Oops"
Full Text by Eliezer Yudkowsky
When Enron was collapsing, the executives never admitted to mistakes. When catastrophe #247 required a policy change, they would say, “Too bad—it was such a good idea” instead of , “It's obvious now that it was a mistake from the beginning” or, “I’ve been stupid.” There was no humbling realization. After the bankruptcy, the former COO testified that Enron had been a great company.
Not every change is an improvement, but every improvement is a change. If we only allow small errors, we can only make small changes. The motivation for a big change comes from acknowledging a big mistake.
When I was younger, I made a very large mistake. After I had finally admitted it, I looked back at the path realizing the mistake, and saw a painfully long series of small concessions, changing my mind as little as possible each time, in small tolerable amounts. I could have moved so much faster, I realized, if I had simply screamed “OOPS!”
There is a powerful advantage to admitting you have made a large mistake. It’s painful. It can also change your whole life.
It is important to have the humbling realization, not divide it into palatable bite-size mistakes, stretching out the battle for years - being Enron.
Avoid thinking "I was right in principle", or "It could have worked." Defending your pride in this moment ensures you will again make the same mistake, and again need to defend your pride.
Better to swallow the entire bitter pill in one terrible gulp.
The Martial Art of Rationality
Full Text by Eliezer Yudkowsky
Rationality is the martial art of mind. Martial arts doesn't require powerful muscles. If you have a hand, you can learn to make a fist. If you have a brain, you can learn to use it well. It’s about training the brain-machinery we all have, fixing the errors human brains make.
Minds are harder to train than hands, but it is not by bigger muscles that humans rose to control the Earth.
It's easy to find a martial arts dojo. Why aren’t there dojos for rationality?
Rationality is harder to verify. There are no boards to break. A teacher cannot easily check your form or demonstrate correctness. It is much harder to pass on techniques to students.
But in the last few decades we have learned much more about rationality. Heuristics and biases, Bayesian probability, evolutionary and social psychology. We may now be able to clearly see the muscles of our brains and develop the martial art of mind: to refine, share, and systematize techniques of rationality.
We must pull the right levers at the right time on a huge thinking machine that is mostly beyond our reach. Some of the machinery is optimized by evolution directly against our goals. Our brains have built-in support for rationalizing falsehoods. We can try to compensate for flaws, but we can’t rewire the machine. But we *are* able to introspect. The inner eye sees a blurry and warped view, but it does see. We must apply the science to our intuitions, to correct our mental movements improve our skills.
We aren't controlling a puppet - it is our own mental limbs that we must move. We must connect theory to practice. We must come to see what the science means, for ourselves, for our daily inner life.
Twelve Virtues of Rationality
Full Text by Eliezer Yudkowsky
Curiosity
A burning need to know is better than promise to learn. You must recognize your ignorance and desire to solve it. If you think you already know, then you cannot learn. Mysteries are for solving. Ignorance is not a virtue - it is better to relinquish it.
Relinquishment
"That which can be destroyed by the truth, should be."[1]
Do not avoid evidence that could destroy your beliefs. The thoughts you avoid control you most. Relinquish your attachment to your beliefs, and feel what fits the facts. Evaluate your beliefs first, and then arrive at your emotions. Beware, lest you become attached to beliefs you may not want.
Lightness
Let the winds of evidence blow you as though you are a leaf, with no direction of your own. Do not defend against the evidence, learning only when forced, as if each concession were a loss. Surrender to the truth as quickly as you can, as soon as you realize you're resisting the wind. Betray your belief to a stronger enemy. You cannot map a city by sitting with your eyes shut.
Evenness
One who wants to believe asks, “Can I believe?” One who doesn't asks, “Must I believe?”
Beware of being skeptical only of ideas you dislike. If you pick and choose your evidence, then the more you research, the less you know. If you're critical of only ideas you dislike, then the better you get at finding flaws, the stupider you become.
You are the judge of your theories. Do not argue for one side or another. If you knew the destination, you would already be there.
Argument
Avoiding disagreement prevents learning from those who know better.
Strive to be exactly honest in argument, to help yourself and others. It is not a kindness to accept an argument you think is false. It is not "fairness" to accept all positions equally. Find a test that lets reality judge between you.
Empiricism
Observation -> knowledge -> prediction.
If a tree falls in a forest and no one hears it, does it make a sound? People may disagree because they use different definitions, but no one expects an observable difference in the world.
Do not ask which beliefs to hold, but which experiences to anticipate. Know which difference-in-experience you argue about, and keep focused on it, or you risk arguing about nothing.
Simplicity
"Perfection is achieved not when there is nothing left to add, but when there is nothing left to take away."[2]
When you profess a complex belief, each detail is another chance for the belief to be wrong. Each specification adds to your burden. Lighten your load when you can. A chain of a thousand links breaks at a single point of weakness. Be careful on every step.
Humility
To be humble is to compensate for your expected errors. To confess your fallibility and then do nothing about it is not humble - it is boasting of modesty. Prepare for the errors in your beliefs and plans.
It is easy enough to win arguments and become overconfident, but your errors remain yours even if others are doing worse. To be human is to make errors.
Perfectionism
The more errors you correct in yourself, the more you notice. Noticing an error in yourself is your chance to improve. If you tolerate the error, you won't gain the skill to notice new errors.
Perfection is impossible - but we should try. Appreciate how far you've come, but strive always to improve.
Precision
42 is between 1 and 100, and between 40 and 50. Both statements are correct, but the second is more useful and more easily falsified. The narrowest statements say the most. The truth is precise. The power of evidence to update your beliefs is precise. While you can't calculate the exact values, know that they are there, and reach for them.
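The claim that the power of evidence is precise is meant literally: Bayes' theorem gives an exact posterior, not a vague "more likely." A minimal sketch in Python, using the forecast example from earlier - all three probabilities here are invented numbers for illustration, not real weather statistics:

```python
# Bayes' rule: P(rain | forecast) = P(forecast | rain) * P(rain) / P(forecast)
# All numbers below are illustrative assumptions.
prior_rain = 0.30               # base rate of rain
p_forecast_given_rain = 0.80    # forecast predicts rain when it actually rains
p_forecast_given_dry = 0.10     # false alarms on dry days

# Total probability of seeing a rain forecast at all
p_forecast = (p_forecast_given_rain * prior_rain
              + p_forecast_given_dry * (1 - prior_rain))

# Exact strength of the evidence: the forecast moves 0.30 to ~0.774
posterior_rain = p_forecast_given_rain * prior_rain / p_forecast
print(round(posterior_rain, 3))  # -> 0.774
```

The point is not the particular numbers but that, given them, the update is fully determined - there is one right answer to reach for.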
Scholarship
Study many sciences and absorb their power. If you swallow enough sciences, your knowledge will become a unified whole. Learn what you need to improve your rationality, and what you need to achieve your goals.
The Void
Every step of your reasoning must bring you toward the truth. All that matters is finding the correct answer.
A list of techniques is not the heart of truth-seeking. True rationality is not about "the Way," it is about the truth. If you become very skilled, you may see how all techniques are one technique.
From The Book of Five Rings: "When you appreciate the power of nature, knowing the rhythm of any situation, you will be able to hit the enemy naturally and strike naturally. All this is the Way of the Void."[3]
Curiosity, relinquishment, lightness, evenness, argument, empiricism, simplicity, humility, perfectionism, precision, scholarship, and the void.
Patricia C. Hodgell, Seeker’s Mask (Meisha Merlin Publishing, Inc., 2001).
Antoine de Saint-Exupéry, Terre des Hommes (Paris: Gallimard, 1939).
Miyamoto Musashi, The Book of Five Rings.