As Tom slips on an icy puddle, his arm automatically pulls back to slap the ground. He’s been taking Jiu-Jitsu for only a month, but he’s already practiced falling hundreds of times. Tom’s training keeps him from getting hurt.
By contrast, Sandra is in her second year of university mathematics. She got an “A” in calculus and in several more advanced courses, and she can easily recite that “derivatives” are “rates of change”. But when she goes on her afternoon walk and stares at the local businesses, she doesn’t see derivatives.
For many of us, rationality is more like Sandra’s calculus than Tom’s martial arts. You may think “overconfidence” when you hear an explicit probability (“It’s 99% likely I’ll make it to Boston on Tuesday”). But when no probability is mentioned -- or, worse, when you act on a belief without noticing that belief at all -- your training has little impact.
Learn error patterns ahead of time
If you want to notice errors while you’re making them, think ahead of time about what your errors might look like. List the circumstances in which to watch out, and the alternative actions to try in each.
Here's an example of what such a list might look like. A bunch of visiting fellows generated it at one of our rationality trainings last summer; I’m including it here (with some edits) because I found the specific suggestions useful, and because you may be able to use it as a model for your own lists.
Action ideas, for three related biases:
A. How does it help to know about overconfidence[1]? What can you do differently, once you know your impressions are unreliable?
Action ideas:
- Try many things, including things you “know” won’t work. Try cheap ones.
- Don’t be so sure you can’t do things.
- Don’t be so sure that the things you are doing are working:
- If a given “necessary” task is using a large portion of your week, test what happens if you skip that task.
- Ask others whether your efforts are working, and what you might try instead. Test their suggestions.
- Ask how you’ll know if you hit your goal: what specific observables will be different? (Not “I’ll know calculus” but “I’ll be able to solve all the problems on the AP calculus test”. Not “I’ll be happier” but “I’ll improve my score on the Beck Depression Inventory”). Track these observables.
- Be suspicious of received wisdom, since others are also overconfident. But don’t just ignore that wisdom in favor of your own error-prone impressions -- look for empirical tests.[2]
- Your friends and family are weirder (more unlike your models) than you think they are. Try to notice how.
B. How does it help to know about the conjunction fallacy? What can you do differently, once you know specific stories are less likely than we generally expect?
Action ideas:
- Use simple or disjunctive plans:
- Choose a (city/college/etc.) in which there are many promising possibilities, not one with a single, highly promising scenario.[3]
- Apply for many jobs, in many sectors of the economy.
- Gather re-purposable resources, such as money, rationality, sanity, capable friends, math skill, reading speed, mental and physical fitness. Focus on fundamentals more than on situation-specific techniques.
- Tell detailed stories when you want to convince someone:
- Describe specific scenarios to angel investors, potential customers, etc.
- Visualize specific scenarios, when you want to convince the less verbal parts of yourself that your new (exercise plan / whatever) is worth the effort.
- Don’t put all your caution into safeguarding one particular step. For example, don’t “ensure your start-up will succeed” by focusing only on the programming step, or only on the “where to sell it” step. Brainstorm many ways your plans can go wrong.
- Realize that conjunction-ridden theories (e.g. the Church-Turing thesis[4], or "I will live out my career as a mathematician") are more likely to be mistaken than you might naively think. (The sketch after this list makes the arithmetic concrete.)
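To make the conjunction arithmetic concrete, here is a minimal sketch in Python; the plan steps and probabilities are invented for illustration, not taken from any data. Even when every step of a plan looks safe on its own, the "and" of all of them is much less likely:

```python
# Toy illustration of the conjunction fallacy in planning. The step names
# and probabilities below are invented; the point is only the arithmetic.
step_probabilities = {
    "build the product": 0.9,
    "hire a good team": 0.9,
    "raise funding": 0.9,
    "beat competitors": 0.9,
    "find paying users": 0.9,
}

p_all = 1.0
for step, p in step_probabilities.items():
    p_all *= p  # every extra "and" multiplies in another chance of failure

print(f"Each step alone: 90%. All five together: {p_all:.0%}")  # about 59%
```

This is why "brainstorm many ways your plans can go wrong" beats safeguarding a single step: the risk lives in the product of all the steps, not in any one of them.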
C. How does it help to know about confabulation? (I.e., how does it help to know that you are often mistaken about your motives, and that situational factors affect you far more than most people expect?)
Action ideas:
- It’s not just that your beliefs about how to (make money / enjoy your Saturday / learn math / whatever) are probably overconfident. It’s also that they probably weren’t arrived at by asking “How can I do X?” *at all*. So get out a sheet of paper and a ten-minute timer; you may find better ideas immediately.
- Realize that you (in a narrow verbal sense) don’t choose most of your actions, even when you think you do. It’s therefore silly to expect your past choices to be the best choices you could have made, or to make up stories about why your actions were optimal.[5]
- Instead of asking “Why did I do that?”, ask “Why would someone else think I did that, if they were watching only my actions?”[6].
- Since your actions depend greatly on both habits and context:
- Train the actions you want until they’re automatic. Train the thinking habits you want, too. Don’t just verbally acknowledge their desirability.
- If you want robust change, train your new behavior *across contexts*, or tie your new actions to a “portable” context that can remain stable even when you move, change jobs, etc. (For example, build a habit of looking at your goals and mission statement every morning, or using a life coach.)
- Consider aiming for a high-status job, or a job that demands more of you, since others’ expectations may affect you more than you naively think.
- Don’t mistake background knowledge for unchangeable ability.
Do try this at home
Many of the above examples are not well-tested. So don’t rely on them. But do try them. And, when you do, tell us about it; add your data to the common LW store.
Also, practice this sort of example-generation for any rationality content that you hope to master. Now that you know about Bayes’ theorem, outside view prediction methods, confirmation bias, or any of the others -- what can you do differently at work? in your relationship? while cooking dinner tonight?
The more specific your brainstorm is, the easier it will be to actually try things.
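As one worked example of that kind of brainstorm, here is a minimal Bayes’-theorem sketch; the scenario and every number in it are invented for illustration. It turns "my new habit seemed to help this week" into a specific, modest update rather than a verdict:

```python
# Toy Bayes'-theorem illustration; all numbers are invented.
# Question: how much should one good week shift my confidence
# that a new habit is actually working?
prior = 0.3            # P(habit works), before any evidence
p_good_if_works = 0.8  # P(good week | habit works)
p_good_if_not = 0.5    # P(good week | habit doesn't work); good weeks happen anyway

posterior = (p_good_if_works * prior) / (
    p_good_if_works * prior + p_good_if_not * (1 - prior)
)
print(f"P(works | one good week) = {posterior:.2f}")  # 0.41 -- a nudge, not proof
```

The update is small because good weeks are common even when the habit does nothing; deciding what would look different if your belief were false is most of the work.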
[1] By “overconfidence”, I mean the well-documented bias whereby people think they know more than they do -- I do not mean the bias of overestimating one’s own abilities.
[2] “Empirical tests” here can include your own direct observations, friends’ anecdotes, published controlled studies, and anything else in the world that should look different if [received wisdom / your own impression] is true. Many folks just throw up their hands or take a vote when they see sources that disagree with one another; but sorting out the evidence is a learnable skill. It’s worth doing this for medical treatments, job-search strategy, driving safety, learning methods, and ... anything else that has much impact on your life.
[3] For example, prefer “I’ll go to college X, where there are many smart people and connections” to “I’ll go to college Y, which is renowned for bioinformatics in particular, since bioinformatics is my lifelong destiny and will let me work for Craig Venter”.
[4] The Church-Turing thesis may not sound like a conjunction. But for it to hold, physics needs to be as we expect along many different dimensions, which is a conjunction, and is the sort of possibility we tend to overestimate. Similarly, there are many different events that could interrupt your planned career, and we tend to overestimate the chances that all of these events, at once, will not occur. (If ten independent disruptions each have a 95% chance of not happening, the chance that none of them happens is 0.95^10, or about 60%.)
[5] But it isn’t silly to try to make your future actions more (useful/moral/whatever). Even if most actions occur by habit, you can, little by little, change your habits, and increase your self-awareness and your deliberative self-control.
[6] Or: “What would I believe about someone else, if they acted as I’ve been acting?”
Edited to add: Do please comment with your own attempts to turn LW rationality content into the kinds of specifics one can easily act on.
This article made me think of a list I've been informally trying to make, of what stupidity feels like on the inside. The point is to identify when I'm writing code poorly, since the output will probably be even more bug-ridden than normal, and may be best debugged by starting over (though starting over violates my normal policy).
Stupidity feels like being bored, being in pain, being distracted, wanting to do anything else than this. Stupidity feels like being unworthy of these divine (external) ideas. Stupidity feels like blind plodding obedience. Stupidity feels like lovely and/or grotesque baroque clevernesses.
Trying to stop working and recover when I notice myself being stupid might be the right move, but I think pushing through it (aside from staying up late, which is a mistake) is a better policy. You have to learn to be productive on demand rather than when you're in the mood for it.
"You have to learn to be productive on demand rather than when you're in the mood for it."
Very true, and it's part of the reason I like to arrange structured activities that force me, on a day-to-day basis, to do the things I enjoy. I could have taught myself programming (maybe), but taking a course as my elective forced me to actually write the code when the assignment was due the next morning, as opposed to when I felt like it. It's a good feeling, getting something done and knowing you did a good job, but it's depressing how bad I am at motivat...