The main point
If we are naturally biased and limited in how rational our decision making can be, it might be more effective to improve intuition and manage biases rather than trying to eliminate them entirely in pursuit of unbiased thinking. I’m not advocating irrationality; rather, I’m interested in how to deal with bounded rationality. How can we improve our ability to satisfice? How can we productively use and develop our intuition?
While I intend for this post to be informative, it may not be, and I suspect I’ll learn more from writing it and from feedback than others will get from reading it. There is much on LessWrong I haven’t read, so I apologize if this is something that was discussed extensively before. If that is the case, I’d appreciate links to those discussions in the comments. One other note: I realize intuition can be interpreted as a dirty word, with good reason. Here, please interpret intuition as a rational heuristic guided by experience. Heuristics are tools; they can be effective when used properly.
This post was prompted by some casual research on expertise and decision making I've been doing for a couple of months now. Along the way I came across the work of Herbert Simon and Gary Klein. Simon’s ideas of bounded rationality and satisficing have come up in discussion here before, but I haven’t seen any discussion of Klein, who is an active researcher in the area of intuitive decision making. Simon proposes that rationality in decision making is limited by the constraints of time, available information, and brain/computational capabilities (bounded rationality). Rather than make perfectly rational decisions, we make the best decisions we can given those constraints (satisficing). Klein’s work is primarily focused on how experts actually make decisions in the real world. Put another way, his work is focused on how experts satisfice.
One of Klein’s early research projects was to help the military improve decision making. His proposal was to study firefighters, figuring their experience would be a useful analog to military experience (high pressure, life/death, time...). The major finding was that they did not follow the standard model of good decision making: they did not evaluate multiple options. Instead, they believed they operated by feel. Generally, the firefighters had one option they were considering. After further dialogue, Klein realized that they were forming mental models of the situations based on recognition/experience, which he calls recognition primed decision making. Using those mental models, they would mentally simulate what might occur if they followed their first instinct and go from there. A similar process was seen in other areas of expertise as well. For example, expert chess players aren’t evaluating more options or seeing farther ahead than novices; they just have better recognition/heuristics developed through experience. One potential problem with using chess as an analog for other skills is that its rules are clearly defined and all of the information you need is on the board (or in your opponent’s head...). The real world isn’t quite so clean, which might imply that rationality is generally more limited/bounded than in the case of chess.
From what I’ve been reading, I guess I’m skeptical about efforts to be more rational in everyday life. Certainly it is possible, but it may be unnatural enough to be impractical. Even then, it will still be bounded. If this is the case, it might be more effective to train in some mix of intuitive and rational decision making, while understanding the limits of both. While I haven’t read it yet, I believe Daniel Kahneman refers to intuition and rationality as System 1 and System 2 in “Thinking, Fast and Slow”. Kahneman has seemed skeptical of our ability to improve decision making and reduce bias in the interviews I’ve read; Klein is a bit more optimistic.
Methods for improving intuition I’ve found so far
(1-5 from Klein/Fadde - Deliberate Performance, 6-7 from Klein - The Power of Intuition)
Intuition comes from experience/expertise. Expertise is developed through deliberate practice. The methods here are intended to accelerate the learning process.
1. Estimation
This is fairly simple: estimate something and compare your estimate to reality. How long do you expect a project to take? Think through it, discuss with others, and record your estimate and reasoning. Later, after you’ve completed the project, go back and look at your prediction. Where did you go wrong?
One of the keys to effective and efficient learning is timely feedback. If your project will last months, you’ll probably want to do other estimates in the meantime, preferably with tighter feedback loops. An example given in the paper is giving a presentation. How long do you expect discussion to take on each point? What questions will be asked? After the presentation, compare the prediction to reality. For what it’s worth, I believe Peter Drucker also recommends something similar in his article “Managing Oneself”. If I remember correctly, Drucker recommends keeping a decision journal, where you record the decisions you make and return later to see what you got right or wrong (and why).
2. Extrapolation
In “Sources of Power”, Klein discusses an example of extrapolation. Engineers are trying to estimate the cost and time required to build various components of airplanes that haven’t yet been built. Since the parts don’t yet exist in some cases, they need to find a way to provide reasonable estimates. To do this, they try to find an analog. Are there any parts that are similar? How long did they take to produce and how much did they cost? It may take combining several analogs to come up with an estimate. The idea is to use what you have and know to model what you don’t have and know. Again, compare the extrapolation to the actual results when you are able to, so you can improve next time.
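One simple way to combine several analogs is a weighted average, giving closer matches more influence. This sketch is my own; the numbers and the similarity weights are invented for illustration and have nothing to do with Klein’s actual engineering example:

```python
# Hypothetical analogs: existing parts similar to the new, unbuilt component.
# Each entry: (cost in $k, build time in weeks, similarity weight in 0..1).
analogs = [
    (120.0, 8.0, 0.9),   # close match
    (200.0, 14.0, 0.5),  # partial match
    (90.0, 6.0, 0.7),
]

def weighted_estimate(parts):
    """Blend known analogs into one estimate, weighting closer matches more."""
    total_weight = sum(w for _, _, w in parts)
    cost = sum(c * w for c, _, w in parts) / total_weight
    time = sum(t * w for _, t, w in parts) / total_weight
    return cost, time

cost, time = weighted_estimate(analogs)
print(f"estimated cost: ${cost:.0f}k, time: {time:.1f} weeks")
```

As with estimation, the value comes from closing the loop: once the real part is built, compare its actual cost and time against the extrapolation and adjust how you pick and weight analogs next time.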
3. Experimentation
At its most basic, experimentation can be just trying something and seeing what happens. A better way is to form a hypothesis (estimation/prediction/extrapolation) prior to trying. This promotes better learning, because you have a concrete expectation that is either verified or not. If not, you’ve learned something new. Perhaps you’ll even be surprised, and surprises can lead you in new directions.
A more personal example: For the longest time, I had trouble with directions where I currently live. I’d look at a map, figure out how to get somewhere, and try not to vary things too much so I wouldn’t get lost. I rapidly improved by using a GPS, and ignoring it until I needed it. I’d have an idea of how to get there (from looking at maps), but I wouldn’t have it down 100% or I’d try to improvise. The GPS allowed me to experiment without the risk of getting lost, and it provided instant feedback.
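The hypothesis-first loop above can be reduced to a tiny pattern: state the prediction, run the trial, and flag any surprise for follow-up. This is my own toy sketch, with a made-up navigation example; the function and its fields are not from any source:

```python
def run_experiment(hypothesis, predicted, trial):
    """Record a prediction, run the trial, and flag surprises for review."""
    actual = trial()
    return {
        "hypothesis": hypothesis,
        "predicted": predicted,
        "actual": actual,
        "surprised": actual != predicted,  # a surprise means the model needs updating
    }

# Toy example: does the back route beat the usual 20-minute drive?
result = run_experiment(
    hypothesis="The back route is faster than the highway",
    predicted=True,
    trial=lambda: 17 < 20,  # stand-in for the real measurement
)
if result["surprised"]:
    print("Surprise - update your mental model:", result)
```

The `surprised` flag is the interesting part: surprises are exactly the cases worth examining further, since they indicate a gap between your mental model and reality.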
4. Explanation
This section is rather short, so I am not entirely sure whether I understand it correctly. I think the intent is to make sense of what you’ve seen and learned from the other three Es.
One way to do this might be to use what Scott Young calls the Feynman Technique (Learn Faster with the Feynman Technique):
Step 1. Choose the concept you want to understand.
Take a blank piece of paper and write that concept at the top of the page.
Step 2. Pretend you’re teaching the idea to someone else.
Write out an explanation of the topic, as if you were trying to teach it to a new student. When you explain the idea this way you get a better idea of what you understand and where you might have some gaps.
Step 3. If you get stuck, go back to the book.
Whenever you get stuck, go back to the source material and re-learn that part of the material until you get it enough that you can explain it on paper.
Step 4. Simplify your language.
The goal is to use your words, not the words of the source material. If your explanation is wordy or confusing, that’s an indication that you might not understand the idea as well as you thought – try to simplify the language or create an analogy to better understand it.
5. Feedback/Coaching (Emulation?)
Feedback is critical. While you might not have a coach at work, you can still find someone to emulate. Assuming they are available for questions, you can try to predict what they would do in a situation. When something does not go as you’d expect, explain your thinking and ask for feedback. In my experience, other people are busy with their own work, so coaching/mentorship takes a backseat to more urgent matters. Plus, some people just don’t want to be bothered. In that case, I think the best thing to do is to get good at asking effective questions and to come well prepared.
6. PreMortem
Imagine that what you are trying to do has failed completely, and that you are now doing a post mortem: what went wrong? The reasoning behind this technique is that people don’t want something to fail (or to look like they want it to fail); assuming it has already failed reduces that bias. Apparently the idea for the technique came from Mitchell, Russo, and Pennington, “Back to the Future: Temporal Perspective in the Explanation of Events”. That paper doesn’t entirely support the technique, stating that it produces more reasons that are episodic, but not necessarily better ones (they were unable to judge the value of the reasons given). Klein uses this technique regularly in meetings; the general impression is that it reduces confidence in the plan, as intended. From there, the group tries to prepare for the potential problems.
In a sense, this is similar to red teams but easier to implement and less resource intensive.
7. Identify Decisions/Decision Making Exercises/Decision Making Critiques
In “The Power of Intuition”, Klein advocates identifying decisions where problems have occurred. When reviewing each decision, note what makes it difficult, what kinds of errors are often made, how an expert might approach it differently than a novice, and how the decision can be practiced with feedback.
Those decisions are then turned into scenarios which can be repeatedly practiced (typically in groups). Start by describing the events that led to the decision. The players are then told what they are trying to achieve, the context, and the constraints. Try to include a visual representation whenever possible.
After the exercise, critique the decision and the process used to make it. Start with a timeline and identify key judgments. For each of the key judgments, note why it was difficult, how you were interpreting the situation, what cues/patterns you should have been picking up, why you chose to do what you did, and what you would’ve done differently with the benefit of hindsight.
Is there interest in this topic on LW? I’m not denying that relying on intuition alone can be dangerous, but I am very skeptical that focusing on reducing bias alone will lead to better decisions. In some cases, it may be better to admit that biases are affecting a decision. One other thing to note is that bias and mistakes are inevitable. A lot of the LW rationality posts I’ve seen focus on reducing or eliminating these mistakes. That is certainly a valid goal (at least the reduction is), but it isn’t enough. Choice/information overload can affect decisions, as can blood sugar levels (Sweet Future: Fluctuating Blood Glucose Levels May Affect Decision Making) and having to go to the bathroom (Tuk, M. et al. (2011). Inhibitory Spillover: Increased Urination Urgency Facilitates Impulse Control in Unrelated Domains. Psychological Science.).
Mistakes will happen, so you’ll have to do your best to learn from them and reduce their cost/make recovery easier. At the same time, good heuristics give you a better starting point to apply the rationality techniques. They are complementary. Worry about reducing bias after you have come up with something using expertise/intuition.
Again, this is a draft, written mostly from memory of what I’ve been reading. The primary sources were “Sources of Power” and “The Power of Intuition” (both by Klein), “The Cambridge Handbook of Expertise”, and scattered readings on/by Simon. Also of interest are this interview with Daniel Kahneman and Gary Klein (free registration required unfortunately) and this paper on deliberate performance by Peter Fadde and Klein.