It is generally assumed around here that people can learn to be more rational. That's the purpose of The Sequences, after all. And veteran Less Wrongers do seem (to me) vastly more rational than the average person.
But maybe it's a selection effect: maybe Less Wrong doesn't make people more rational; perhaps the people who are already relatively rational are simply the ones most likely to be attracted to Less Wrong.
Daniel Willingham (2008) thinks it's pretty hard to teach rationality / critical thinking,1 but what evidence do we have on the matter? Is rationality teachable?
Statistics and logic training
Statistics training appears to help. Schoemaker (1979) found that students who had taken a statistics course gave more consistent answers to questions about gambles than students who hadn't. Those without the training were also more likely to bid more money than they could possibly win.
Fong et al. (1986) found that statistical training sometimes transfers to real-world decision making. Half the men in a statistics course were interviewed at the beginning of the course, and the other half at its end. The interview was ostensibly about sports, but was designed to test for skill in applying statistics to everyday life. The men interviewed after the course did significantly better at giving statistics-informed answers than those interviewed at its beginning.
For example, when asked why a Rookie of the Year in baseball usually does less well in his second year than in his first year, those interviewed at the beginning of the course tended to give answers like "he's resting on his laurels; he's not trying so hard his second year," while those interviewed at the end of the course tended to give answers which appreciated regression toward the mean.
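To make the regression effect concrete, here is a minimal simulation (a hypothetical model, not data from any of the cited studies) in which each season's performance is stable skill plus independent luck; players selected for an outstanding first season tend to fall back toward their true skill level in the second:

```python
import random

random.seed(0)

# Hypothetical model: observed performance = stable skill + seasonal luck.
n_players = 1000
skill = [random.gauss(0.270, 0.020) for _ in range(n_players)]
season1 = [s + random.gauss(0, 0.030) for s in skill]
season2 = [s + random.gauss(0, 0.030) for s in skill]

# Select the standout rookies: the top 10 performers in season 1.
top = sorted(range(n_players), key=lambda i: season1[i], reverse=True)[:10]

avg1 = sum(season1[i] for i in top) / len(top)
avg2 = sum(season2[i] for i in top) / len(top)
print(f"Standouts' season-1 average: {avg1:.3f}")
print(f"Same players, season 2:      {avg2:.3f}")  # closer to the overall mean
```

Their second-season average drops not because they stopped trying, but because part of what got them selected was luck that doesn't repeat.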
What about logic? Nisbett et al. (1987) conducted several tests of the effects of different kinds of training on logical skills. Perhaps surprisingly, they found that logical skills (as measured by their tests) did not improve as a result of courses on formal logic or informal fallacies, nor from two years of graduate study in chemistry, but did improve as a result of two years of graduate study in law, medicine, or psychology. Additionally, Lipman (1983) found that a Philosophy for Children course increased the logical thinking skills of children in grades four through eight.
Debiasing
To examine the conflict that can arise between fast, intuitive reasoning and slow, deliberate reasoning, consider the following problems:
Imagine that in order to win a prize you have to pick a red marble from one of two urns (Urn A and B). Urn A contains 20 red and 80 blue marbles, and Urn B contains 1 red and 9 blue marbles. When you respond to the task you can compare the ratio of winning marbles in each urn (20% vs 10%), which requires some time, mental effort, and computations, or you can simply rely on the feeling/intuition that it is preferable to pick from the urn with more 'favourable events'. In this example both processes cue the normatively correct answer (that is, Urn A). On the other hand, it is possible to set up a task where [intuitive] and [deliberative] reasoning cue different responses. For example, if you can choose between picking a marble from an urn containing 10 red and 90 blue marbles, or from an urn containing 2 red and 8 blue marbles, the feeling/intuition that it is preferable to pick from the urn with more 'favourable events' results in a normatively incorrect choice.2
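Spelled out as arithmetic (a sketch of the two versions quoted above, not code from the study), the computation the deliberate route has to perform is just a comparison of ratios:

```python
# First version: intuition ("pick the urn with more red marbles") and
# computation agree.
urn_a = 20 / (20 + 80)   # 0.20
urn_b = 1 / (1 + 9)      # 0.10
print(urn_a > urn_b)     # True: Urn A wins, and it also has more red marbles.

# Second version: the two routes conflict.
urn_c = 10 / (10 + 90)   # 0.10
urn_d = 2 / (2 + 8)      # 0.20
print(urn_d > urn_c)     # True: the 2-red urn wins despite having fewer red marbles.
```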
When faced with such conflicts, most people tend to give intuitive answers rather than normatively correct answers.3 However, some studies have found that those with greater cognitive capacity (more working memory, etc.) are less likely to use intuitions inappropriately.4 Moreover, biases increase when working memory is burdened with cognitive load.5 On the other hand, many biases are just as prevalent among people of high intelligence and greater cognitive capacity as among others.6
So: does debiasing work? Can resistance to cognitive biases be taught?
Preliminary evidence suggests that debiasing can be done:
- A simple instruction to "think about alternatives" can promote resistance to overconfidence and confirmation bias. In one study, subjects asked to generate their own hypotheses were more responsive to their accuracy than subjects asked to choose from among pre-picked hypotheses.7 Another study required subjects to list reasons for and against each of the possible answers to each quiz question before choosing an answer and assessing the probability of its being correct. This procedure produced more accurate confidence judgments than those of a control group.8
- Training in microeconomics can help subjects avoid the sunk cost fallacy.9
- Because people avoid the base rate fallacy more often when they encounter problems phrased in terms of frequencies instead of probabilities,10 teaching people to translate probabilistic reasoning tasks into frequency formats improves their performance.11 (See the worked example after this list.)
- Warning people about biases can decrease their prevalence. This has been demonstrated to work with regard to framing effects,12 hindsight bias,13 and the outcome effect,14 though attempts to mitigate anchoring effects by warning people about them have produced only weak results so far.15
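To illustrate the frequency-format point above, here is a sketch of a single Bayesian problem computed two ways; the numbers (a 1% base rate, 80% sensitivity, 9.6% false-positive rate) are hypothetical illustrative values, not results from the cited papers:

```python
# Probability format: apply Bayes' theorem to the stated rates.
base_rate, sensitivity, false_pos_rate = 0.01, 0.80, 0.096
p_positive = base_rate * sensitivity + (1 - base_rate) * false_pos_rate
p_disease_given_positive = base_rate * sensitivity / p_positive

# Frequency format: the same computation phrased as counts of people.
n = 10_000
sick = n * base_rate                            # 100 people have the condition
true_positives = sick * sensitivity             # 80 of them test positive
false_positives = (n - sick) * false_pos_rate   # ~950 healthy people also test positive
freq_answer = true_positives / (true_positives + false_positives)

print(round(p_disease_given_positive, 3), round(freq_answer, 3))  # both ~0.078
```

The two calculations are equivalent, but most people find the counts-of-people framing far easier to carry out, which is what the training exploits.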
Conclusion
To be sure, many attempts to teach rationality have failed. But with results like those cited above, when I consider the prospects for teaching rationality I am hopeful, and left with a sense that more is possible. I believe we can indeed raise the sanity waterline.
Notes
1 'Critical thinking' usually refers to a subset of the most basic skills that Less Wrongers sometimes call 'rationality skills', and usually to the more qualitative forms of those skills; it is thus more aligned with Traditional Rationality than with Technical Rationality. As Willingham puts it:
In layperson’s terms, critical thinking consists of seeing both sides of an issue, being open to new evidence that disconfirms your ideas, reasoning dispassionately, demanding that claims be backed by evidence, deducing and inferring conclusions from available facts, solving problems, and so forth.
2 Chiesi et al. (2011).
3 See, e.g. Klaczynski (2001).
4 Stanovich & West (2000, 2008a).
5 De Neys (2006a, 2006b).
6 Stanovich & West (2008b, 2008c).
7 Koehler (1994).
8 Koriat et al. (1980). Also see Soll & Klayman (2004); Mussweiler et al. (2000).
9 Larrick et al. (1990).
10 Gigerenzer & Hoffrage (1995).
11 Sedlmeier (1999).
12 Cheng & Wu (2010).
13 Hasher et al. (1981); Reimers & Butler (1992).
14 Clarkson et al. (2002).
15 Block & Harper (1991); George et al. (2000).
References
Block & Harper (1991). Overconfidence in estimation: testing the anchoring-and-adjustment hypothesis. Organizational Behavior and Human Decision Processes, 49: 188–207.
Cheng & Wu (2010). Debiasing the framing effect: The effect of warning and involvement. Decision Support Systems, 49: 328-334.
Chiesi, Primi, & Morsanyi (2011). Developmental changes in probabilistic reasoning: The role of cognitive capacity, instructions, thinking styles, and relevant knowledge. Thinking and Reasoning, 17: 315–350.
Clarkson, Emby, & Watt (2002). Debiasing the effect of outcome knowledge: the role of instructions in an audit litigation setting. Auditing: A Journal of Practice and Theory, 21: 1–14.
De Neys (2006a). Dual processing in reasoning: Two systems but one reasoner. Psychological Science, 17: 428–433.
De Neys (2006b). Automatic-heuristic and executive-analytic processing in reasoning: Chronometric and dual task considerations. Quarterly Journal of Experimental Psychology, 59: 1070–1100.
Fong, Krantz, & Nisbett (1986). The effects of statistical training on thinking about everyday problems. Cognitive Psychology, 18: 253-292.
George, Duffy, & Ahuja (2000). Countering the anchoring and adjustment bias with decision support systems. Decision Support Systems, 29: 195–206.
Gigerenzer & Hoffrage (1995). How to improve Bayesian reasoning without instruction: Frequency formats. Psychological Review, 102: 684–704.
Hasher, Attig, & Alba (1981). I knew it all along: or did I? Journal of Verbal Learning and Verbal Behavior, 20: 86-96.
Klaczynski (2001). Framing effects on adolescent task representations, analytic and heuristic processing, and decision making: Implications for the normative/descriptive gap. Journal of Applied Developmental Psychology, 22: 289-309.
Koehler (1994). Hypothesis generation and confidence in judgment. Journal of Experimental Psychology: Learning, Memory, and Cognition, 20: 461-469.
Koriat, Lichtenstein, & Fischhoff (1980). Reasons for confidence. Journal of Experimental Psychology: Human Learning and Memory, 6: 107-118.
Larrick, Morgan, & Nisbett (1990). Teaching the use of cost-benefit reasoning in everyday life. Psychological Science, 1: 362-370.
Lipman (1983). Thinking Skills Fostered by Philosophy for Children. Institute for the Advancement of Philosophy for Children.
Mussweiler, Strack, & Pfeiffer (2000). Overcoming the inevitable anchoring effect: Considering the opposite compensates for selective accessibility. Personality and Social Psychology Bulletin, 26: 1142–50.
Nisbett, Fong, Lehman, & Cheng (1987). Teaching reasoning. Science, 238: 625-631.
Reimers & Butler (1992). The effect of outcome knowledge on auditors' judgmental evaluations. Accounting, Organizations and Society, 17: 185–194.
Schoemaker (1979). The role of statistical knowledge in gambling decisions: Moment vs. risk dimension approaches. Organizational Behavior and Human Performance, 24: 1-17.
Sedlmeier (1999). Improving Statistical Reasoning: Theoretical Models and Practical Implications. Erlbaum.
Soll & Klayman (2004). Overconfidence in interval estimates. Journal of Experimental Psychology: Learning, Memory, and Cognition, 30: 299–314.
Stanovich & West (2000). Individual differences in reasoning: Implications for the rationality debate. Behavioral and Brain Sciences, 23: 645–726.
Stanovich & West (2008a). Evolutionary versus instrumental goals: How evolutionary psychology misconceives human rationality. In Over (ed.), Evolution and the psychology of thinking (pp. 171–230). Psychology Press.
Stanovich & West (2008b). On the failure of cognitive ability to predict myside and one-sided thinking biases. Thinking and Reasoning, 14: 129-167.
Stanovich & West (2008c). On the relative independence of thinking biases and cognitive ability. Journal of Personality and Social Psychology, 94: 672-695.
Willingham (2008). Critical thinking: Why is it so hard to teach? Arts Education Policy Review, 109: 21-32.
It's possible that the camel has two humps (pdf). We don't have much evidence regarding the shapes of the subskill distributions. The connection between "rationality" and debiasing is also questionable (correlation versus causation, selection effects, tag-alongs, and so on). Beware motivated cognition on this subject.
Speaking of "the camel has two humps": replication has turned out to be difficult and the idea doesn't look very good anymore; see the first quote at http://www.gwern.net/Notes#the-camel-has-two-humps