
Hindsight bias

Post author: Eliezer_Yudkowsky 16 August 2007 09:58PM

Hindsight bias is when people who know the answer vastly overestimate its predictability or obviousness, compared to the estimates of subjects who must guess without advance knowledge.  Hindsight bias is sometimes called the I-knew-it-all-along effect.

Fischhoff and Beyth (1975) presented students with historical accounts of unfamiliar incidents, such as a conflict between the Gurkhas and the British in 1814.  Given the account as background knowledge, five groups of students were asked what they would have predicted as the probability for each of four outcomes: British victory, Gurkha victory, stalemate with a peace settlement, or stalemate with no peace settlement.  Four experimental groups were respectively told that these four outcomes were the historical outcome.  The fifth, control group was not told any historical outcome.  In every case, a group told an outcome assigned substantially higher probability to that outcome, than did any other group or the control group.

Hindsight bias matters in legal cases, where a judge or jury must determine whether a defendant was legally negligent in failing to foresee a hazard (Sanchirico 2003). In an experiment based on an actual legal case, Kamin and Rachlinski (1995) asked two groups to estimate the probability of flood damage caused by blockage of a city-owned drawbridge. The control group was told only the background information known to the city when it decided not to hire a bridge watcher. The experimental group was given this information, plus the fact that a flood had actually occurred. Instructions stated the city was negligent if the foreseeable probability of flooding was greater than 10%. 76% of the control group concluded the flood was so unlikely that no precautions were necessary; 57% of the experimental group concluded the flood was so likely that failure to take precautions was legally negligent. A third experimental group was told the outcome and also explicitly instructed to avoid hindsight bias, which made no difference: 56% concluded the city was legally negligent.

Viewing history through the lens of hindsight, we vastly underestimate the cost of effective safety precautions.  In 1986, the Challenger exploded for reasons traced to an O-ring losing flexibility at low temperature.  There were warning signs of a problem with the O-rings.  But preventing the Challenger disaster would have required, not attending to the problem with the O-rings, but attending to every warning sign which seemed as severe as the O-ring problem, without benefit of hindsight.  It could have been done, but it would have required a general policy much more expensive than just fixing the O-rings.

Shortly after September 11th 2001, I thought to myself, "and now someone will turn up minor intelligence warnings of something-or-other, and then the hindsight will begin."  Yes, I'm sure they had some minor warnings of an al Qaeda plot, but they probably also had minor warnings of mafia activity, nuclear material for sale, and an invasion from Mars.

Because we don't see the cost of a general policy, we learn overly specific lessons.  After September 11th, the FAA prohibited box-cutters on airplanes—as if the problem had been the failure to take this particular "obvious" precaution.  We don't learn the general lesson: the cost of effective caution is very high because you must attend to problems that are not as obvious now as past problems seem in hindsight.

The test of a model is how much probability it assigns to the observed outcome.  Hindsight bias systematically distorts this test; we think our model assigned much more probability than it actually did.  Instructing the jury doesn't help.  You have to write down your predictions in advance.  Or as Fischhoff (1982) put it:

When we attempt to understand past events, we implicitly test the hypotheses or rules we use both to interpret and to anticipate the world around us. If, in hindsight, we systematically underestimate the surprises that the past held and holds for us, we are subjecting those hypotheses to inordinately weak tests and, presumably, finding little reason to change them.
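To make the "write down your predictions in advance" advice concrete: the test described above amounts to scoring a forecast by the probability it assigned to the outcome that actually occurred. Below is a minimal sketch in Python, with entirely hypothetical numbers in the style of the Gurkha/British question, contrasting a forecast recorded in advance with the inflated probabilities a forecaster might later remember assigning once the outcome is known.

import math

# Hypothetical probabilities written down *before* the outcome is known,
# for the four outcomes in a Fischhoff-Beyth-style question.
advance_forecast = {
    "British victory": 0.30,
    "Gurkha victory": 0.20,
    "stalemate with peace settlement": 0.25,
    "stalemate with no peace settlement": 0.25,
}

# Hypothetical probabilities the same forecaster *remembers* assigning after
# learning the outcome -- inflated for the outcome that actually occurred.
hindsight_recollection = {
    "British victory": 0.55,
    "Gurkha victory": 0.10,
    "stalemate with peace settlement": 0.20,
    "stalemate with no peace settlement": 0.15,
}

def log_score(forecast, observed_outcome):
    # Log-probability the forecast assigned to the observed outcome.
    # Higher (closer to zero) is better.
    return math.log(forecast[observed_outcome])

observed = "British victory"  # hypothetical observed outcome

print("score of the written-down forecast:", round(log_score(advance_forecast, observed), 3))
print("score the forecaster remembers:    ", round(log_score(hindsight_recollection, observed), 3))

Only the first number is a legitimate test of the model; the second is what hindsight bias substitutes for it, which is why the written record matters.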

 

Part of the sequence Mysterious Answers to Mysterious Questions

Next post: "Hindsight Devalues Science"

Previous post: "Conservation of Expected Evidence"


Fischhoff, B. 1982. For those condemned to study the past: Heuristics and biases in hindsight. In Kahneman et al. 1982: 332–351.

Fischhoff, B., and Beyth, R. 1975. I knew it would happen: Remembered probabilities of once-future things. Organizational Behavior and Human Performance, 13: 1-16.

Kamin, K. and Rachlinski, J. 1995. Ex Post ≠ Ex Ante: Determining Liability in Hindsight. Law and Human Behavior, 19(1): 89-104.

Sanchirico, C. W. 2003. Finding Error. Michigan State Law Review, 1189.

Comments (22)

Comment author: Robin_Hanson2 16 August 2007 10:45:17PM 3 points

So the obvious solution is to write down forecasts in advance. And of course in the particular cases where hindsight bias is larger, this will produce a large benefit. But some might worry about hindsight bias in recommending advance forecasts, as it is not so easy to tell ahead of time which situations will have the worst hindsight bias. How can we get an unbiased estimate of the value of overcoming hindsight bias with advance forecasts?

Comment author: Nathan_Iver_O'Sullivan 17 August 2007 12:41:18AM 0 points

Chapter 11 of the 9/11 commission's report, available here, shows the commission was very wary of hindsight bias. The failure to prevent the attacks is said to represent a "failure of imagination," meaning the intelligence community used the wrong model in evaluating terrorist threats.

Comment author: bigjeff5 29 January 2011 03:13:22AM 6 points

If you note the study in the article, 56% of those told about the flood and warned to avoid hindsight bias stated the city was negligent, compared to 57% of those told about the flood but not warned about hindsight bias.

76% of the control group, without the benefit of hindsight, concluded the chances of failure were so remote the city could not be held negligent.

Just being aware that you have a potential hindsight bias is clearly meaningless if you have no method for removing the bias.

That said, the "failure of the imagination" sounds reasonable, but it's about as useful as my horoscope. I.e. it's not.

Comment author: Aaron_Luchko 17 August 2007 01:24:50AM 1 point

This made me think of a specific instance of hindsight bias that always annoys me. Consider any game of chance where at some point the person is given the choice of whether to make a wager or not.

Once they see how the wager would have turned out, it is almost guaranteed that if the wager would have won, they'll say making it was the right decision, and if it would have lost, vice versa. This holds even if they were already aware of the odds beforehand.

Comment author: pdf23ds 17 August 2007 02:10:57AM 2 points

Eliezer, I'm curious as to what you think of Feynman's take on the Challenger disaster. Do you think he succumbed to hindsight bias in his judgments or recommendations?

Comment author: Eliezer_Yudkowsky 17 August 2007 02:24:18AM 6 points

It appears to me that Feynman did his best to talk about a general policy that would have been required to prevent all problems of the same level as seen without benefit of hindsight, rather than saying "Why didn't you fix the O-Rings, you idiots?" or setting up a Low Temperature Testing Board.

Comment author: Andrew2 17 August 2007 03:58:47AM 0 points

Eliezer,

You write: "I'm sure they had some minor warnings of an al Qaeda plot, but they probably also had minor warnings of mafia activity, nuclear material for sale, and an invasion from Mars." I doubt they had credible warnings about an invasion from Mars. But, yeah, I'd like the FBI etc. to do their best to stop al Qaeda plots, Mafia activity, and nuclear material for sale. I wonder if you're succumbing to a "bias-correction bias" where, because something _could_ be explainable by a bias, you assume it _is_. Groups of people do make mistakes, some of which could have been anticipated with better organization and planning. I have essentially no knowledge of the U.S. intelligence system, but I wouldn't let them off the hook just because a criticism could be simply hindsight bias. Sometimes hindsight is valid, right?

Comment author: Eliezer_Yudkowsky 17 August 2007 04:11:14AM 5 points

The notion being that following up on all warnings of equal then-apparent severity, without benefit of hindsight, would have been a prohibitively expensive general policy. Especially since you would not have any information about "terrorism" being the pet problem of the '00s, rather than, say, an unpaid military officer launching a Russian ICBM, runaway greenhouse warming, a home biologist making superviruses, asteroids, unFriendly AI, etc.

It's all very well to talk about mistakes that could have been anticipated, yet somehow, they don't seem to be anticipated.

Comment author: David_J._Balan 23 August 2007 05:23:02PM 0 points

Of course it's always hard to know what truth is in situations like this, but there appears to be evidence that the people who were actually in charge of preventing terrorism were actively worried about something much like what actually happened, and were ignored by their superiors.

Comment author: Eliezer_Yudkowsky 23 August 2007 05:34:54PM 3 points

David, which other 50 things were they actively worried about?

As Fischhoff (1982, above) writes:

Any propensity to look no further is encouraged by the norm of reporting history as a good story, with all the relevant details neatly accounted for and the uncertainty surrounding the event prior to its consummation summarily buried, along with any confusion the author may have felt (Gallie, 1964; Nowell-Smith, 1970). Just one of the secrets to doing this is revealed by Tawney (1961): "Historians give an appearance of inevitability to an existing order by dragging into prominence the forces which have triumphed and thrusting into the background those which they have swallowed up" (p. 177).

Gallie, W. B. Philosophy and the historical understanding. London: Chatto & Windus, 1964.

Nowell-Smith, P. H. Historical explanation. In H. E. Keifer & M. K. Munitz (Eds.), Mind, science and history. Albany, N. Y.: State University of New York Press, 1970.

Tawney, R. H. The agrarian problems in the sixteenth century. New York: Franklin, 1961.

Comment author: Sebastian_Hagen 07 December 2009 08:52:46PM 3 points

Fischhoff and Beyth (1975) presented students with historical accounts of unfamiliar incidents, such as a conflict between the Gurkhas and the British in 1814.

...

Fischhoff, B., and Beyth, R. 1975. I knew it would happen: Remembered probabilities of once-future things. Organizational Behavior and Human Performance, 13: 1-16.

I originally came across the same citation in Cognitive biases potentially affecting judgment of global risks. It refers to this paper, correct? Title, authors, publication and date appear to match.

I've looked at that PDF, and I don't see where the paper talks about an experiment with questions regarding a British-Gurkha conflict. The PDF is searchable. There's no full-text search matches for "Gurkha" or "British". "students" yields matches on exactly one page, and that's about an experiment using a different set of questions. I haven't read the entire thing in any depth, so I may have missed a description of the British/Gurkha study. If so, where in the paper is it?

Comment author: Blueberry 07 December 2009 10:43:05PM 0 points

This looks like it might be helpful: http://sds.hss.cmu.edu/media/pdfs/fischhoff/HindsightEarlyHistory.pdf

Looks like that particular experiment was discussed in a different paper.

Comment author: ciphergoth 04 March 2010 01:36:06PM 2 points

Hindsight ≠ foresight: the effect of outcome knowledge on judgment under uncertainty

B Fischhoff

Correspondence to:
 B Fischhoff, Hebrew University of Jerusalem, Israel

Abstract

One major difference between historical and nonhistorical judgment is that the historical judge typically knows how things turned out. In Experiment 1, receipt of such outcome knowledge was found to increase the postdicted likelihood of reported events and change the perceived relevance of event descriptive data, regardless of the likelihood of the outcome and the truth of the report. Judges were, however, largely unaware of the effect that outcome knowledge had on their perceptions. As a result, they overestimated what they would have known without outcome knowledge (Experiment 2), as well as what others (Experiment 3) actually did know without outcome knowledge. It is argued that this lack of awareness can seriously restrict one’s ability to judge or learn from the past.

http://qshc.bmj.com/content/12/4/304.abstract

Comment author: simplicio 17 July 2010 05:36:32AM 21 points

Hindsight: a fable

A lark and a wren, perched on the top of a tall tree, were conversing once about the dangers of cuckoldry.

Said the lark, “My sister was fooled by a cuckoo only last year; in her nest were three eggs, one unlike the others. That vile chick ate all the food that she could supply, until it was ready to burst from gluttony.”

“What a fool is your sister!” said the wren. “One egg was not like the others. The deception is surely obvious. I should not have made such a mistake.”

A cuckoo, overhearing, sped fast away to the wren’s nest, where she found three small eggs. Pushing two over the side, she laid her own pair of eggs next to the wren’s remaining one.

Returning, and thinking herself wise, the wren pushed her one egg out of the nest and settled down to warm the remaining two.

Comment author: albert 06 November 2011 05:47:21PM -1 points

In the first example of this article (Gurkha vs. British prediction), doesn't having the data of the outcome change your ex-ante estimate of what the probability was? Since it's a data point you now have and you can't erase it from your mind, it's rational to update your estimates, no? The bias in my mind would be if you OVERLY adjust your probability distribution based on the outcome.

Comment author: JoshuaZ 06 November 2011 06:04:43PM 1 point

No. That's exactly the problem. Updating after the fact for what might be likely in that sort of situation is ok. The problem as discussed in the article is that people are then convinced that it really should have been obvious to someone without that data point.

Comment author: listo 08 January 2012 09:25:46AM 11 points

This post didn't say anything new to us. We knew it all along.

Comment author: Vivi 31 March 2012 04:39:05PM 2 points

A small error in this sentence:

A third experimental group was told the outcome andalso explicitly instructed to avoid hindsight bias, which made no difference

The conjunction should be "and also".

Comment author: Tenoke 13 April 2012 08:01:01AM 0 points

I am very interested in what would've happened if there was a 4th experimental group (or a new experiment) which is told the outcome, told to avoid hindsight bias, and told that in previous experiments being told to avoid hindsight bias did nothing to actually reduce the effect of hindsight bias.

Comment author: TraderJoe 02 May 2012 12:03:01PM 4 points

I was taught in my history classes at school that WWI was known to be coming, that Europe was divided into two camps and that it was just a matter of time until someone lit the fuse, etc. In fact, I grew up believing that everyone in Europe knew this at the time.

More recently, I read Taleb's summary of Niall Ferguson's study on bond prices, which showed that Europe's bond markets did not assign a high probability to the chance of war. So investors, at least, did not predict a war was coming.

NB I haven't read the full study [55 pages], only a summary.

Comment author: AnthonyC 14 December 2013 01:41:44PM 2 points

"I was taught in my history classes at school that WWI was known to be coming, that Europe was divided into two camps and that it was just a matter of time until someone lit the fuse, etc."

I was taught that as well.

"In fact, I grew up believing that everyone in Europe knew this at the time." In contrast, I was taught that one big problem was that many of the alliances that led to escalation from local to global conflict were secret. Each nation in turn declared war to protect its allies, which drew opposition allies into the war, etc., but no nation in advance had a complete accounting of who had made what military promises to whom. Governments or wealthy, well-connected citizens might have suspected, but not in great detail and it was probably not something they dwelt on - after all those treaties were for mutual defense, how could they be a bad thing (my speculation on one possible pre-hindsight view)?

"

Comment author: beoShaffer 04 June 2012 05:08:18PM 0 points

I've seen several references to a later study by Fischhoff called "Perceived informativeness of facts" which supposedly found that asking participants for arguments in support of the result that didn't happen can reduce hindsight bias. Unfortunately, since I can't access the original article I don't know how practical the effect is. Similarly, "Considering the Opposite: A Corrective Strategy for Social Judgment" indicates that asking people to consider how they would interpret different results found using the same methodology leads to less biased interpretations of studies. It also suggests that explaining the details of how biased cognition happens is a more effective intervention than just telling people about the end result.