Followup to: Unteachable Excellence

As previously observed, extraordinary successes tend to be considered extraordinary precisely because they are hard to teach (relative to the then-current level of understanding and systematization). On the other hand, famous failures are much more likely to contain lessons on what to avoid next time.

Books about epic screwups have constituted some of my more enlightening reading.  Do you have any such books to recommend?

Please break up multiple recommendations into multiple comments, one book per comment, so they can be voted on and discussed separately.  And please say at least a little about the book's subject and what sort of lesson you learned from it.

[anonymous]

What Went Wrong?: Case Histories Of Process Plant Disasters, Fourth Edition by Trevor Kletz

Subject: Disasters at (petro)chemical plants. One example:

"The most famous of all temporary modifications is the temporary pipe installed in the Nypro Factory at Flixborough, UK, in 1974. It failed two months later, causing the release of about 50 tons of hot cyclohexane. The cyclohexane mixed with the air and exploded, killing 28 people and destroying the plant.... No professionally qualified engineer was in the plant at the time the temporary pipe was built. The men who designed and built it (design is hardly the word because the only drawing was a full-scale sketch in chalk on the workshop floor) did not know how to design large pipes required to operate at high temperatures (150 degrees C) and gauge pressures (150 psi or 10 bar). Very few engineers have the specialized knowledge to design highly stressed piping. But in addition, the engineers at Flixborough did not know that design by experts was necessary. They did not know what they did not know."

Lessons: "A high price has been paid for the information in this book: many persons killed and billions of dollars worth of equipment destroyed. You get this information for the price of the book. It will be the best bargain you have ever recieved if you use the information to prevent similar incidents at your plant."

There are numerous lessons here, even for people who aren't (petro)chemical engineers - I'm a C++ programmer. Guidance like "It is no use telling the operator to be more careful. We have to recognize that the possibility of an error - forgetting to open the valve - is inherent in the work situation. If we want to prevent an error, we must change the work situation. That is, change the design and/or the method of operation - the hardware and/or the software." is highly portable (just replace "open the valve" with "free the pointer" and there you go).
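To make the analogy concrete, here is a minimal C++ sketch (my illustration, not Kletz's): the fix is not exhorting the programmer to be more careful, but changing the design so the error is no longer possible.

```cpp
#include <memory>
#include <string>

// Error-prone design: correctness depends on the operator remembering a step.
void process_manual() {
    std::string* buf = new std::string("data");
    // ... work with *buf ...
    delete buf;  // forgetting this line is "forgetting to open the valve"
}

// Changed design: the work situation no longer permits the error.
void process_safe() {
    auto buf = std::make_unique<std::string>("data");
    // ... work with *buf ...
}  // buf is freed automatically; there is no step to forget
```

Same principle as Kletz's: change the design and/or the method of operation, not the operator.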

Still Going Wrong!: Case Histories Of Process Plant Disasters And How They Could Have Been Avoided is its sequel.

I created an account just to vote this one up.

A memorable anecdote concerned removing a dent from a storage tank. It had been filled with a warm vapor and accidentally sealed; when the vapor cooled and contracted, the pressure inside dropped and the outside air pushed in a predictable (but slight) dent.

The plant engineer had to stop workers from attaching high pressure pumps to the tank, since that would have resulted in a burst tank. Instead, he attached a couple of feet of garden hose to a high point on the tank, and dribbled in water, which pushed out the dent.

Because, you know, hydraulics.

(At least, it blew my mind, man)
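For anyone who wants the arithmetic behind the trick (my numbers - assume "a couple of feet" means roughly 0.6 m of water above the tank top): the water column fixes the gauge pressure inside at

p = ρgh ≈ 1000 kg/m³ × 9.8 m/s² × 0.6 m ≈ 6 kPa ≈ 0.9 psi

That is gentle - atmospheric storage tanks are typically rated for pressures of only this order - but spread over square meters of tank wall it amounts to tonnes of force, plenty to ease the dent out. And unlike a pump, it is self-limiting: no matter how long you dribble, the pressure can never exceed the head set by the height of the hose.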

The Guns of August by Barbara Tuchman.

Subject: World War I.

Lessons: In the era before computers, you couldn't halt a war once the order was given because there were no pre-made logistical plans for halting a mobilization. The Germans over-tinkered with their battle plan and weakened it - if they'd stuck with their original plan they might have conquered France. Epic amounts of stupidity everywhere - it might have been the greatest parade of multisided folly in all human history.

Epic amounts of stupidity everywhere - it might have been the greatest parade of multisided folly in all human history.

It was an epic event. Did it have any higher a stupidity quotient than anything else at the time?

I suspect yes, given the actors and the stakes; but at any rate it may be one of the best-chronicled displays of multisided folly in human history.

Alan

"What I Learning Losing a Million Dollars" by Jim Paul and Brendan Moynihan (1994)

Subject: Analysis of catastrophic trading mistakes woven through the autobiography of a highly confident commodities trader. He made $250k in one day. Thereafter he went on to take greater risks in commodities markets, counting his profits before they were realized. At one point he considered renting a Concorde to celebrate his imagined gains. Over the course of several months, however, due to a combination of misfortune, hubris, denial, and creative rationalization, his entire position was wiped out and he lost over $1 million.

This is the only non-quack popular book of its kind I know of that deconstructs the psychological mistakes a trader can make. The book is self-published and, until recently, was out of print. Sadly, one of the authors, Mr. Paul, perished in the attacks on the WTC. Though not widely known, the book is fortunately available once again.

The lessons it contains apply not just to trading but to life in general. It is honest and unflinching in its analysis of self-deception and overconfidence, and of how to guard against them. This book is unique.

The Black Swan by Nassim Nicholas Taleb.

Subject: The unexpected, and how people try - and frequently fail spectacularly - to predict and deal with it.

Example quote (from 2007): Globalization creates interlocking fragility, while reducing volatility and giving the appearance of stability. In other words it creates devastating Black Swans. We have never lived before under the threat of a global collapse. Financial Institutions have been merging into a smaller number of very large banks. Almost all banks are interrelated. So the financial ecology is swelling into gigantic, incestuous, bureaucratic banks – when one fails, they all fall. The increased concentration among banks seems to have the effect of making financial crisis less likely, but when they happen they are more global in scale and hit us very hard. We have moved from a diversified ecology of small banks, with varied lending policies, to a more homogeneous framework of firms that all resemble one another. True, we now have fewer failures, but when they occur ... I shiver at the thought.

Flawed, but well worth reading.

The Smartest Guys in the Room by McLean and Elkind.

Subject: Enron.

Lessons: Hiding from reality becomes an ever-downward-spiraling habit as you have to hide the results of failures from last time. Incentive structures are everything. Selecting for verbal smartness and sparkle can just get you spectacularly complicated, charming failure. Complexity is used to hide the truth from both outsiders and yourself.

Incentive structures are everything.

This has not been given enough attention in our current financial crisis. People were given huge bonuses if their big risks paid off, but not given huge penalties if they tanked. So it was rational for them to make investments that had negative expected return for the people whose money it actually was.

This is important because pundits say that our current crisis proves the markets need more regulation, because relying on self-interest failed. Actually, self-interest succeeded. The markets need more intelligent regulation, not merely more regulation. People failed to foresee the collapse (insofar as they failed at all) because humans are highly skilled at hiding the truth from themselves when properly incentivized to do so.
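A toy illustration with invented numbers: suppose a trade pays the fund +$10M or -$30M with equal odds, for an expected value of -$10M to the investors. If the trader pockets a $1M bonus on the win and simply $0 on the loss, his personal expected value is 0.5 × $1M = +$500k. The bet destroys value for the principal and is perfectly rational for the agent.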

matt

Incentive structures are everything. What incentives do the regulators face?

I don't buy this account of Enron, which has become the standard fable. There was a lot there that was more like straight-up fraud than smart people overcomplicating things and missing the down-home common sense.

Edit: I agree with the "hiding from reality = downward spiral" part strongly.

The book does represent that in the endgame it was spectacularly complicated, charming fraud. But they do seem to have slid into fraud more than having started with that directly in mind - at least so the book alleges. Complication eases the path into fraud, and other downward spirals.

I thought that the standard fable focused on the straight-up fraud of the endgame.

I focus on the incentives to the sales staff. As I understand it, their exit was well-timed, so the situation wasn't that hard to unravel. But this leaves the question of whether the guys at the top understood what was going on, let alone intended it.

The Logic of Failure, by Dietrich Doerner.

Summary: Doerner draws on historical and laboratory data to illustrate generic features of many/most (I'd say "all") failures of judgment in complex situations. He offers suggestions on how to overcome our predisposition to failure in these situations.

Lessons: This book is a treasure trove. It ranges broadly -- complexity, goals, models, time, ignorance, planning, and more. I can't emphasize enough how illuminating (and readable!) this book is.

Here are some quotes from my notes on the first half of the book:

"When we fail to solve a problem, we fail because we tend to make a small mistake here, a small mistake there, and these mistakes add up." (p. 7)

"...it is far from clear whether "good intentions plus stupidity" versus "evil intentions plus intelligence" have wrought more harm in the world." (p. 8)

"If, the moment something goes wrong, we no longer hold ourselves responsible but push the blame onto others, we guarantee that we remain ignorant of the real reasons for poor decisions, namely inadequate plans and failure to anticipate the consequences." (p. 27)

"We find an inability to think in terms of nonlinear networks of causation rather than chains of causation -- an inability, that is, to properly assess the side effects and repercussions of one's behavior. We find an inadequate understanding of exponential development, an inability to see that a process that develops exponentially will, once it has begun, race to its conclusion with incredible speed. These are all mistakes of cognition." (p. 33)

"[Characteristics of analysis of complicated systems:] complexity, intransparence, internal dynamics" (p. 37)

"...the ability to make allowances for incomplete and incorrect information and hypotheses is an important requirement for dealing with complex situations." (p. 42)

"Formless collections of data about random aspects of a situation merely add to the situation's impenetrability and are no aid to decision making." (p. 44)

"...goals may be: positive or negative, general or specific, clear or unclear, simple or multiple, implicit or explicit" (p. 52)

"By labeling a bundle of problems with a single conceptual label, we make dealing with that problem easier -- provided we're not interested in solving it. Phrases like "urgently needed measures for combating unemployment" roll easily off the tongue if we don't have to do anything about unemployment." (p. 55)

"There are many ways of tackling multiple problems at once, but the one thing we usually cannot do is solve all the problems at once." (p. 55)

"Goals conflict with one another not by their very nature but because the variables relating to them in the system are negatively linked." (p. 57)

"[The problems of DDT (e.g.)] So the mistake is less not knowing than not wanting to know. And not wanting to know is a result not of ill will or egoism but of thinking that focuses on an immediately acute problem." (p. 58)

The Rise and Fall of the Third Reich by William L. Shirer

Subject: World War II.

Lessons: One lunatic in the wrong place at the wrong time really can change the whole course of history. Hitler was in many ways incredibly stupid, but got as far as he did because other players (e.g. Stalin) expected him to be rational and therefore not attack when he was bound to lose eventually. WW2 shaped later reactions to things like Korea and Vietnam: the Allies hadn't stopped Hitler early, so next time they were determined to start fighting as early as possible.

expected him to be rational and therefore not attack when he was bound to lose eventually

But that could have been his strategy...

when he was bound to lose eventually

Judging by Churchill's writings, the "lose eventually" part was far from sure at certain times in the war, though you might be talking about very specific episodes. I also came away from Churchill with the impression that Hitler consciously acted contrary to "rational" expectations. Though I cannot support it with a quote, I believe that Churchill was (rightly) credited with early insights about Hitler's danger because he picked up on this strategy and was able to see that it would work on the civilized Europeans.

Richard Feynman's experiences investigating the Space Shuttle Challenger explosion are very, very good reading: http://en.wikipedia.org/wiki/What_Do_You_Care_What_Other_People_Think%3F

http://www.feynman.com/ has additional information, including his final official report on the matter.

To Engineer Is Human: The Role of Failure in Successful Design, by Henry Petroski

Summary: Petroski takes us through a bunch of conspicuous engineering failures throughout history and describes the technical and sociological solutions that followed.

Lessons: Engineering history runs through alternating cycles: periods of innovation, new ideas/materials, and speed result in disasters (the Tacoma Narrows bridge), followed by periods of conservatism and overbuilding (the Forth Bridge).

I think the best cautionary tale in this book is the 1981 collapse of the skywalk in the Hyatt Regency in Kansas City, from what was thought to be an extremely minor design change. In any event, the book addresses lots of failures and is well written and readable.

sgr

A classic book on failures is Levy & Salvadori's Why Buildings Fall Down: How Structures Fail (ca 1992).

They review a variety of bridges, buildings, dams and other objects with exciting failure modes. Remarkably, they manage to be respectful of the regrettable loss of life while also being kind of funny. For example, the classic film of the failure of the Tacoma Narrows bridge shown to physics undergrads is hilarious when you watch Professor Whatsisname walk down the nodal line; it's somewhat less funny when you realize the degree of danger to his life.

An architect friend once claimed to me that under Hammurabic law, if a building fell down and killed somebody, the architect was killed too -- and that this led to modern architecture firms being partnerships instead of corporations, with personal liability for the architect when he puts his seal on plans. Another friend, whose credibility I know less about, asserted that in Roman times when building an arch the engineer was required to stand underneath it as the construction scaffolding and trusses were removed.

I can't verify the stories, and don't approve of the brutality involved in any case. But a high degree of personal involvement with the consequences of failure does perhaps inspire some degree of meticulousness, and perhaps solicitation of peer review. ("Hey, Fred! I gotta stand under this bridge. Does it look right to you?")

This "put your quality control where your mouth is" approach seems to be quite common in history. I remember reading somewhere (and take this with the level of credibility deserved of sentences beginning with "I remember reading somewhere"), that in the English Civil War all breastplates had a 'proof mark', which was the dent made by firing a pistol shot at the armour at close range, while it was worn by the armourer.

Edit: This may well be the origin of the term 'bullet-proof'

That's true. From the Code:

If a builder builds a house for someone, and does not construct it properly and the house falls in and kills its owner, then that builder shall be put to death. If it ruins goods, he shall make compensation for them, and shall re-erect the house at his own expense.

Psychology of Intelligence Analysis, by Richards Heuer

Summary: Intelligence analysts cope with the need to aggregate heterogeneous data of wildly varying quality into coherent decision-making frameworks. This book catalogues historical failures made in this sort of analytical context and suggests how to overcome common traps.

Lessons: This is a great guide from a sharp practitioner. Analysts wanting to draw conclusions from data should generate a wide range of hypotheses and use falsification to narrow down to a smaller range of options. Such an approach avoids the trap of establishing a single theory early in the analytical process and only paying attention to evidence that confirms it.
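Heuer's signature method, the Analysis of Competing Hypotheses, boils down to a matrix exercise: score each hypothesis by the evidence that contradicts it, not the evidence that supports it. Here is a toy C++ sketch of the scoring step, with invented hypotheses and an invented matrix:

```cpp
#include <array>
#include <iostream>
#include <string>

int main() {
    // Three competing hypotheses and four items of evidence (all invented).
    const std::array<std::string, 3> hypotheses = {"H1", "H2", "H3"};
    // inconsistent[e][h] is true when evidence item e contradicts hypothesis h.
    const bool inconsistent[4][3] = {
        {false, true,  false},
        {false, false, true },
        {true,  false, true },
        {false, true,  false},
    };
    // The tentative winner is the LEAST-contradicted hypothesis, scored only
    // after all the evidence is in - no early favorite to confirm.
    for (std::size_t h = 0; h < hypotheses.size(); ++h) {
        int contradictions = 0;
        for (const auto& row : inconsistent)
            if (row[h]) ++contradictions;
        std::cout << hypotheses[h] << ": " << contradictions
                  << " inconsistent item(s)\n";  // prints H1: 1, H2: 2, H3: 2
    }
    return 0;
}
```

The real method keeps the matrix in a table rather than code, but the discipline is the same: the hypothesis you fail to refute, not the one you set out to prove, is the one you keep.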

Note: I spoke with the author a few months ago, and he said a dramatically revised edition was in the works. I'm drawing heavily on this book for two chapters of my forthcoming book, Head First Data Analysis (O'Reilly Media).

Here's a free PDF of the whole thing: https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/psychology-of-intelligence-analysis/index.html

Since we are recommending books, would it make sense to use Amazon referral links? There must be some *cough* nonprofit we would like to benefit from the referral fees. And I imagine it would not be too hard for the site to automatically add a referral code to Amazon links submitted in comments.

Tall Ships Down

Subject: the final voyages and sinkings of five large sailing vessels.

Big lessons:

1) If you have a weakness (personally, in your competence or temperament, or structurally, in your vessel or equipment) and spend enough time in an unpredictable environment, it will eventually be exploited.

2) The fact that an environment is unpredictable does not relieve you from the responsibility of considering risks and working to minimize them.

3) We often have an inkling about our weaknesses, but if we've gotten by so far without major incident, we see no pressing need to address them.

4) If you're the captain/leader of an operation, know what the hell you're doing. If you're the equivalent of an ordinary seaman, make it a priority to become competent enough to identify a leader who doesn't know what the hell s/he's doing.

Not to focus exclusively on markets, and it's not a book, but this page of a Michael Lewis piece from Portfolio.com is a crucial read (and short!):

http://www.portfolio.com/news-markets/national-news/portfolio/2008/11/11/The-End-of-Wall-Streets-Boom?page=9#page=9

I was shocked to learn that when the big Wall Street firms all became public companies, which is arguably the most important precursor event to the current unpleasantness, many people knew this was a terrible idea!

There are at least two parts to the failure message: think carefully about the public company structure and how it changes your institutions' incentives; and listen to the wise.

The rest of the piece is entertaining but not on point to the current discussion and not a must-read.

many people knew this was a terrible idea!

I'm concerned about hindsight. Lewis only says that the old guard objected, but he doesn't make it entirely clear why; "moral disapproval" doesn't suggest consequentialism.

Lewis certainly does suffer from hindsight. The article claims that he was "waiting for the end of Wall Street," while his book says "I didn't think Wall Street would collapse."

Legacy of Ashes: The History of the CIA by Tim Weiner

Summary: A history of the CIA from its origins in US military intelligence units up to right after 9/11. Based only on primary sources (released CIA documents, interviews with insiders, etc.; no rumors or unconfirmed theories), Weiner shows how the CIA has routinely failed to meet its goals and has often made bad situations worse. All of this despite filling the organization with intelligent men ready to die to defend their country by providing military intelligence to the president and engaging in covert operations.

Lessons: There are many small lessons in the individual episodes, but the big lesson is this: when an organization has a culture of secrecy so deep that not even the people in charge with the highest clearance have access to all the secrets, you open the door to mismanagement and abuse. Because the CIA was designed to be opaque, even to the president, it had no accountability or oversight, became inbred with bad ideas, and began to operate independently of US policy, receiving money from Congress while hiding its operations even from units within itself. The CIA is a case study of the tremendous damage that a few men with power, money, and secrecy can do, where only luck prevented what could have been disastrous consequences.

A Demon Of Our Own Design by Richard Bookstaber

This book came out two years ago but reads like it was just written.

The lesson is similar to that of, e.g., When Genius Failed, but this book gets into the grit a bit more and offers more detailed insight into markets. Bookstaber has worked in markets rather than just writing about them, and it shows.

Dull in spots, but that's fairly standard in books like this, since you can't sell a 130-page book.

gjm

When Genius Failed: The Rise and Fall of Long-Term Capital Management by Roger Lowenstein.

Subject: How a hedge fund with two Nobel-calibre alleged experts in the mathematical modelling of asset pricing won big in the short term and imploded disastrously in the slightly longer term.

Lessons: Sophisticated but semi-detached mathematical models can lead to overconfidence and get you absolutely destroyed. (This applies no matter how clever they are.) A strategy that notionally wins on average, in the long term, may still lead to disaster in the shorter term. Markets are anti-inductive.
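A toy numeric example of that middle point (my numbers, not Lowenstein's): a strategy returning +2% in 49 months out of 50 and -5% in the 50th has a healthy positive expected return. Run it at 25:1 leverage - roughly LTCM territory before the crisis - and the -5% month becomes a -125% month: ruin, no matter what the long-run average says.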

See also: The Smartest Guys in the Room, recommended by Eliezer.

Missing word or link: "Markets are ???"

gjm

Whoa, thanks. I missed out the aitch tee tee pee colon slash slash from the URL part of the link, and the result is that the whole thing didn't show at all.

Collapse: How Societies Choose to Fail or Succeed by Jared Diamond

Subject: What causes civilizations to collapse, and how to prevent it.

Lessons: Collapse is usually caused by a civilization exhausting its natural resources. Examples include Easter Island, Norse Greenland, and modern Rwanda. There's more, but it's my father's book, and not mine.

Roko

I didn't like this book. The whole book focuses on societies that are obviously very different from our own - orders of magnitude smaller, and having no scientific understanding of their world. Then it attempts to carry lessons over.

I remember liking the history in the book, but I think you have a point there. However, remember that we're not all so lucky as to live in a society that is so different from the ones described, so I think this book has real currency for some people (just probably not the ones who can get a copy to read).

His other big book, Guns, Germs, and Steel, teaches a related lesson about failure: if you aren't lucky enough to have the necessary resources, no matter how hard you try, your success will be limited.

The March of Folly: From Troy to Vietnam by Barbara Tuchman.

What Made Gertie Gallop? : Learning From Project Failures by O. P. Kharbanda and Jeffrey K. Pinto

The projects reviewed here are old enough to have been analyzed thoroughly, so a fairly complete understanding of them is possible. Their mega-scale makes them less than directly applicable for most readers, but it also makes for completeness in their management (smaller projects frequently skimp on formal management) and for more thorough documentation, both of which make for better analysis.

The failures are well illustrated by the projects chosen, and the writing does not get in the way of the analyses. The book is very clearly written; the individual project analyses can almost be read like short stories, but with the added benefit of being factual. All sorts of failure modes are discussed: failures of structural understanding, failures of planning, of political support, of lack of market demand, and more.

jhl

The Best and the Brightest (1972) - David Halberstam

Amazon, Wikipedia

from Wikipedia: The Best and the Brightest (1972) is an account by journalist David Halberstam of the origins of the Vietnam War. The focus of the book is on the foreign policy crafted by the academics and intellectuals who were in John F. Kennedy's administration, and the consequences of those policies in Vietnam.


One distinction made by the book is between acting based on reasoning and acting based on principles. The men acting based on reasoning got good results, but in times of high stress or confusion they made bad decisions.

wms

Albert Speer's autobiography, Inside the Third Reich.

Under Hitler's guidance, Speer pioneered the "ruin value" ideology in architecture - thus in a way focusing his trade on the question of how to fail with magnificence.

His attempt to build, in a Berlin of the distant future, an exponential replication of the ruins of the Roman Empire was either never actualized or was bulldozed by the Allies. Today all that remains of his attempt to harness failure is a row of lampposts on a side street in Berlin.

Judgment Day by Nathaniel Branden. (I think this is the same book as My Years With Ayn Rand.)

Hero becomes Ayn Rand's closest confidant, co-builder of Objectivism, lover; gets drummed out.

Message: every cause wants to be a cult; the enormous power of personal charisma; an excellent antidote for the recalcitrant Objectivist in your life.

Inventing Money, by Nicholas Dunbar

Subject: The failure of Long Term Capital Management. See also gjm's recommendation of Lowenstein, but Dunbar supplies a briefer and more financially sophisticated treatment.

Lessons: The danger of leverage, which is one potentially very expensive form of overconfidence. The world contains more risks than anyone, no matter how clever, can think of to hedge against. LTCM, contrary to popular belief, did in fact hedge against Russia defaulting on its own bonds, through currency forward contracts. What they failed to hedge against was Russia then suspending payment on the forward contracts, and hiding its hard currency.

I disagree strongly that "the danger of leverage" is the key message of LTCM. The key message for rationalists has to do with the subtle nature of correlation; the key message for risk managers has to do with the importance of liquidity.

"The danger of leverage" also isn't much of a message. That's like saying the message of the Iraq War is "the danger of foreign policy."

Does "popular belief" hold that LTCM wasn't hedged? Anyone who believes that is very far from learning anything from the LTCM story.

Overconfidence leads one to seek leverage. Leverage in turn magnifies the consequences of errors. Leverage on LTCM's scale, which dwarfed that of its predecessors, turns errors into catastrophes.

Another, probably more important lesson concerns the degree to which one should rely on historical data. If something never has happened, or at least not recently, one cannot safely bet the house that it never will happen.

I probably shouldn't have speculated on what "popular belief" does or does not hold, but it has often been written, falsely, that LTCM was unhedged against Russia defaulting on its own bonds.

The danger I see here is being exposed on a deal with a party who has a gun to your head.

What are the best books about the fall of the Roman Empire?

This seems like one of the most important failures in history, and most important to understand.

I just started Peter Heather's book, but haven't read enough to say anything except that the writing is a bit clunky.

MrHen

Non-fiction only?

The Titanic is a famous story of failure. I don't know what books to read about it, though, or what lessons to learn from it that we don't already know.

In a very concise and abstract form: http://despair.com/mis24x30prin.html