This is a question from a workshop after the Global Catastrophic Risks conference.  The rule of the workshop was that people could be quoted, but not attributed, so I won't say who observed:

"The problem is that it's often our smartest people leading us into the disasters.  Look at Long-Term Capital Management."

To which someone else replied:

"Maybe smart people are just able to work themselves up into positions of power, so that if damage gets caused, the responsibility will often lie with someone smart."

Since we'd recently been discussing complexity, interdependence and breakdowns, the first observation that came to my own mind was the old programmers' saying:

"It takes more intelligence to debug code than to write it.  Therefore, if you write the most difficult code you can create, you are not smart enough to debug it."

(This in the context of how increased system complexity is a global risk and commons problem; but individuals have an incentive to create the "smartest" systems they can devise locally.)

There is also the standard suite of observations as to how smart people can become stupid:

  • You become overly skilled at defending beliefs you arrived at for unskilled reasons;
  • Success on comparatively easy problems in the past leads to overconfidence on more difficult problems in the future;
  • Because most of the advice offered you comes from people (apparently) not as smart as you, you become intellectually isolated;
  • You spend too much time as the smartest person in the room and can't handle it emotionally;
  • Because you enjoy looking smart, you avoid trying new and difficult things where you might not perform as well, and so become very narrow;
  • Because you enjoy looking smart, you don't like confessing your mistakes or realizing your losses;
  • Because people praise you for being good at math or something, you assume you are already wise, and you don't apply your intelligence to becoming wiser in other areas.

But I also think we should strongly consider that perhaps the "highly intelligent" sponsors of major catastrophes are not so formidable as they appear - that they are not the truly best and brightest gone wrong; only the somewhat-competent with luck or good PR.  As I earlier observed:  Calling the saga of the fall of Enron "The Smartest Guys in the Room" deserves an award for Least Appropriate Book Title.  If you want to learn what genius really is, you probably should be learning from Einstein or Leo Szilard, not from history's flashy failures...

Still, it would be foolish to discard potential warnings by saying, "They were not really smart - not as smart as me."  That's the road to ending up as another sponsor of catastrophe.


A good summary, thanks. :)

The title "The Smartest Guys In The Room" is supposed to be ironic...

Szilard may have been the wisest guy in the room, but he was no Einstein in terms of physics ability, and in a slightly different timeline where MAD broke down, his "wisdom" could easily have led him to be "the smartest guy in the room" on a much larger scale. In terms of something like general intelligence (though I'm reluctant to identify it with 'g') Einstein wasn't really that astounding either. Hands down the best physicist ever and possibly the best metaphysicist, but low-average for a theoretical physicist in terms of math ability and academic politics (hence the patent office). Without some more academically skilled geniuses like Planck noticing the quality of his work and drawing attention to it, which was by no means a sure thing, he could have died in obscurity. As an author and a political/ethical philosopher, he was surely better than most theoretical physicists, but not obviously better than many other theoretical physicists who turn their attention to those domains. A few famous people, for instance Feynman and von Neumann, do seem to have astounding general abilities, but this doesn't make their contributions to science comparable to his, while most people with astounding IQ scores don't seem very accomplished or wise at all, even after regression to the mean in both measurement and recognition is taken into account.

Of the pitfalls of intelligence that Eliezer mentions, I worry most about the second, third, fourth, and sixth as applied to him and to Robin, and about the second and fifth as applied to myself. However, far and away the greatest pitfall of intelligence seems to me to be the way in which it makes "we evolved to predict Other Minds by putting ourselves in their shoes, asking what we would do in their situations; for that which was to be predicted, was similar to the predictor" a counterproductive strategy. Worse still, this strategy is advocated as both functional and normative by all cultural institutions, and information that could be used to modify it is somewhat suppressed.

Other problems of the highly intelligent seem to include naivety, for whatever reason, possibly a result of the childlikeness that encourages the absorption of information, especially explicit information. Likewise, one tends to find overreaction to explicit social norms. Social indoctrination of all sorts is calibrated to the typical mean reaction of the population, so as to shift behavioral patterns to be more socially beneficial. However, acculturation also depends on the absorption of implicit norms, a process that is less accelerated by 'g'. As a result, explicit rules frequently override implicit ones for highly intelligent people in a manner which disfavors them, especially in social and competitive contexts (and most especially where it is important to quickly create admiration, envy, or trust). A prominent example is the economic effect of drinking. http://www.stat.columbia.edu/~cook/movabletype/archives/2006/09/drink_to_succes.html http://www.reason.org/news/alcohol_use_091406.shtml http://papers.nber.org/papers/w12529

BTW, I have significant personal experience with one of "the smartest guys in the room" and yes, they were (or at least he is) VERY smart by any normal business-world standards. He's particularly great at giving obvious-in-retrospect answers to marketing-type problems. It would be a somewhat unusual room full of business people or other social elites where the guy I'm thinking of isn't the smartest guy.

Aron:

The debugging quote is silly. Unless I missed the point, it is considerably clearer and just as instructive to say 'it's easier to write broken code than correct code; therefore, if broken code is the best you can do...'

I don't think so. The point is that the smartest people in the world built the world by being as smart as they could. Thus, those people are not equipped to fix the world they built.

You can also replace, "Because you enjoy looking" with "Because you have to look" for many high-power jobs and positions. Dominance-Submission relationships in business and politics are very important to outcome. I would guess that a lot of bad decisions are made because of the necessity of this dance at high levels... how to crush it out? Not easy. Human nature, it would seem.

michael,

Those with high intelligence do not have different goals than those with low intelligence. Their mental model of reality more reliably captures the nuanced causal complexity of the real world, and the strategies they implement in pursuit of their goals will therefore possess greater sophistication, but their medium- and long-term objectives are of the same type as the medium- and long-term ambitions of the less intelligent. Everyone is satisficing a similar set of emotional desires. I won't deny that there is a bidirectional feedback loop between higher-order cognition and those emotions, but there is a limit to the extent to which they can be altered.

Naive perspective-shifting is unhelpful and counterproductive; perspective-shifting that adjusts for cognitive ability is very helpful and instructive.

Also, I'm pretty sure 'real g' should accelerate the rate at which implicit knowledge is absorbed, but I understand your points. Good post.

Re: the debugging quote is silly

The original quote seems sensible enough:

Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it. - Brian Kernighan
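To make the quote concrete, here is a minimal, hypothetical sketch (my illustration, not something from the thread): the same function written "as cleverly as possible" and written plainly. Both happen to be correct, but only one leaves its author headroom to debug it.

```python
# A "maximally clever" one-expression flatten. It is correct, but there
# is little room left to reason about it when something goes wrong.
flatten_clever = lambda xs: [
    y for x in xs
    for y in (flatten_clever(x) if isinstance(x, list) else [x])
]

# The plain version: each step is easy to inspect in a debugger.
def flatten_plain(nested):
    """Flatten arbitrarily nested lists into one flat list."""
    flat = []
    for item in nested:
        if isinstance(item, list):
            flat.extend(flatten_plain(item))  # recurse into sublists
        else:
            flat.append(item)                 # leaf value: keep it
    return flat

assert flatten_clever([1, [2, [3, 4]], 5]) == [1, 2, 3, 4, 5]
assert flatten_plain([1, [2, [3, 4]], 5]) == [1, 2, 3, 4, 5]
```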

Tim, that quote makes sense only with a broken debugging cycle.

If you write your code bottom-up and debug it as you go, then it simply doesn't make sense to talk about writing clever code you can't debug. If you can't debug it, then it wasn't clever in the first place -- it was just something that didn't make sense.

Taking it literally, this is like saying "Writing a novel is easy. Rewriting it so it isn't completely boring worthless drivel is harder."

Still, there's a worthwhile thought in it. If you try to do something clever that you don't completely understand, it's likely to be easy to make mistakes.

Once, for a programming assignment in school, I wrote code that I knew would work, even though, after reading the code that I wrote, I didn't completely understand how it worked. (And, yes, it did work, even though I couldn't unravel all the loops and the levels of recursion I put into it.)

Re: Once, for a programming assignment in school, I wrote code that I knew would work

That's fine. The trick is ensuring that the inner conviction matches the reality. If you'd written the code, were sure that it would work, but it didn't, what would you have concluded?

In another thread, Eliezer addressed a question to another poster: what do autistic-types know about normal people that normal people don't know about themselves? There's a very general answer: people with unusual minds know not to trust the intuitions that normal people never think to question.

Yes.

Don't trust your intuition, don't trust your emotions, don't trust your memory, don't trust your senses.

They all lie to you.

To a greater or lesser degree, for sure... but it's hard to see your own filters, and you especially can't trust that voice that tells you "I have no filters".

Don't trust your intuition, don't trust your emotions, don't trust your memory, don't trust your senses.

But act, largely, based on what they tell you.

Well yes. Our input is heavily filtered, but it's still the only input we get - so it's all we have to act upon.

The best we can do is to become more aware of its limitations (eg our biases) and try to compensate for them as much as possible.

"reasoning under uncertainty" and all that :)

If you can't explain how your code works, how can somebody else maintain it? Do you want to get stuck with the job of maintaining your old code? Every time a change is needed, do you want to figure it out over again well enough to get the change to work?

I once had a student who asked me to help him debug some of his code. I looked at it. "Why do you need these variables here?" "I like them. They make the code look more like Pascal." "Why do you use this pointer when you never change it?" "I might want to change it someday." "This routine is 16 lines long. Why not divide it up into a bunch of simple routines, none of them more than two lines?" "If it looked simple, anybody could understand it. I want to write it so that everybody can see I'm real smart." "Then simplify it until it works, and then add the complications later."

His career has been far more successful than mine. Maybe he was doing the right thing.
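For what it's worth, a minimal, hypothetical sketch of the "simplify it until it works" advice (not the student's actual code): the job is split into routines so small that each is individually obvious.

```python
# Hypothetical sketch: each routine does one obvious thing, so any bug
# is easy to localize before "the complications" get added back in.
def parse_record(line):
    """Parse a 'name,score' line into a (name, score) pair."""
    name, score = line.split(",")
    return name.strip(), int(score)

def average(numbers):
    """Mean of a non-empty list of numbers."""
    return sum(numbers) / len(numbers)

def class_average(lines):
    """Average score over lines like 'Alice,90'."""
    return average([parse_record(line)[1] for line in lines])

assert class_average(["Alice,90", "Bob,80"]) == 85.0
```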

Aron:

Re: Debugging is twice as hard as writing the code in the first place.

Yeah - still not a fan of this quote. It relies on 'writing the code in the first place' meaning something of any importance. I could say that my 12-line strong AI program is written and just needs to be debugged a little... given these terms.

Sounds more like an entropy argument, where incorrect code is much more probable than correct code, and while our caffeine- and Cool Ranch Dorito-fueled minds may get into the neighborhood of order, it only comes at great cost: every bug we move out of the code ends up munching in the Dorito bag we left under the table several weeks before.

It seems the fundamental point, or a better way to define 'writing code', is to say that you can write code that works for some inputs/outputs much more easily than for ALL inputs/outputs. Therefore, Skilling et al. can create a house of cards that works for some time, but once reality random-walks to something unanticipated, someone is left shuffling out the door.
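A minimal sketch of that distinction (hypothetical names and numbers, my illustration): code that works for the inputs you anticipated is cheap; code that is defined for ALL inputs is where the real work, and the house-of-cards failure mode, lives.

```python
# Works for the inputs the author tried...
def leverage_ratio_naive(assets, equity):
    return assets / equity            # ZeroDivisionError when equity hits 0

# ...versus defined for ALL inputs, with the nasty case made explicit.
def leverage_ratio_total(assets, equity):
    """Assets-to-equity ratio, defined even when equity is wiped out."""
    if equity == 0:
        return float("inf")           # the unanticipated, house-of-cards case
    return assets / equity

assert leverage_ratio_total(100.0, 50.0) == 2.0
assert leverage_ratio_total(100.0, 0.0) == float("inf")
```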

In my experience it's debugging other people's code that is hard. If you can't debug your own code, then you shouldn't be programming.

ata:
  • Success on comparatively easy problems in the past, leads to overconfidence on more difficult problems in the future;
  • Because people praise you for being good at math or something, you assume you are already wise, and you don't apply your intelligence to becoming wiser in other areas.

Those, in retrospect, were some of the biggest (intellectual) problems I had growing up. I received way too much praise for superficially but not technically impressive achievements, and for things that were legitimately impressive for my age but which should have suggested that I could have been getting even better. (Not that I'm putting all the blame on the people who were praising me — I should have noticed myself that I could have been learning and improving faster — though since I'm a human with status motivations, I suspect that I mainly just didn't feel a strong drive to improve when the people around me already talked about me like I had magical superpowers, and I had no real intellectual role models to reach upward towards.) I wish I had read "Tsuyoku Naritai" when I was 7. (Well... actually I wish I had read a lot of things from LW and elsewhere as a child.) I often wonder how far my brain might have gotten by now if it had grown up under other circumstances.