"The science of “human factors” now permeates the aviation industry. It includes a sophisticated understanding of the kinds of mistakes that even experts make under stress. So when Martin Bromiley read the Harmer report, an incomprehensible event suddenly made sense to him. “I thought, this is classic human factors stuff. Fixation error, time perception, hierarchy.”

It’s a miracle that only ten people were killed after Flight 173 crashed into an area of woodland in suburban Portland; but the crash needn’t have happened at all. Had the captain attempted to land, the plane would have touched down safely: the subsequent investigation found that the landing gear had been down the whole time. But the captain and officers of Flight 173 became so engrossed in one puzzle that they became blind to the more urgent problem: fuel shortage. This is called “fixation error”. In a crisis, the brain’s perceptual field narrows and shortens. We become seized by a tremendous compulsion to fix on the problem we think we can solve, and quickly lose awareness of almost everything else. It’s an affliction to which even the most skilled and experienced professionals are prone..."

I don't believe I've heard fixation error or time perception mentioned on Less Wrong. The field of human factors may be worth looking into further.

 


The links missing from the OP, for the lazy: Flight 173, the article being quoted.

When a plane reports a problem, I assume a bunch of data gets sent out to FAA controllers, and that the plane basically shows up as a big problem in the control room right away.

So basically I am wondering why it's the crew's responsibility to fix any problem that shows up. It would seem that these kinds of fast-diagnosis problems are best solved by a control room with a bunch of engineers, AI software, etc. The crew's role should be minimal (I'm of course assuming good network connectivity, but that seems reasonable at this point).

I don't recall fixation error in the sequences, but something like it shows up in HPMOR, occurring in http://hpmor.com/chapter/67 and discussed in http://hpmor.com/chapter/68:

"But you, Miss Granger, had the misfortune to remember how to cast the Stunning Hex, and so you did not search your excellent memory for a dozen easier spells that might have proved efficacious."

I've delved into this literature a bit while researching a (currently shelved) paper on automation-associated error, and I agree with the title of this post!

This is called “fixation error”. In a crisis, the brain’s perceptual field narrows and shortens. We become seized by a tremendous compulsion to fix on the problem we think we can solve, and quickly lose awareness of almost everything else.

Is there a less time-critical version of that?

I've always felt rather obsessed with whatever I considered the top problem on the list.