childofbaud comments on Learned Blankness - Less Wrong

130 Post author: AnnaSalamon 18 April 2011 06:55PM


Comment author: childofbaud 20 April 2011 08:48:31PM 6 points

I have observed similar behavior in others. Only I called it 'blackboxing', for lack of a better word. I think this might actually be a slightly better term than 'learned blankness', so I hereby submit it for consideration. It's borrowed from the software engineering idea of a black box abstraction.

People tend to create conceptual black boxes around certain processes, which they are remarkably reluctant to look within and explore, even when something does go wrong. This is what seems to have happened with the dishwasher incident. The dishwasher was treated as a black box. Its input was dirty dishes, its output was clean ones. When it malfunctioned, it was hard to see it as anything else. The black box was broken.
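The software-engineering sense of the term can be made concrete with a minimal sketch (all names here are illustrative, not from any real appliance API): a black box exposes only an interface, and when something inside it fails, the caller learns nothing beyond "it's broken".

```python
class Dishwasher:
    """A black box: input dirty dishes, output clean ones. Internals hidden."""

    def __init__(self):
        self._heater_ok = True  # internal state the user never inspects

    def wash(self, dishes):
        if not self._heater_ok:
            # From outside, the only observable fact is that the box is broken.
            raise RuntimeError("dishwasher broken")
        return [d.replace("dirty", "clean") for d in dishes]

machine = Dishwasher()
print(machine.wash(["dirty plate", "dirty cup"]))  # the box "just works"

machine._heater_ok = False  # a fault inside the box
try:
    machine.wash(["dirty bowl"])
except RuntimeError as e:
    print(e)  # the caller sees an opaque failure, not the cause
```

The abstraction is useful precisely because callers don't have to know about `_heater_ok`; the trouble described above starts when that ignorance persists even after the box misbehaves.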

Of course, engineers and programmers often go out of their way to design highly opaque black boxes, so it's not surprising that we fall victim to this behavior. This is often said to be done in the name of simplicity (the 'user' is treated as an inept, lazy moron), but I think an additional, more surreptitious reason is to keep profit margins high. Throwing out a broken dishwasher and buying a new one is far more profitable to a manufacturer than making it easy for users to pick it apart and fix it themselves.

The open source movement is one of the few prominent exceptions to this that I know of.

Comment author: TeMPOraL 22 April 2013 01:16:06PM 3 points

This is often said to be done in the name of simplicity (the 'user' is treated as an inept, lazy moron), but I think an additional, more surreptitious reason is to keep profit margins high.

There's also one much more important reason. To quote A. Whitehead,

Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle — they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.

Humans (right now) just don't have enough cognitive power to understand every technology in detail. If not for black boxes, one couldn't get anything done today.

The real issue is whether we're willing to peek inside the box when it misbehaves.