
Strategic ignorance and plausible deniability

37 Kaj_Sotala 10 August 2011 09:30AM

This is the third part in a mini-sequence presenting material from Robert Kurzban's excellent book Why Everyone (Else) Is a Hypocrite: Evolution and the Modular Mind.

The press secretary of an organization is tasked with presenting outsiders with the best possible image of the organization. While they're not supposed to outright lie, they do use euphemisms and try to only mention the positive sides of things.

A plot point in the TV series The West Wing is that the President of the United States has a disease which he wants to hide from the public. The White House Press Secretary is careful to ask whether there's anything she needs to know about the President's health, instead of whether there's anything she should know. As the President's disease is technically something she should know but not something she needs to know, this allows the President to hide the disease from her without lying to her (and by extension, to the American public). As she then doesn't need to lie either, she can do her job better.

If our minds are modular, critical information can be kept away from the modules that are associated with consciousness and speech production. It can often be better if the parts of the system that exist to deal with others are blissfully ignorant, or even actively mistaken, about information that exists in other parts of the system.

In one experiment, people could choose between two options. Choosing option A meant they got $5, and someone else also got $5. Option B meant that they got $6 and the other person got $1. About two thirds were generous and chose option A.

A different group of people played a slightly different game. As before, they could choose between $5 or $6 for themselves, but they didn't know how their choice would affect the other person's payoff. They could find out, however – if they just clicked a button, they'd be told whether the choice was between $5/$5 and $6/$1, or $5/$1 and $6/$5. From a subject's point of view, clicking a button might tell them that picking the option they actually preferred meant they were costing the other person $4. Not clicking meant that they could honestly say that they didn't know what their choice cost the other person. It turned out that about half of the people refused to look at the other player's payoffs, and that many more subjects chose $6/? than $5/?.
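The incentive structure of this hidden-information game can be made concrete with a short sketch. The two possible payoff matrices are taken from the experiment as described above; the helper function and its name are my own illustration, not part of the study.

```python
# The two possible hidden games, as (chooser, other) payoffs in dollars.
ALIGNED    = {"$5": (5, 1), "$6": (6, 5)}   # self-interest and generosity coincide
CONFLICTED = {"$5": (5, 5), "$6": (6, 1)}   # taking $6 costs the other player $4

def cost_to_other(game, choice):
    """Dollars the other player loses relative to the alternative choice."""
    alternative = "$5" if choice == "$6" else "$6"
    return max(0, game[alternative][1] - game[choice][1])

# A chooser who clicks the reveal button might learn that picking $6
# costs the other player $4...
assert cost_to_other(CONFLICTED, "$6") == 4
# ...or that it costs them nothing at all:
assert cost_to_other(ALIGNED, "$6") == 0
```

A chooser who never reveals the payoffs can pick $6 and honestly say they didn't know which of the two games they were playing, which is exactly the plausible deniability the experiment demonstrates.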

There are many situations where not knowing something means you can avoid a lose-lose situation. If you know your friend is guilty of a serious crime and you are called to testify in court, you must either betray your friend or commit perjury. If you see a building on fire, and a small boy comes to tell you that a cat is caught in the window, your options are to either risk yourself to save the cat, or take the reputational hit of neglecting a socially perceived duty to rescue it. (Footnote in the book: "You could kill the boy, but then you've got other problems.") In the trolley problem, many people will consider both options wrong. In one setup, 87% of the people who were asked thought that pushing a man onto the tracks to save five was wrong, and 62% said that not pushing him was wrong. Better to never see the people on the tracks. In addition to having your reputation besmirched by not trying to save someone, many nations have actual "duty to rescue" laws which require you to act if you see someone in serious trouble.

In general, people (and societies) often believe that if you know about something bad, you have a duty to stop it. If you don't know about something, then obviously you can't be blamed for not stopping it. So we should expect that part of our behavior is designed to avoid finding out information that would impose an unpleasant duty on us.

I personally tend to notice this conflict when I see people in public places who look like they might be sleeping or passed out. Most likely, they're just sleeping and don't want to be bothered. If they're drunk or on drugs, they could even be aggressive. But then there's always the chance that they have some kind of a condition and need medical assistance. Should I go poke them to make sure? You can't be blamed if you act like you didn't notice them, some part of me whispers. Remember the suggestion that you can fight the bystander effect by singling out a person and asking them directly for help? You can't pretend you haven't noticed a duty if the duty is pointed out to you directly. As for the bystander effect in general, there's less of a perceived duty to help if everyone else ignores the person, too. (But then this can't be the sole explanation, because people are most likely to act when they're alone and there's nobody else around to know about their duty. The bystander effect isn't actually discussed in the book; this paragraph is my own speculation.)

The police may also prefer not to know about some minor crime that is being committed. If it's known that they're ignoring drug use (say), they lose some of their authority and may end up punished by their superiors. If they don't ignore it, they may spend all of their time doing minor busts instead of concentrating on more serious crime. Parents may also pretend that they don't notice their kids engaging in some minor misbehavior, if they don't want to lose their authority but don't feel like interfering either.

In effect, the value of ignorance comes from the costs of others seeing you know something that puts you in a position in which you are perceived to have a duty and must choose to do one of two costly acts – punish, or ignore. In my own lab, we have found that people know this. When our subjects are given the opportunity to punish someone who has been unkind in an economic game, they do so much less when their punishment won't be known by anyone. That is, they decline to punish when the cloak of anonymity protects them.

The (soon-to-expire) "don't ask, don't tell" policy of the United States military can be seen as an institutionalization of this rule. Soldiers are forbidden from revealing information about their sexuality, which would force their commanders to discharge them. On the other hand, commanders are also forbidden from inquiring into the matter and finding out.

A related factor is the desire for plausible deniability. A person who wants to have multiple sexual partners may resist getting himself tested for sexually transmitted diseases. If he were tested, he might find out he had a disease, and then he'd be accused of knowingly endangering others if he didn't tell them about it. If he isn't tested, he'll only be accused of not finding out that information, which is often considered less serious.

These are examples of situations where it's advantageous to be ignorant of something. But there are also situations where it is good to be actively mistaken. More about them in the next post.

Bystander Apathy

23 Eliezer_Yudkowsky 13 April 2009 01:26AM

The bystander effect, also known as bystander apathy, is that larger groups are less likely to act in emergencies - not just individually, but collectively.  Put an experimental subject alone in a room and let smoke start coming up from under the door.  75% of the subjects will leave to report it.  Now put three subjects in the room - real subjects, none of whom know what's going on.  On only 38% of the occasions will anyone report the smoke.  Put the subject with two confederates who ignore the smoke, and they'll only report it 10% of the time - even staying in the room until it becomes hazy.  (Latane and Darley 1969.)
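To see why "not just individually, but collectively" matters, compare the observed rates with a naive independence baseline. The 75% and 38% figures come from the experiment above; the baseline calculation is my own back-of-the-envelope comparison, not from the post.

```python
def independent_baseline(p, n):
    """P(at least one of n bystanders acts), if each acted
    independently at the solo rate p."""
    return 1 - (1 - p) ** n

solo_rate = 0.75  # subject alone: 75% report the smoke

# If three subjects each behaved like a lone subject, someone would
# report the smoke almost every time:
print(f"{independent_baseline(solo_rate, 3):.1%}")  # 98.4%
# The observed rate for three naive subjects was only 38% -
# the group collectively underperforms even a single individual.
```

The gap between ~98% predicted and 38% observed is what the diffusion-of-responsibility and pluralistic-ignorance explanations below are trying to account for.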

On the standard model, the two primary drivers of bystander apathy are:

  • Diffusion of responsibility - everyone hopes that someone else will be first to step up and incur any costs of acting.  When no one does act, being part of a crowd provides an excuse and reduces the chance of being held personally responsible for the results.
  • Pluralistic ignorance - people try to appear calm while looking for cues, and see... that the others appear calm.

Cialdini (2001):

Very often an emergency is not obviously an emergency.  Is the man lying in the alley a heart-attack victim or a drunk sleeping one off?  ...  In times of such uncertainty, the natural tendency is to look around at the actions of others for clues.  We can learn from the way the other witnesses are reacting whether the event is or is not an emergency.  What is easy to forget, though, is that everybody else observing the event is likely to be looking for social evidence, too.  Because we all prefer to appear poised and unflustered among others, we are likely to search for that evidence placidly, with brief, camouflaged glances at those around us.  Therefore everyone is likely to see everyone else looking unruffled and failing to act.

Cialdini suggests that if you're ever in emergency need of help, you point to one single bystander and ask them for help - making it very clear to whom you're referring.  Remember that the total group, combined, may have less chance of helping than one individual.
