Followup to: Torture vs. Dust Specks, Zut Allais, Rationality Quotes 4
Suppose that a disease, or a monster, or a war, or something, is killing people. And suppose you only have enough resources to implement one of the following two options:
- Save 400 lives, with certainty.
- Save 500 lives, with 90% probability; save no lives, 10% probability.
Most people choose option 1. Which, I think, is foolish; because if you multiply 500 lives by 90% probability, you get an expected value of 450 lives, which exceeds the 400-life value of option 1. (Lives saved don't diminish in marginal utility, so this is an appropriate calculation.)
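To make the multiplication explicit, here is a minimal sketch of that arithmetic in Python (it contains nothing beyond what the paragraph above already states):

```python
# Expected lives saved under each option.
p_success = 0.9          # stated probability for the risky option
lives_if_success = 500
lives_certain = 400

ev_risky = p_success * lives_if_success + (1 - p_success) * 0   # 450.0
ev_certain = lives_certain                                       # 400

print(ev_risky, ev_certain)  # 450.0 400
# 450 > 400: in expectation, the gamble saves fifty more lives.
```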
"What!" you cry, incensed. "How can you gamble with human lives? How can you think about numbers when so much is at stake? What if that 10% probability strikes, and everyone dies? So much for your damned logic! You're following your rationality off a cliff!"
Ah, but here's the interesting thing. If you present the options this way:
- 100 people die, with certainty.
- 90% chance no one dies; 10% chance 500 people die.
Then a majority choose option 2. Even though it's the same gamble. You see, just as a certainty of saving 400 lives seems to feel so much more comfortable than an unsure gain, so too, a certain loss feels worse than an uncertain one.
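In case the equivalence isn't obvious, here is a quick check, assuming (as the numbers imply) that the same 500 people are at stake in both framings:

```python
total_at_risk = 500

# First framing, restated as deaths:
#   "save 400 with certainty"            -> 100 die with certainty
#   "90% save all 500, 10% save none"    -> 90% no one dies, 10% all 500 die
certain_deaths = total_at_risk - 400                      # 100
expected_deaths_gamble = 0.9 * 0 + 0.1 * total_at_risk    # 50.0

print(certain_deaths, expected_deaths_gamble)  # 100 50.0
# The same two gambles as before, merely described as losses instead of gains.
```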
You can grandstand on the second description too: "How can you condemn 100 people to certain death when there's such a good chance you can save them? We'll all share the risk! Even if it were only a 75% chance of saving everyone, it would still be worth it - so long as there's a chance - everyone makes it, or no one does!"
You know what? This isn't about your feelings. A human life, with all its joys and all its pains, adding up over the course of decades, is worth far more than your brain's feelings of comfort or discomfort with a plan. Does computing the expected utility feel too cold-blooded for your taste? Well, that feeling isn't even a feather in the scales, when a life is at stake. Just shut up and multiply.
Previously on Overcoming Bias, I asked what was the least bad, bad thing that could happen, and suggested that it was getting a dust speck in your eye that irritated you for a fraction of a second, barely long enough to notice, before it got blinked away. And conversely, a very bad thing to happen, if not the worst thing, would be getting tortured for 50 years.
Now, would you rather that a googolplex people got dust specks in their eyes, or that one person was tortured for 50 years? I originally asked this question with a vastly larger number - an incomprehensible mathematical magnitude - but a googolplex works fine for this illustration.
Most people chose the dust specks over the torture. Many were proud of this choice, and indignant that anyone should choose otherwise: "How dare you condone torture!"
This matches research showing that there are "sacred values", like human lives, and "unsacred values", like money. When you try to trade off a sacred value against an unsacred value, subjects express great indignation (sometimes they want to punish the person who made the suggestion).
My favorite anecdote along these lines - though my books are packed at the moment, so no citation for now - comes from a team of researchers who evaluated the effectiveness of a certain project, calculating the cost per life saved, and recommended to the government that the project be implemented because it was cost-effective. The governmental agency rejected the report because, they said, you couldn't put a dollar value on human life. After rejecting the report, the agency decided not to implement the measure.
Trading off a sacred value (like refraining from torture) against an unsacred value (like dust specks) feels really awful. To merely multiply utilities would be too cold-blooded - it would be following rationality off a cliff...
But let me ask you this. Suppose you had to choose between one person being tortured for 50 years, and a googol people being tortured for 49 years, 364 days, 23 hours, 59 minutes and 59 seconds. You would choose one person being tortured for 50 years, I do presume; otherwise I give up on you.
And similarly, if you had to choose between a googol people tortured for 49.9999999 years, and a googol-squared people being tortured for 49.9999998 years, you would pick the former.
A googolplex is ten to the googolth power. That's a googol/100 factors of a googol. So we can keep doing this, gradually - very gradually - diminishing the degree of discomfort, and multiplying by a factor of a googol each time, until we choose between a googolplex people getting a dust speck in their eye, and a googolplex/googol people getting two dust specks in their eye.
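For the record, here is the arithmetic behind "a googol/100 factors of a googol", written out as a minimal sketch (it only restates the definitions):

```latex
\text{googol} = 10^{100}, \qquad
\text{googolplex} = 10^{\text{googol}} = 10^{10^{100}}
  = \bigl(10^{100}\bigr)^{10^{100}/100}
  = \text{googol}^{\,\text{googol}/100}
```

So the step "multiply the number of people by a googol, shave a tiny bit off the harm" can be repeated googol/100 times before the chain runs from fifty years of torture all the way down to a single dust speck.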
If you find your preferences are circular here, that makes rather a mockery of moral grandstanding. If you drive from San Jose to San Francisco to Oakland to San Jose, over and over again, you may have fun driving, but you aren't going anywhere. Maybe you think it a great display of virtue to choose for a googolplex people to get dust specks rather than one person being tortured. But if you would also trade a googolplex people getting one dust speck for a googolplex/googol people getting two dust specks et cetera, you sure aren't helping anyone. Circular preferences may work for feeling noble, but not for feeding the hungry or healing the sick.
Altruism isn't the warm fuzzy feeling you get from being altruistic. If you're doing it for the spiritual benefit, that is nothing but selfishness. The primary thing is to help others, whatever the means. So shut up and multiply!
And if it seems to you that there is a fierceness to this maximization, like the bare sword of the law, or the burning of the sun - if it seems to you that at the center of this rationality there is a small cold flame -
Well, the other way might feel better inside you. But it wouldn't work.
And I say also this to you: That if you set aside your regret for all the spiritual satisfaction you could be having - if you wholeheartedly pursue the Way, without thinking that you are being cheated - if you give yourself over to rationality without holding back, you will find that rationality gives to you in return.
But that part only works if you don't go around saying to yourself, "It would feel better inside me if only I could be less rational."
Chimpanzees feel, but they don't multiply. Should you be sad that you have the opportunity to do better? You cannot attain your full potential if you regard your gift as a burden.
Added: If you'd still take the dust specks, see Unknown's comment on the problem with qualitative versus quantitative distinctions.
The problem here is that you don't KNOW that the probability is 90%. What if it's 80%? Or 60%? Or 12%? In real life you will only run the experiment once. The probabilities are just a GUESS. The person who is making the guess has no idea what the real probabilities are. And as Mr. Yudkowsky has pointed out elsewhere, people consistently tend to underestimate the difficulty of a task. They can't even estimate with any accuracy how long it will take them to finish their homework. If you aren't in the business of saving people's lives in EXACTLY this same way, on a regular basis, the estimate of 90% is probably crap. And so is the estimate of a 100% probability of saving 400 lives. All you can really say is that you see fewer difficulties that way, from where you are standing now. It's a crap shoot either way, because once you get started, no matter which option you choose, difficulties you hadn't anticipated will arise.
This reminds me of 'the bridge experiment', where a test subject is given the opportunity to throw a fat person off a bridge in front of a train, and thereby save the lives of 5 persons trapped on the tracks up ahead. The psychologists bemoaned the lack of rationality of the test subjects, since most of them wouldn't throw the fat person off the bridge, and thus trade the lives of one person, for five. I was like, 'ARE YOU CRAZY? Do you think one fat person would DERAIL A TRAIN? What do you think cow catchers are for, fool? What if he BOUNCED a couple of times, and didn't end up on the rails? It's preposterous. The odds are 1000 to 1 against success. No sane person would take that bet.'
The psychologists supposedly fixed this concern by telling the test subjects that it was guaranteed that throwing the fat person off the bridge would succeed. Didn't work, because people STILL wouldn't buy into their preposterous plan.
Then the psychologists changed the experiment so that the test subject would just have to throw a switch on the track which would divert the train from the track where the five people were trapped to a track where just one person was trapped (still fat, by the way). Far more of the test subjects said they would flip the switch than had said they would throw someone off the bridge. The psychologists suggested some preposterous-sounding reason for the difference, I don't even remember what, but it seemed to me that the change was because the plan just seemed a lot more likely to succeed. The test subjects DISCOUNTED the assurances of the psychologists that the 'throw someone off the bridge' plan would succeed. And quite rationally too, if you ask me. What rational person would rely on the opinion of a psychologist on such a matter?
When the 90%/500 or 100%/400 question was posed, I felt myself having exactly the same reaction. I immediately felt DUBIOUS that the odds were actually 90%. I immediately discounted the odds. By quite a bit, in fact. Perhaps that was because of a lack of self-confidence, or hard-won pessimism from years of real-life experience, but I immediately discounted the odds. I bet a lot of other people did too. And I wouldn't take the bet, for exactly that reason. I didn't BELIEVE the odds, as given. I was skeptical. Interestingly enough, though, I was less skeptical of the 'can't fail/100%' estimate than of the 90% estimate. Maybe I could easily imagine a scenario where there was no chance of failure at all, but couldn't easily imagine a scenario where the odds were, reliably, 90%. Once you start throwing around numbers like 90%, in an imperfect world, what you're really saying is 'there is SOME chance of failure'. Estimating how much chance would be very much a judgement call.
So maybe what you're looking at here isn't irrationality, or the inability to multiply, but rather rational pessimism about it being as easy as claimed.