Followup to: Torture vs. Dust Specks, Zut Allais, Rationality Quotes 4
Suppose that a disease, or a monster, or a war, or something, is killing people. And suppose you only have enough resources to implement one of the following two options:
- Save 400 lives, with certainty.
- Save 500 lives, with 90% probability; save no lives, 10% probability.
Most people choose option 1. Which, I think, is foolish; because if you multiply 500 lives by 90% probability, you get an expected value of 450 lives, which exceeds the 400-life value of option 1. (Lives saved don't diminish in marginal utility, so this is an appropriate calculation.)
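For concreteness, here is that arithmetic as a minimal sketch in Python (the function name is mine, purely illustrative):

```python
def expected_lives_saved(outcomes):
    """Expected lives saved: sum of probability * lives over all outcomes."""
    return sum(p * lives for p, lives in outcomes)

option_1 = [(1.0, 400)]             # save 400 lives, with certainty
option_2 = [(0.9, 500), (0.1, 0)]   # 90%: save 500; 10%: save none

print(expected_lives_saved(option_1))  # 400.0
print(expected_lives_saved(option_2))  # 450.0
```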
"What!" you cry, incensed. "How can you gamble with human lives? How can you think about numbers when so much is at stake? What if that 10% probability strikes, and everyone dies? So much for your damned logic! You're following your rationality off a cliff!"
Ah, but here's the interesting thing. If you present the options this way:
- 100 people die, with certainty.
- 90% chance no one dies; 10% chance 500 people die.
Then a majority choose option 2. Even though it's the same gamble. You see, just as a certainty of saving 400 lives seems to feel so much more comfortable than an unsure gain, so too, a certain loss feels worse than an uncertain one.
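To check that it really is the same gamble, translate the loss framing back into lives saved. The total of 500 people at risk is implied rather than stated outright, so take this as an illustrative sketch under that assumption:

```python
AT_RISK = 500  # total in danger, implied by "10% chance 500 people die"

# Gain framing, in expected lives saved:
save_certain = 1.0 * 400                # option 1: 400.0
save_gamble  = 0.9 * 500 + 0.1 * 0      # option 2: 450.0

# Loss framing, restated as lives saved out of the 500 at risk:
lose_certain = 1.0 * (AT_RISK - 100)    # 100 die for certain:    400.0
lose_gamble  = 0.9 * AT_RISK + 0.1 * 0  # 90% none die, 10% all:  450.0

print(save_certain == lose_certain, save_gamble == lose_gamble)  # True True
```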
You can grandstand on the second description too: "How can you condemn 100 people to certain death when there's such a good chance you can save them? We'll all share the risk! Even if it was only a 75% chance of saving everyone, it would still be worth it - so long as there's a chance - everyone makes it, or no one does!"
You know what? This isn't about your feelings. A human life, with all its joys and all its pains, adding up over the course of decades, is worth far more than your brain's feelings of comfort or discomfort with a plan. Does computing the expected utility feel too cold-blooded for your taste? Well, that feeling isn't even a feather in the scales, when a life is at stake. Just shut up and multiply.
Previously on Overcoming Bias, I asked what was the least bad, bad thing that could happen, and suggested that it was getting a dust speck in your eye that irritated you for a fraction of a second, barely long enough to notice, before it got blinked away. And conversely, a very bad thing to happen, if not the worst thing, would be getting tortured for 50 years.
Now, would you rather that a googolplex people got dust specks in their eyes, or that one person was tortured for 50 years? I originally asked this question with a vastly larger number - an incomprehensible mathematical magnitude - but a googolplex works fine for this illustration.
Most people chose the dust specks over the torture. Many were proud of this choice, and indignant that anyone should choose otherwise: "How dare you condone torture!"
This matches research showing that there are "sacred values", like human lives, and "unsacred values", like money. When you try to trade off a sacred value against an unsacred value, subjects express great indignation (sometimes they want to punish the person who made the suggestion).
My favorite anecdote along these lines - though my books are packed at the moment, so no citation for now - comes from a team of researchers who evaluated the effectiveness of a certain project, calculating the cost per life saved, and recommended to the government that the project be implemented because it was cost-effective. The governmental agency rejected the report because, they said, you couldn't put a dollar value on human life. After rejecting the report, the agency decided not to implement the measure.
Trading off a sacred value (like refraining from torture) against an unsacred value (like dust specks) feels really awful. To merely multiply utilities would be too cold-blooded - it would be following rationality off a cliff...
But let me ask you this. Suppose you had to choose between one person being tortured for 50 years, and a googol people being tortured for 49 years, 364 days, 23 hours, 59 minutes and 59 seconds. You would choose one person being tortured for 50 years, I do presume; otherwise I give up on you.
And similarly, if you had to choose between a googol people tortured for 49.9999999 years, and a googol-squared people being tortured for 49.9999998 years, you would pick the former.
A googolplex is ten to the googolth power. That's a googol/100 factors of a googol. So we can keep doing this, gradually - very gradually - diminishing the degree of discomfort, and multiplying by a factor of a googol each time, until we choose between a googolplex people getting a dust speck in their eye, and a googolplex/googol people getting two dust specks in their eye.
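That factor count can be checked directly from the definitions (a googol is 10^100; a googolplex is ten to the googol):

```latex
\mathrm{googolplex} = 10^{10^{100}} = \left(10^{100}\right)^{10^{98}} = \mathrm{googol}^{\,\mathrm{googol}/100}
```

So the descent takes googol/100 = 10^98 steps, each one multiplying the number of sufferers by a googol while slightly easing the harm per person, with torture at one end of the chain and dust specks at the other.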
If you find your preferences are circular here, that makes rather a mockery of moral grandstanding. If you drive from San Jose to San Francisco to Oakland to San Jose, over and over again, you may have fun driving, but you aren't going anywhere. Maybe you think it a great display of virtue to choose for a googolplex people to get dust specks rather than one person being tortured. But if you would also trade a googolplex people getting one dust speck for a googolplex/googol people getting two dust specks et cetera, you sure aren't helping anyone. Circular preferences may work for feeling noble, but not for feeding the hungry or healing the sick.
Altruism isn't the warm fuzzy feeling you get from being altruistic. If you're doing it for the spiritual benefit, that is nothing but selfishness. The primary thing is to help others, whatever the means. So shut up and multiply!
And if it seems to you that there is a fierceness to this maximization, like the bare sword of the law, or the burning of the sun - if it seems to you that at the center of this rationality there is a small cold flame -
Well, the other way might feel better inside you. But it wouldn't work.
And I say also this to you: That if you set aside your regret for all the spiritual satisfaction you could be having - if you wholeheartedly pursue the Way, without thinking that you are being cheated - if you give yourself over to rationality without holding back, you will find that rationality gives to you in return.
But that part only works if you don't go around saying to yourself, "It would feel better inside me if only I could be less rational."
Chimpanzees feel, but they don't multiply. Should you be sad that you have the opportunity to do better? You cannot attain your full potential if you regard your gift as a burden.
Added: If you'd still take the dust specks, see Unknown's comment on the problem with qualitative versus quantitative distinctions.
Eliezer's question for Paul is not particularly subtle, so I presume he won't mind if I give away where it is leading. If Paul says yes, there is some number of dust specks which add up to a toe stubbing, then Eliezer can ask if there is some number of toe stubbings that add up to a nipple piercing. If he says yes to this, he will ultimately have to admit that there is some number of dust specks which add up to 50 years of torture.
Rather than actually going down this road, however, perhaps it would be as well if those who wish to say that the dust specks are always preferable to the torture considered the following facts:
1) Some people have a very good imagination. I could personally think of at least 100 gradations between a dust speck and a toe stubbing, 100 more between the toe stubbing and the nipple piercing, and as many as you like between the nipple piercing and the 50 years of torture.
2) Arguing about where to say no, the point past which the lesser pain can never add up to the slightly greater pain, would look a lot like creationists arguing about which transitional fossils are merely ape-like humans, and which are merely human-like apes. There is a point in the transitional fossils where the fossil is so intermediate that 50% of the creationists say that it is human, and 50% say that it is an ape. Likewise, there will be a point where 50% of the Speckists say that dust specks can add up to this intermediate pain, but the intermediate pain can't add up to torture, and the other 50% will say that the intermediate pain can add up to torture, but the specks can't add up to the intermediate pain. Do you really want to go down this path?
3) Is your intuition about the specks being preferable to the torture really stronger than the intuition you violate by positing such an absolute division? Suppose we go down the path mentioned above, and at some point you say that specks can add up to pain X, but not to pain X+.00001 (a representation of the minute degree of greater pain in the next step, if we choose a fine enough division). Do you really want to say that you prefer that a trillion persons (or a googol, or a googolplex, etc.) suffer pain X than that one person suffer pain X+.00001?
While I was writing this, Paul answered no, the specks never add up to a toe stub. This suggests that he rounds the speck down to nothing; you don't even notice it. Remember, however, that Eliezer originally posited that you feel the irritation for a fraction of a second, so there is some pain there. Paul's answer is simply one step down the path laid out above, and I would like to see his answers to the remaining steps. Remember the (minimally) 100 gradations between the dust speck and the toe stub.
But consider this: the last exemplars of each hominid species could reproduce with the first exemplars of the species that followed.
However, we probably wouldn't be able to reproduce with Homo habilis.
This shows that small differences accumulate as the distance between the examined subjects increases, until we can clearly see that the two subjects are no longer part of the same category.
Pains that are similar in intensity are still comparable. But there is too much difference between a dust speck in the eye or a stubbed toe and torture to consider them part of the same category.