In line with my fine tradition of beating old horses, in this post I'll try to summarize some of the arguments people have proposed in the ancient puzzle of Torture vs. Dust Specks, and add some of my own. Not intended as an endorsement of either side. (I do have a preferred side, but don't know exactly why.)
- The people saying one dust speck is "zero disutility" or "incommensurable utilities" are being naive. Just pick the smallest amount of suffering that you consider non-zero and commensurable with the torture, and restart the problem from there.
- Escalation argument: go from dust specks to torture in small steps, slightly increasing the suffering and massively decreasing the number of people at each step. If each individual step increases utility, so does the end result. (A toy numeric version is sketched after this list.)
- Fluctuation argument: the probability that the universe randomly subjects you to the torture scenario is considerably higher than 1/3^^^3 anyway, so choose torture without worry even if you're in the affected set. (This fails because it doesn't hold in the least convenient possible world.)
- Proximity argument: don't ask me to value strangers equally with friends and relatives. If each additional person matters 1% less than the previous one, then even an infinite number of people getting dust specks in their eyes adds up to a finite and not especially large amount of suffering. (This assumption negates the escalation argument once you do the math; the sum is worked out after this list.)
- Real-world analogy: we don't decide to pay one penny each to collectively save one starving African child, so choose torture. (This is resolved by the proximity argument.)
- Observer splitting: if you split into 3^^^3 people tomorrow, would you prefer all of you to get dust specks, or one of you to be tortured for 50 years? (This neutralizes the proximity argument, but the escalation argument also becomes non-obvious.)
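To make the escalation argument concrete, here's a toy sketch under assumptions of my own (not part of the original puzzle): disutility aggregates linearly as pain per person times number of people, pain doubles at each step, and the population shrinks a million-fold. 3^^^3 itself is far beyond any computer, so a stand-in of 10^60 only shows the structure of the chain.

```python
# Toy model of the escalation argument (illustrative numbers, not from
# the original puzzle). Assumptions: disutility aggregates linearly,
# pain per person doubles each step, population shrinks a million-fold.
pain = 1.0          # one dust speck's worth of suffering
people = 10 ** 60   # stand-in for 3^^^3, which no number type can hold

steps = 0
while people > 1:
    before = pain * people
    after = (pain * 2) * (people // 10 ** 6)
    assert after < before   # every step strictly lowers total disutility
    pain *= 2
    people //= 10 ** 6
    steps += 1

print(f"{steps} steps: pain per person {pain:g}, sufferers {people}")
# With a population of 3^^^3 the chain has room for astronomically more
# steps, enough to escalate a single speck all the way to 50 years of
# torture while each step still looks like a clear improvement.
```

Of course, the force of the argument rests entirely on accepting each pairwise comparison, which is exactly what the proximity argument denies.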
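And here is the math behind the proximity argument, taking the 1% figure at face value: the n-th additional person's speck counts for only 0.99^n of a full one, so the total is a geometric series bounded by 100 specks no matter how many people are added.

```python
# Proximity discount, taken literally: the n-th additional person's
# dust speck counts for 0.99**n of a full speck. The geometric series
# converges to 1 / (1 - 0.99) = 100 specks, however many people you
# pile on, which is nowhere near 50 years of torture.
total = sum(0.99 ** n for n in range(10_000))
print(total)   # ~100, already indistinguishable from the infinite sum

# This is also why the discount negates the escalation argument: once
# the crowd is large, shrinking it barely reduces the discounted total
# (the first few hundred people dominate the sum), while doubling the
# pain doubles it, so the escalation steps stop looking like improvements.
```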
Oh what a tangle. I guess Eliezer is too altruistic to give up torture no matter what we throw at him; others will adopt excuses to choose specks; still others will stay gut-convinced but logically puzzled, like me. The right answer, or the right theory to guide you to the answer, no longer seems so inevitable and mathematically certain.
Edit: I submitted this post to LW by mistake, then deleted it, which turned out to be the real mistake. Seeing the folks merrily discussing away in the comments long after the deletion, I tried to undelete the post somehow, but nothing worked. All right; let this be a sekrit area. A shame, really, because I just thought of a scenario that might have given even Eliezer cause for self-doubt:
- Observer splitting with a twist: instead of you, one of your loved ones will be split into 3^^^3 people tomorrow. Torture a single branch for 50 years, or give every branch a dust speck?
In the least convenient possible world: I take it that in this case, that world is one where wealth is distributed equally enough that one penny means the same amount to everybody, and every cheaper opportunity to save a life has already been taken.
Why would a world that looked like that have a starving African child? If we all have X dollars, so a penny is worth the same to everyone, then doesn't the starving African child also have X dollars? If he does, and X dollars won't buy him dinner, then there must not be any food in his region (it makes no sense to sell food at a price literally no one can afford when everybody has only X dollars), so X dollars plus population × 1¢ probably wouldn't help him either.
Perhaps you had a different inconvenient possible world in mind; can you describe it for me?
One where the African child really does need that cent.