"What's the worst that can happen?" goes the optimistic saying. It's probably a bad question to ask anyone with a creative imagination. Let's consider the problem on an individual level: it's not really the worst that can happen, but it would nonetheless be fairly bad, if you were horribly tortured for a number of years. This is one of the worst things that can realistically happen to one person in today's world.
What's the least bad, bad thing that can happen? Well, suppose a dust speck floated into your eye and irritated it just a little, for a fraction of a second, barely enough to make you notice before you blink and wipe away the dust speck.
For our next ingredient, we need a large number. Let's use 3^^^3, written in Knuth's up-arrow notation:
- 3^3 = 27.
- 3^^3 = (3^(3^3)) = 3^27 = 7625597484987.
- 3^^^3 = (3^^(3^^3)) = 3^^7625597484987 = (3^(3^(3^(... 7625597484987 times ...)))).
3^^^3 is an exponential tower of 3s which is 7,625,597,484,987 layers tall. You start with 1; raise 3 to the power of 1 to get 3; raise 3 to the power of 3 to get 27; raise 3 to the power of 27 to get 7625597484987; raise 3 to the power of 7625597484987 to get a number much larger than the number of atoms in the universe, but which could still be written down in base 10, on 100 square kilometers of paper; then raise 3 to that power; and continue until you've exponentiated 7625597484987 times. That's 3^^^3. It's the smallest simple inconceivably huge number I know.
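To make the notation concrete, here is a minimal Python sketch of Knuth's up-arrow operator (the function name `up_arrow` is my own choice; 3^^^3 itself is far too large to ever compute, but the small cases above can be checked directly):

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow: a (n arrows) b. One arrow is ordinary exponentiation."""
    if n == 1:
        return a ** b
    # n arrows applied to b copies of a means folding (n-1) arrows right-to-left:
    # a ^^ b = a ^ (a ^ (... b times ...)), and so on up the hierarchy.
    result = a
    for _ in range(b - 1):
        result = up_arrow(a, n - 1, result)
    return result

print(up_arrow(3, 1, 3))  # 3^3   = 27
print(up_arrow(3, 2, 3))  # 3^^3  = 7625597484987
# up_arrow(3, 3, 3) would be 3^^^3 -- a tower of 7,625,597,484,987 threes,
# which no physically possible computer could evaluate.
```

The recursion mirrors the definition in the list above: each additional arrow is defined by iterating the operator with one fewer arrow.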
Now here's the moral dilemma. If neither event is going to happen to you personally, but you still had to choose one or the other:
Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?
I think the answer is obvious. How about you?
Then it seems we've reached an agreement, as Aumann's agreement theorem says we should. And yes, this is a thought experiment: it is unlikely that anyone will ever have to choose between such extremes (or that 3^^^3 people will ever exist, at once or even in total). However, whether the scenario is realistic or not, someone who rejects utilitarianism here can't simply say, "Well, it works in all real scenarios, though." Eliezer could just as easily have invoked a utility monster, but he chose to convey the same thought experiment in a more original way.
First of all, I am for the torture, as are 22.1% of the people recently surveyed, versus 36.8% who are for the dust specks; the rest declined to answer or were unsure.
Secondly, the issue of small dispersed disutilities versus large concentrated ones is one we constantly encounter in the real world, and time after time society accepts the tradeoff: for the sake of, e.g., the convenience of driving, we tolerate the unavoidable occasional traffic accident. Nor do we sacrifice every tiny little luxury just to gather resources to save a single ex...