"What's the worst that can happen?" goes the optimistic saying. It's probably a bad question to ask anyone with a creative imagination. Let's consider the problem on an individual level: it's not really the worst that can happen, but would nonetheless be fairly bad, if you were horribly tortured for a number of years. This is one of the worse things that can realistically happen to one person in today's world.
What's the least bad, bad thing that can happen? Well, suppose a dust speck floated into your eye and irritated it just a little, for a fraction of a second, barely enough to make you notice before you blink and wipe away the dust speck.
For our next ingredient, we need a large number. Let's use 3^^^3, written in Knuth's up-arrow notation:
- 3^3 = 27.
- 3^^3 = (3^(3^3)) = 3^27 = 7625597484987.
- 3^^^3 = (3^^(3^^3)) = 3^^7625597484987 = (3^(3^(3^(... 7625597484987 times ...)))).
3^^^3 is an exponential tower of 3s which is 7,625,597,484,987 layers tall. You start with 1; raise 3 to the power of 1 to get 3; raise 3 to the power of 3 to get 27; raise 3 to the power of 27 to get 7625597484987; raise 3 to the power of 7625597484987 to get a number much larger than the number of atoms in the universe, but which could still be written down in base 10, on 100 square kilometers of paper; then raise 3 to that power; and continue until you've exponentiated 7625597484987 times. That's 3^^^3. It's the smallest simple inconceivably huge number I know.
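If you want to see the recursion behind the notation, here's a minimal Python sketch (the `up_arrow` function name is just my own label, not anything standard). It can evaluate the small cases above; 3^^^3 itself is hopelessly out of reach, which is rather the point.

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow: a followed by n arrows, then b, defined recursively."""
    if n == 1:
        return a ** b              # one arrow is ordinary exponentiation
    if b == 0:
        return 1                   # base case: a ^^...^ 0 = 1
    # Peel one layer off: a ^(n) b = a ^(n-1) (a ^(n) (b - 1))
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 1, 3))  # 3^3  = 27
print(up_arrow(3, 2, 3))  # 3^^3 = 3^27 = 7625597484987
# up_arrow(3, 3, 3) would be 3^^^3: a tower of 3s 7,625,597,484,987 layers tall.
# Don't try to run it -- the result can't be stored or even meaningfully approximated.
```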
Now here's the moral dilemma. If neither event is going to happen to you personally, but you still had to choose one or the other:
Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?
I think the answer is obvious. How about you?
I'm fairly certain Eliezer ended with "the choice is obvious" to spark discussion, and not because it's actually obvious, but let me go ahead and justify that claim: this is not an obvious choice, even though there is a clear, correct answer (torture).
There are a few very natural intuitions that we have to analyze and dispel in order to get off the dust specks.
1.) The negative utility of a dust speck rounds down to 0.
If that's the case, 3^^^3 * 0 = 0, and the torture is worse. The issue with this is twofold.
First, why does it have to be torture on the other side of the equation? If the dust speck rounds down to 0, then why can't it be someone spraining their ankle? Or even lightly slapping someone on the wrist? Once we're committed to rounding dust specks to 0, all of a sudden we are forced to treat the other side of the equation as the worse outcome, no matter how trivial it is.
This, then, exposes the rounding fallacy. The dust speck's negative utility is not zero; it's just really, really small. But when we apply that really, really small downside to a really, really large number of people, all of a sudden the fact that it is certainly nonzero becomes very relevant. We just need to pick a counter-option that isn't so negative that it blinds us with an emotional response (like torture does) in order to see that.
The broader version of the mistake of rounding dust specks to 0 is that it implies there exists some threshold below which all harms have equal utility - they all round down to 0. Where is this threshold? Literally right at dust specks? They round to 0, but something slightly worse than a dust speck doesn't? Or is it a little higher than dust specks?
Regardless, we need only analyze a problem like this at the border between our "rounds to zero" and "doesn't round to zero" utilities in order to see the absurdity of this proposition.
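To make that absurdity concrete, here's a small sketch with made-up numbers (the threshold and population below are hypothetical, chosen purely for illustration): if harms below some cutoff count as exactly zero, then two populations suffering nearly identical per-person harms end up with wildly different totals.

```python
# All numbers here are hypothetical, chosen purely for illustration.
THRESHOLD  = 1e-9          # harms below this supposedly "round down to 0"
POPULATION = 10**30        # nowhere near 3^^^3, but big enough to make the point

def total_harm(per_person_harm, n_people):
    """Sum per-person harm over n_people, rounding sub-threshold harms to zero."""
    if per_person_harm < THRESHOLD:
        per_person_harm = 0.0
    return per_person_harm * n_people

speck_like   = THRESHOLD * 0.999   # just under the cutoff (a dust speck, say)
barely_worse = THRESHOLD * 1.001   # imperceptibly worse than the speck

print(total_harm(speck_like, POPULATION))    # 0.0
print(total_harm(barely_worse, POPULATION))  # ~1e21
# Two nearly indistinguishable per-person harms yield totals that differ by an
# arbitrarily large amount, purely because of where the rounding threshold sits.
```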
2.) What if it's not the absolute utility, but the gap, relative to torture, which causes specks to round to zero?
This second attempt is more promising, but, again, on further analysis falls apart.
One of the best ways to bypass the traditional pitfalls of one-shot decisions is to pretend that you're making the choice many times instead of just once. By doing so, we can close this subjective gap and end up with an apples-to-apples (or in this case, torture-to-torture) comparison.
We've already established that dust specks do not round down to zero, in an absolute sense, so all I need to do is ask you to make the choice enough times that the 3^^^3 people are essentially being tortured for 50 years.
Specifically, this number of times:
(torture's badness) / (dust specks' badness)
Once you've made the choice that many times, guess what: 3^^^3 people are being tortured for 50 years by dust specks.
If you'd picked the torture every time, (# of choices) people are being tortured for 50 years.
Do you think that the torture is 3^^^3 times worse than the dust speck? (If so, there would be the same number of people being tortured either way.) I can just change it to make it 40 years of torture instead, or 1 year of torture. Or I can make the dust speck a little less bad.
The thing is, your desire to pick the dust specks doesn't come from rationally asserting that a speck is 3^^^3 times less bad than torture. No matter what you think the factor is, I can always pick numbers that'll make you choose the torture, and your intuition is always going to hate it.
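To put hypothetical numbers on this (the badness values below are made up; the only thing that matters is that the speck's badness is nonzero), here's a sketch of the repeated-choice argument:

```python
# Hypothetical disutilities in arbitrary integer "badness units" -- only the ratio matters.
SPECK_BADNESS   = 1                # one dust speck: tiny, but not zero
TORTURE_BADNESS = 10**18           # 50 years of torture

# Make the choice this many times, as described above.
n_choices = TORTURE_BADNESS // SPECK_BADNESS    # 10**18 repetitions

# Branch A: pick the dust specks every time.
# Each of the 3^^^3 people accumulates one speck per repetition.
per_person_harm = n_choices * SPECK_BADNESS
print(per_person_harm == TORTURE_BADNESS)   # True: each of 3^^^3 people is now at torture-level harm

# Branch B: pick the torture every time.
# n_choices (10**18) people are each tortured for 50 years --
# a vastly smaller number of ruined lives than 3^^^3.
```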
3.) You can't calculate utility with a linear combination. Even if the sum of the dust specks is more negative, the torture is still worse, because it's so much worse for that one person.
Let me be a little clearer about what I mean here. Imagine this choice:
- Option #1: One person is tortured for 10 hours.
- Option #2: 601 people are each tortured for 1 minute.
It's very reasonable to pick option #2 here. Even though it's an extra minute of total torture, you could argue that utility is not linear in this case - being tortured for 1 minute isn't really that bad, and you can get over it, but being tortured for 10 hours is likely to break a person.
That's fine - your utility function doesn't have to be a strictly linear function with respect to the inputs, but, critically, it is still a function.
You might be tempted to say something along the lines of "I evaluate utility based on avoiding the worst outcome for a single person, rather than based on total utility (therefore I can pick the dust specks)."
The problem is, no matter how much proportionally less bad you think 1 minute of torture is than 10 hours, I can still always pick a number that causes you to pick the 10-hour option.
What if I change it to 1000 people instead of 601? What about 10,000? What about 3^^^3?
All of a sudden it's clear that choosing to avoid the worst possible outcome for a single person is an incoherent intuition - that would force you to torture 3^^^3 people for 9 hours instead of 1 person for 10 hours.
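If you want to see this with an explicitly nonlinear utility function, here's a sketch (the squared-duration weighting below is a made-up stand-in, picked only because it punishes long durations far more than proportionally): even then, some head count of 1-minute victims still outweighs the single 10-hour victim.

```python
def disutility(minutes):
    """Hypothetical per-person disutility, deliberately nonlinear in duration:
    10 hours is weighted far more than 600x worse than 1 minute."""
    return minutes ** 2

ten_hours_one_person = disutility(600)   # 360,000 units for the single victim

for n_people in (601, 10_000, 1_000_000):
    many_people_one_minute = n_people * disutility(1)
    better = ("the single 10-hour torture"
              if many_people_one_minute > ten_hours_one_person
              else "the many 1-minute tortures")
    print(f"{n_people:>9} people: prefer {better}")

# With 601 or 10,000 people, the nonlinear weighting still favors spreading the harm.
# Push the head count high enough (long before 3^^^3) and the totals flip anyway.
```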
The non-obvious answer is therefore torture.
The dust specks do not round down to zero. The gap between the dust specks and the torture doesn't cause the dust specks to round down to zero, either. You must multiply and account for the multiplicity of the dust specks, lest you be forced to choose saving one person from -100 over saving hundreds from -99. And even if you discount the multiplicity and don't linearly sum the costs, treating them as 0 leads to incoherence, so there must still be some cumulative effect.
Therefore, you've gotta go with the torture, as much as your intuition hates it.
tl;dr - Shut up and multiply.