"What's the worst that can happen?" goes the optimistic saying. It's probably a bad question to ask anyone with a creative imagination. Let's consider the problem on an individual level: it's not really the worst that can happen, but would nonetheless be fairly bad, if you were horribly tortured for a number of years. This is one of the worse things that can realistically happen to one person in today's world.
What's the least bad, bad thing that can happen? Well, suppose a dust speck floated into your eye and irritated it just a little, for a fraction of a second, barely enough to make you notice before you blink and wipe away the dust speck.
For our next ingredient, we need a large number. Let's use 3^^^3, written in Knuth's up-arrow notation:
- 3^3 = 27.
- 3^^3 = (3^(3^3)) = 3^27 = 7625597484987.
- 3^^^3 = (3^^(3^^3)) = 3^^7625597484987 = (3^(3^(3^(... 7625597484987 times ...)))).
3^^^3 is an exponential tower of 3s which is 7,625,597,484,987 layers tall. You start with 1; raise 3 to the power of 1 to get 3; raise 3 to the power of 3 to get 27; raise 3 to the power of 27 to get 7625597484987; raise 3 to the power of 7625597484987 to get a number much larger than the number of atoms in the universe, but which could still be written down in base 10, on 100 square kilometers of paper; then raise 3 to that power; and continue until you've exponentiated 7625597484987 times. That's 3^^^3. It's the smallest simple inconceivably huge number I know.
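To make the recursion behind the up-arrow notation concrete, here is a small Python sketch. The `up_arrow` helper is purely illustrative (it is not anything from the post): one arrow is ordinary exponentiation, and each additional arrow iterates the previous operator.

```python
# Illustrative helper for Knuth's up-arrow notation (written for this example only).
# up_arrow(a, n, b) computes a with n up-arrows applied to b:
#   one arrow is exponentiation; each extra arrow iterates the previous operator.

def up_arrow(a, n, b):
    if n == 1:
        return a ** b                  # a ^ b
    result = a                         # start the right-associated tower with a single a
    for _ in range(b - 1):             # apply the (n-1)-arrow operator b-1 more times
        result = up_arrow(a, n - 1, result)
    return result

print(up_arrow(3, 1, 3))   # 3^3  = 27
print(up_arrow(3, 2, 3))   # 3^^3 = 3^27 = 7625597484987
# up_arrow(3, 3, 3) would be 3^^^3: a tower of 3s 7,625,597,484,987 layers tall.
# Even the fourth layer of that tower is far too large for any computer to store,
# so don't try to evaluate it.
```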
Now here's the moral dilemma. If neither event is going to happen to you personally, but you still had to choose one or the other:
Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?
I think the answer is obvious. How about you?
There are many ways of approaching this question. One that I think is valuable, and which I can't find mentioned anywhere in this comment thread, is the desirist approach.
Desirism is an ethical theory also sometimes called desire utilitarianism. The desirist approach has many details, which you can Google, but in general it is a form of consequentialism in which the relevant consequences are desire-satisfaction and desire-thwarting.
Fifty years of torture satisfies no desires and thwarts virtually all of them, especially the most intense ones, for fifty years of one individual's life, and, because of the extreme psychological damage, for most of the subsequent years of that life as well. Barely noticeable dust specks neither satisfy nor thwart any desires, and so in a population of any finite size the minor pain is of no account whatever in desirist terms. So a desirist would prefer the dust specks.
The Repetition Objection: if this choice were repeated, say, a billion times, the lives of the 3^^^3 people would become unlivable due to constant dust specks, so at some point one additional person tortured must become preferable to another dust speck in 3^^^3 eyes.
The desirist response bites the bullet. Dust specks in eyes may accumulate linearly, but their effect on desire-satisfaction and desire-thwarting is highly nonlinear. It's probably the case that an additional torture becomes preferable as soon as the next dust speck is expected to thwart a few million desires, and certainly the case once it is expected to thwart a few billion.
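As a purely illustrative toy (the threshold, the quadratic growth, and the torture figure below are numbers I made up for this sketch, not anything in desirist doctrine), the nonlinearity claim can be pictured like this: the marginal desire-thwarting of the next speck stays at zero while specks are rare, then climbs steeply once they become constant, eventually overtaking the cost of one more torture.

```python
# Toy model only: speck count grows linearly, but the desires thwarted by the
# *next* speck follow an arbitrary threshold curve -- zero while specks are rare,
# then growing steeply once they become a constant nuisance.

def desires_thwarted_by_next_speck(specks_so_far, threshold=1_000_000_000):
    """Made-up marginal-cost curve: negligible below the threshold, steep above it."""
    if specks_so_far < threshold:
        return 0                          # a rare speck thwarts no desires
    excess = specks_so_far - threshold
    return excess ** 2                    # arbitrary steep growth past the threshold

TORTURE_COST = 5_000_000  # stand-in figure: desires thwarted by one additional torture

for n in (10, 10**9, 10**9 + 10_000):
    marginal = desires_thwarted_by_next_speck(n)
    choice = "torture" if marginal > TORTURE_COST else "speck"
    print(f"after {n:>13,} specks, next speck thwarts {marginal:,} desires -> prefer {choice}")
```

The output flips from "speck" to "torture" only after the made-up threshold is crossed, which is all the sketch is meant to show: a linear count with a nonlinear cost curve can make the speck preferable for a long time and then stop being preferable.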
Can you clarify your grounds for claiming that barely noticeable dust specks neither satisfy nor thwart any desires?