"What's the worst that can happen?" goes the optimistic saying. It's probably a bad question to ask anyone with a creative imagination. Let's consider the problem on an individual level: it's not really the worst that can happen, but would nonetheless be fairly bad, if you were horribly tortured for a number of years. This is one of the worse things that can realistically happen to one person in today's world.
What's the least bad, bad thing that can happen? Well, suppose a dust speck floated into your eye and irritated it just a little, for a fraction of a second, barely enough to make you notice before you blink and wipe away the dust speck.
For our next ingredient, we need a large number. Let's use 3^^^3, written in Knuth's up-arrow notation:
- 3^3 = 27.
- 3^^3 = (3^(3^3)) = 3^27 = 7625597484987.
- 3^^^3 = (3^^(3^^3)) = 3^^7625597484987 = (3^(3^(3^(... 7625597484987 times ...)))).
3^^^3 is an exponential tower of 3s which is 7,625,597,484,987 layers tall. You start with 1; raise 3 to the power of 1 to get 3; raise 3 to the power of 3 to get 27; raise 3 to the power of 27 to get 7625597484987; raise 3 to the power of 7625597484987 to get a number much larger than the number of atoms in the universe, but which could still be written down in base 10, on 100 square kilometers of paper; then raise 3 to that power; and continue until you've exponentiated 7625597484987 times. That's 3^^^3. It's the smallest simple inconceivably huge number I know.
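The recursive definition above can be sketched in code. This is a minimal illustration, assuming the standard recursive definition of Knuth's up-arrow notation; the function name is my own, and of course only the small cases are computable, since even 3^^4 already exceeds what any machine can store:

```python
def knuth_up_arrow(a, n, b):
    """Evaluate a with n up-arrows applied to b, per Knuth's recursive definition:
    one arrow is plain exponentiation, and
    a ^^...^ b (n arrows) = a ^^...^ (a ^^...^ (b-1)) with n-1 arrows outside."""
    if n == 1:
        return a ** b          # base case: a^b
    if b == 0:
        return 1               # by convention, zero applications yield 1
    # Peel one layer off b, dropping to n-1 arrows for the outer application.
    return knuth_up_arrow(a, n - 1, knuth_up_arrow(a, n, b - 1))

print(knuth_up_arrow(3, 1, 3))  # 3^3   = 27
print(knuth_up_arrow(3, 2, 3))  # 3^^3  = 7625597484987
# knuth_up_arrow(3, 3, 3) would be 3^^^3: a tower of 3s
# 7,625,597,484,987 layers tall -- far beyond any computer.
```

The small cases reproduce the numbers in the list above, which is the only check the sketch is meant to pass.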
Now here's the moral dilemma. If neither event is going to happen to you personally, but you still had to choose one or the other:
Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?
I think the answer is obvious. How about you?
Now, this is considerably better reasoning - however, there was no clue that this was a decision that would be made over and over by countless people. Had it been worded "you, among many, have to make the following choice...", I could agree with you. But the current wording implied that it was a once-per-universe sort of choice.
The choice doesn't have to be repeated to present you with the dilemma. Since all elements of the problem are finite - not countless, finite - if you refuse all actions in the chain, you should also refuse the start of the chain even when no future repetitions are presented as options. This kind of reasoning doesn't work for infinite cases, but it works for finite ones.
One potential counter to the "global heating" example is that at some point, people begin to die who would not otherwise have done so, and that should be the point of refusal. But for the case of dust specks - and we can imagine getting more than one dust speck in your eye per day - it doesn't seem like there should be any sharp borderline.
We face the real-world analogue of this problem every day, when we decide whether to tax everyone in the First World one penny in order to save one starving African child by mounting a large military rescue operation that swoops in, takes the one child, and leaves.
There is no "special penny" where this logic goes from good to bad. It's wrong when repeated because it's also wrong in the individual case. You just have to come to terms with scope sensitivity.
"Swoops in, takes one child, and leaves"... wow. I'd like to say I can't imagine being so insensitive as to think this would be a good thing to do (even if not worth the money), but I actually can.
And why would you use that horrible example, when the argument would work just fine if you substituted "a permanent presence devoted to giving one person three square meals a day"?