"What's the worst that can happen?" goes the optimistic saying. It's probably a bad question to ask anyone with a creative imagination. Let's consider the problem on an individual level: it's not really the worst that can happen, but would nonetheless be fairly bad, if you were horribly tortured for a number of years. This is one of the worse things that can realistically happen to one person in today's world.
What's the least bad, bad thing that can happen? Well, suppose a dust speck floated into your eye and irritated it just a little, for a fraction of a second, barely enough to make you notice before you blink and wipe away the dust speck.
For our next ingredient, we need a large number. Let's use 3^^^3, written in Knuth's up-arrow notation:
- 3^3 = 27.
- 3^^3 = (3^(3^3)) = 3^27 = 7625597484987.
- 3^^^3 = (3^^(3^^3)) = 3^^7625597484987 = (3^(3^(3^(... 7625597484987 times ...)))).
3^^^3 is an exponential tower of 3s which is 7,625,597,484,987 layers tall. You start with 1; raise 3 to the power of 1 to get 3; raise 3 to the power of 3 to get 27; raise 3 to the power of 27 to get 7625597484987; raise 3 to the power of 7625597484987 to get a number much larger than the number of atoms in the universe, but which could still be written down in base 10, on 100 square kilometers of paper; then raise 3 to that power; and continue until you've exponentiated 7625597484987 times. That's 3^^^3. It's the smallest simple inconceivably huge number I know.
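If it helps to see the recursion concretely, here is a minimal Python sketch of up-arrow notation (not part of the original post; the function name `up_arrow` and its argument order are my own illustrative choices). Only the smallest cases are actually computable; 3^^^3 itself is far beyond any physical computation.

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow notation: a followed by n arrows, then b.
    One arrow is ordinary exponentiation; each extra arrow iterates the level below.
    """
    if n == 1:
        return a ** b          # a^b
    if b == 0:
        return 1               # base case: a ^^...^^ 0 = 1 by convention
    # a (n arrows) b  =  a (n-1 arrows) [ a (n arrows) (b-1) ]
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 1, 3))   # 3^3  = 27
print(up_arrow(3, 2, 3))   # 3^^3 = 3^27 = 7625597484987
# up_arrow(3, 3, 3) would be 3^^^3 -- a tower of 7,625,597,484,987 threes.
# Don't run it: it would exhaust memory (and the universe) long before finishing.
```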
Now here's the moral dilemma. If neither event is going to happen to you personally, but you still had to choose one or the other:
Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?
I think the answer is obvious. How about you?
This question reminds me of a dilemma posed to medical students. It went something like this:
If the opportunity presented itself to secretly, with no chance of being caught, 'accidentally' kill a healthy patient who is seen as wasting their life (smoking, drinking, not exercising, lack of goals, etc.) in order to harvest their organs and save five other patients, should you go ahead with it?
From a utilitarian perspective, it makes perfect sense to commit the murder. The person who introduced me to the dilemma also presented the rationale for saying 'no'... Thankfully it wasn't "It's just wrong" or even "murder is wrong"... The answer suggested was "You wouldn't want to live in a world where doctors might regularly operate in such a manner, nor would you want to be a patient in such a system... It would be terrifying".
I suspect the key elements in the hospital and dust-speck scenarios are a) someone having power over an aspect of other people's fates and b) the level of trust those people place in them. The net-sum calculation of overall 'good' might well suggest torture or organ harvesting as the solution, but how would you feel about nominating someone else to be the one who makes that decision... Would you want that person to favor the momentary dust-speck incident for 3^^^3 people, or the fifty-year torture of an individual?
I think this is an important thing to consider if we intend to make benevolent AIs that are harmonious with our own sense of morality.