This post’s aim is not to prove or disprove the morality of utilitarianism. I only want to raise some provocative questions through thought experiments that show the extreme consequences of some forms of utilitarianism.
Let’s assume that Omega has an evil twin: Psi.
Psi really likes to play with human minds, and today it will force some people to make difficult choices.
Experiment A
Let’s suppose that Alice is a very nice person: she volunteers, donates money to important causes, is vegetarian, and is always kind to the people she interacts with.
Bob, on the other hand, is a murderer and torturer: he has kidnapped many people and tortured them for days on end, in ways more horrible than your deepest fears.
Psi offers two options:
A) Alice will be tortured for 1 day and then killed, while Bob will be instantly killed in a painless way.
B) Bob will be tortured for 3 days and then killed, while Alice will be instantly killed in a painless way.
If you refuse to answer, everyone will be tortured for a year and then killed.
No matter what you choose, you too will be killed, and no one will ever know what has happened.
1. What would you answer?
2. What would you answer if the 3 days in option B) were replaced with 3 years?
3. And if they were replaced with 33 years?
Experiment B
A) You will have to live 3^^^3-1 lives full of happiness/fun/wonder/Eudaimonia, plus one life in which you will be tortured for 50 years; this life could occur at any point in the sequence.
B) You will have to live 3^^^3 lives full of happiness/fun/wonder/Eudaimonia, in each of which you will stub one of your toes, with no long-term consequences except a ruined day.
In both cases, each life won’t be influenced by the previous ones, and your character won’t be altered by the reincarnation.
But the most important fact is that, at the peak of the pain, Psi will offer you a painless death. If you accept, you will die as promised and won’t reincarnate again.
1. Which option would you choose if you wanted to live the greatest number of happy lives?
2. What would be the best course of action from the utilitarian point of view (see the sketch after these questions)? Would you really be able to undertake it?
3. Should pain tolerance be taken into account in the utility calculus? If so, how?
4. Do you think it is right to impose something unbearable in order to avoid something bearable?
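As a rough illustration of the naive total-utility bookkeeping behind question 2 (this is my own framing, not part of Psi’s offer): write u for the utility of one happy life, d_s for the disutility of a stubbed toe, d_T for the disutility of 50 years of torture, and 3^^^3 in up-arrow notation as 3↑↑↑3. Then

\[
U_A = (3\uparrow\uparrow\uparrow 3 - 1)\,u - d_T,
\qquad
U_B = 3\uparrow\uparrow\uparrow 3\,(u - d_s),
\]
\[
U_A > U_B \iff 3\uparrow\uparrow\uparrow 3\,d_s > u + d_T .
\]

Since 3^^^3 dwarfs any plausible ratio (u + d_T)/d_s, this naive calculus picks A, the option containing the single unbearable life; whether that verdict survives the escape clause of a painless death at the peak of the torture is exactly what the questions above are probing.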
Experiment C
Psi gives you two options:
A) All of humanity will experience 100 years of happiness/fun/Eudaimonia/whatever your definition of utility is, and then everyone will die.
B) Except for one hundred humans, who will experience 3^^^^^^^3 years of happiness/fun/Eudaimonia/whatever your definition of utility is, all of humanity will experience 100 years of torture, and then everyone will die.
1. What would you do if you were forced to choose between these options?
2. Is your choice compatible with your ethical system, or does it derive from the fear of being unlucky? (A naive tally is sketched after these questions.)
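The same naive bookkeeping, applied to this experiment (again my own framing, not part of Psi’s offer): let N be the number of humans, h the utility of one person-year of happiness, t the disutility of one person-year of torture, and abbreviate the huge number 3^^^^^^^3 as T. Then

\[
U_A = 100\,N\,h,
\qquad
U_B = 100\,T\,h - 100\,(N-100)\,t,
\]
\[
U_B > U_A \iff T\,h > N\,h + (N-100)\,t .
\]

For any realistic N, the right-hand side is negligible next to the left, so the naive total view picks B even though almost everyone is tortured; question 2 asks whether you actually endorse that arithmetic or are simply afraid of being one of the unlucky ones.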
P.S. I am not from an English-speaking country; any advice about style or grammar is appreciated.