Jiro comments on Torture vs. Dust Specks - Less Wrong

Post author: Eliezer_Yudkowsky 30 October 2007 02:50AM

Comment author: Jiro 11 April 2015 05:12:28AM

The specific problem that causes that is that most versions of utilitarianism don't allow the fact that someone desires not to be killed to affect the utility calculation: once they have been killed, they no longer have any utility to count.

Comment author: Wes_W 11 April 2015 05:46:10AM

Yes, this is a failure mode of (some forms of?) utilitarianism, but not the specific weirdness I was trying to get at, which was that if you aggregate by min(), then it's completely morally OK to do very bad things to huge numbers of people - in fact, it's no worse than radically improving huge numbers of lives - as long as you avoid affecting the one person who is worst-off. This is a very silly property for a moral system to have.
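
To make that concrete, here is a minimal sketch of the property, assuming individual welfare can be represented as plain numbers; the population, the `min_welfare` function, and the size of the harm are all made up for illustration:

```python
# Hypothetical utility scores for a small population; the numbers are
# made up for illustration. Person 0 is the worst-off.
population = [-10, 5, 7, 9, 9]

def min_welfare(utilities):
    """Aggregate social welfare as the minimum individual utility."""
    return min(utilities)

# Do something very bad to everyone *except* the worst-off person
# (e.g. kill their dog: a clear harm that bumps nobody to the bottom).
harmed = [population[0]] + [u - 3 for u in population[1:]]

print(min_welfare(population))  # -10
print(min_welfare(harmed))      # -10: the metric registers no change
```

Under sum or average aggregation the second score would drop; under min() it cannot move, so long as the worst-off person's utility stays the lowest.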

You can attempt to dodge this property with too-clever objections, like "aha, but if you kill a happy person, then in the moment of their death they are temporarily the most unhappy person, so you have affected the metric after all". I don't think that actually works, but I didn't want it to obscure the point, so I picked "kill their dog" as an example, because it's a clearly bad thing that definitely doesn't bump anyone to the bottom.