Unnamed comments on Open Thread: June 2010 - Less Wrong

Post author: Morendil | 01 June 2010 06:04PM

Comment author: mkehrt | 02 June 2010 10:19:15PM | 7 points

Forgive me if this is beating a dead horse, or if someone has brought up an equivalent problem before; I didn't see one.

I went through a lot of comments on dust specks vs. torture. (It seems to me that the two sides were miscommunicating in a very specific way, which I may attempt to make clear at some point.) But now I have an example that seems equivalent to DSvsT, is easily understandable via my moral intuition, and gives the "wrong" (i.e., not purely utilitarian) answer.

Suppose I have ten people and a stick. The appropriate infinitely powerful theoretical being offers me a choice: I can hit each of the ten once with the stick, or I can hit one of them nine times. Each hit with the stick has the same constant negative utility for whoever receives it. What do I do?

This seems to me to be exactly dust specks vs. torture scaled down to a humanly intuitable scale. I think the obvious answer is to hit all ten people once. Examining my intuition tells me that this is because I aggregate utility differently across distinct people than across one person's possible futures. Specifically, my intuition tells me to maximize, across people, the minimum expected utility in any individual's future.
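To make the two aggregation rules concrete, here is a minimal sketch. The -1 utility per hit is an arbitrary illustrative assumption; any constant negative value gives the same ordering.

```python
# Sketch: compare two aggregation rules on the stick-hitting choice.
# Assumed for illustration: each hit costs the person hit -1 utility.

def total_utility(hits_per_person):
    """Utilitarian aggregation: sum everyone's utility."""
    return sum(-h for h in hits_per_person)

def maximin_utility(hits_per_person):
    """Maximin aggregation: the utility of the worst-off person."""
    return min(-h for h in hits_per_person)

option_a = [1] * 10       # hit each of the ten people once
option_b = [9] + [0] * 9  # hit one person nine times

print(total_utility(option_a), total_utility(option_b))      # -10 -9: the sum prefers B
print(maximin_utility(option_a), maximin_utility(option_b))  # -1 -9: maximin prefers A
```

The two rules disagree on exactly this scenario: the utilitarian sum says hit one person nine times, while maximin says hit everyone once.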

So, is there a name for this position?

Do people think my example is equivalent to DSvsT?

Do people get the same or different answer with this question as they do with DSvsT?

Comment author: Unnamed | 03 June 2010 03:38:03AM | 7 points

DSvsT was not directly an argument for utilitarianism; it was an argument for tradeoffs and quantitative thinking, and against any kind of rigid rules, sacred values, or qualitative thinking that prevents tradeoffs. For any two things, both of which have some nonzero value, there should be some point where you are willing to trade one off for the other, even if one seems wildly less important than the other (like dust specks compared to torture). Utilitarianism provides a specific answer for where that point is, but the DSvsT post didn't argue for the utilitarian answer, just that the point had to be at fewer than 3^^^3 dust specks. You would probably have to be convinced of utilitarianism as a theory before accepting its exact answer in this particular case.
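A toy calculation shows the bare tradeoff claim. The two disutility figures below are arbitrary assumptions chosen only for illustration; the point is that any finite positive values yield a finite break-even count, and 3^^^3 dwarfs it.

```python
# Toy numbers, chosen only for illustration; any finite positive values work.
speck_disutility = 1e-9   # assumed harm of one dust speck
torture_disutility = 1e7  # assumed harm of the torture

# Under any additive rule there is a finite break-even number of specks
# at which the specks collectively outweigh the torture...
breakeven = torture_disutility / speck_disutility
print(breakeven)  # 1e+16 specks

# ...and 3^^^3 is incomprehensibly larger than any such finite break-even,
# whatever particular numbers you plug in.
```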

The stick-hitting example doesn't challenge the claim about tradeoffs, since most people are willing to trade off one person getting hit multiple times against many people each getting hit once, with the choice depending on the numbers. In a stadium full of 100,000 people, for instance, it seems better for one person to get hit twice than for everyone to get hit once. Your alternative rule (maximin) rules out such tradeoffs entirely, so it leads to implausible conclusions in cases like this 100,000x1 vs. 1x2 example.
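Extending the same toy sketch (again assuming -1 utility per hit, an illustrative number only) shows where maximin goes wrong in the stadium case:

```python
# Stadium case: 100,000 people hit once each vs. one person hit twice.
stadium_a = [-1] * 100_000       # everyone in the stadium hit once
stadium_b = [-2] + [0] * 99_999  # a single person hit twice

print(sum(stadium_a), sum(stadium_b))  # -100000 -2: the sum strongly prefers B
print(min(stadium_a), min(stadium_b))  # -1 -2: maximin still picks A, hitting everyone
```

Because maximin only looks at the worst-off person, it prefers 100,000 hits spread across the crowd to 2 hits concentrated on one person, which is the implausible conclusion.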