lockeandkeynes comments on Torture vs. Dust Specks - Less Wrong

Post author: Eliezer_Yudkowsky 30 October 2007 02:50AM

Comment author: lockeandkeynes 07 July 2010 07:31:12PM 0 points

I'd gladly get a speck of dust in my eye as many times as I can, and I'm sure those 3^^^3 people would join me, to keep one guy from being tortured for 50 years.

Comment author: Vladimir_Nesov 07 July 2010 07:34:49PM 4 points

Maybe you will indeed, but should you?

Comment author: RobinZ 07 July 2010 08:01:46PM 0 points

Suppose some fraction of the 3^^^3 dropped out. How many dust specks would you be willing to take? Two? Ten? A thousand? A million? A billion? That's half a millimeter in diameter, now, and we're only at 10^9. How about 10^12? 10^15? 10^18? We're around half a meter in diameter now, approaching or exceeding the size of a football, and we've not even reached 3^^4 - and remember that 3^^^3 is 3^^3^^3 = 3^^7,625,597,484,987.
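The figures above are self-consistent if you read them as gathering all the specks into a single ball, so that the ball's diameter grows as the cube root of the speck count. A minimal Python sketch of that arithmetic, assuming each speck is roughly a 0.5 micron sphere (the value that makes the half-millimeter and half-meter figures come out); the names up_arrow and ball_diameter and the speck size are illustrative choices, not anything from the thread:

```python
import math

def up_arrow(a, n, b):
    """Knuth's up-arrow a ^(n arrows) b; n = 1 is ordinary exponentiation."""
    if n == 1:
        return a ** b
    result = 1
    for _ in range(b):  # a^^b is a tower of b copies of a, and so on upward
        result = up_arrow(a, n - 1, result)
    return result

print(up_arrow(3, 2, 3))  # 3^^3 = 3^27 = 7,625,597,484,987
# 3^^4 = 3^(3^27) and 3^^^3 = 3^^7,625,597,484,987 are far too large to evaluate.

def ball_diameter(n_specks, speck_diameter=0.5e-6):
    """Diameter in meters of one ball holding the volume of n_specks spheres."""
    total_volume = n_specks * (math.pi / 6) * speck_diameter ** 3
    return (6 * total_volume / math.pi) ** (1 / 3)

print(ball_diameter(10**9))   # ~0.0005 m: half a millimeter
print(ball_diameter(10**18))  # ~0.5 m: around football size or beyond
```

Note that the diameter grows only as the cube root of the count, which is why a billion-fold increase in specks multiplies the ball's width by just a thousand.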

What, you think that all of the 3^^^3 will go for it? All of them, chipping in to save one person who was getting 50 years of torture? In a universe with 3^^^3 people in it, how many people do you think are being tortured? Our planet has had around 10^11 human beings in history. If we say that only one of those 10^11 people was ever tortured for 50 years in history - or even that there was only a one-in-a-thousand chance of that, making the rate one in 10^14 - how many people would be tortured for 50 years among the more than 3^^^3 we are positing? And do you think that all 3^^^3 will choose the same one you did?
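To see how little the one-in-10^14 rate matters at this scale: 3^^^3 itself is uncomputable, but even the far smaller 3^^4 = 3^(3^27) has about 3.6 trillion digits, and dividing by 10^14 removes only 14 of them. A rough sketch, again with illustrative variable names; the 10^11 and one-in-a-thousand figures are from the comment above:

```python
import math

# Implied torture rate: 1 victim per 10^11 humans in history, further
# discounted by a one-in-a-thousand chance, giving one in 10^14.
torture_rate = (1 / 1e11) * (1 / 1e3)

# Digit count of 3^^4 = 3^(3^27): its log10 is 3^27 * log10(3).
digits_3_up_up_4 = 3**27 * math.log10(3)  # about 3.6e12 digits

# Dividing a number by 10^14 shaves just 14 digits off; a rounding error here.
print(digits_3_up_up_4, digits_3_up_up_4 - 14)
```

So 3^^^3/10^14 expected victims is, for every practical purpose, still 3^^^3 victims, which is the force of the 3^^^3/10^14 figure later in the thread.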

Would you consider that, perhaps, one dust speck is a bit much to pay to save one part in 3^^^3 of a victim?

Comment author: Vladimir_Nesov 07 July 2010 08:22:00PM 2 points

Would you consider that, perhaps, one dust speck is a bit much to pay to save one part in 3^^^3 of a victim?

When multiple agents coordinate, their decision delivers the whole outcome, not a part of it. Whatever you decide, everyone who reasons similarly will decide the same way. Thus, you have absolute control over which outcome to bring about, even if you are only one of a gazillion like-minded voters.

Here, you decide whether to save one person at the cost of harming 3^^^3 people. This is not equivalent to saving 1/3^^^3 of a person at the cost of harming one person, because saving 1/3^^^3 of a person is not something that could actually happen; it is at best a utilitarian simplification, which you must make explicit and not confuse with a decision-theoretic construction.

Comment author: RobinZ 07 July 2010 10:14:10PM 0 points

If it were a one-shot deal with no cheaper alternative, I could see agreeing. But that still leaves the other 3^^^3/10^14 victims, and this won't scale to deal with those.

Comment author: Nick_Tarleton 08 July 2010 12:04:39AM 0 points

This seems to work nearly as well for any harm less than being tortured for 50 years — say, being tortured for 25 years.

Comment author: cousin_it 08 July 2010 12:07:12AM 3 points

I wouldn't volunteer for 25 years of torture to save a random person from 50. A relative, maybe.