Vladimir_Nesov comments on Revisiting torture vs. dust specks - Less Wrong

5 [deleted] 08 July 2009 11:04AM


Comment author: RobinZ 09 July 2009 12:22:13PM -1 points [-]

It seems to me that the idea of a critical threshold of suffering might be relevant. Most dust-speckers seem to maintain that a dust speck is always a negligible effect - a momentary discomfort that is immediately forgotten - but in a sufficiently large group of randomly selected people, a low-probability situation in which a dust speck is critical could arise. For example, the dust speck could be a distraction while operating a moving vehicle, leading to a crash. Or the dust speck could be an additional frustration to an individual already deeply frustrated, leading to an outburst. Each conditional in these hypotheticals is improbable, but the product of their improbabilities is surely nowhere near as small as 1 in 3^^^3, which means that among 3^^^3 people it is highly likely that many such incidents will occur. Under this interpretation, the torture is the obvious winner.
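The expected-count reasoning above can be sketched numerically. The specific conditional probabilities below are illustrative assumptions (the comment gives none), and 3^^^3 is far too large to represent, so a vastly smaller stand-in population is used - which only understates the effect:

```python
from fractions import Fraction

# Illustrative (assumed) conditional probabilities for one cascade:
# speck arrives while driving, driver is distracted, distraction causes a crash.
conditionals = [Fraction(1, 100), Fraction(1, 1000), Fraction(1, 10000)]

p_cascade = Fraction(1, 1)
for p in conditionals:
    p_cascade *= p  # combined probability of the full cascade: 1 in 10**9

# 3^^^3 dwarfs 10**100; using 10**100 here is a vast understatement.
population = 10**100

expected_crashes = population * p_cascade  # linearity of expectation
print(expected_crashes > 10**80)  # prints True: still astronomically many incidents
```

Even with the improbabilities multiplied out, the expected number of critical incidents remains enormous, which is the point of the comment.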

If cascading consequences are ruled out, however, I'll have to think some more.

Comment author: Vladimir_Nesov 09 July 2009 12:34:11PM *  0 points [-]

When you, personally, decide between your future containing a dust speck at an unknown moment and some alternative, the value of that dust speck won't be significantly affected by the probability of it causing trouble, provided that probability is low enough.

You could replace a dust speck with a 1 in 3^^^3/1000 probability of being tortured for 50 years, so that it's a choice between 3^^^3 people each having a 1 in 3^^^3/1000 probability of being tortured, and one person being tortured with certainty - or, derandomizing, a choice between 1000 people tortured and one person tortured. That one person had better be really special for the proximity effect to elevate them above all those other people.
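The derandomizing step can be checked with expected values: if each of N people independently faces a 1 in N/1000 (i.e. 1000/N) chance of torture, the expected number tortured is exactly 1000, which is the deterministic scenario being compared against one certain torture. A minimal sketch, using a stand-in N since 3^^^3 itself is incomputable:

```python
from fractions import Fraction

N = 10**30             # stand-in for 3^^^3, which is far too large to represent
p = Fraction(1000, N)  # "1 in N/1000" chance of torture per person

expected_tortured = N * p  # linearity of expectation over N people
print(expected_tortured)   # prints 1000 - the "derandomized" comparison
```

The result is independent of N, so the substitution of a huge stand-in for 3^^^3 does not affect the arithmetic.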

Comment author: cousin_it 09 July 2009 02:56:57PM *  0 points [-]

The proximity effect, as described in the post, makes your "derandomizing" step invalid.

Comment author: Vladimir_Nesov 09 July 2009 03:06:01PM *  0 points [-]

It can't be invalid: just replace the initial rule with this one: of all 3^^^3 people, a random selection of 1000 will be made who are to be tortured. Given this rule, each individual has about a 1 in 3^^^3/1000 probability of being selected for torture, which is presumably an even better deal than a certain speck. This is compared to choosing one person to be tortured with certainty. The proximity effect may say that those 1000 people are far away and so of little importance, which I mentioned in the comment above. I don't think the choice of saving one known person over a thousand ridiculously-far-away people is necessarily incorrect, though.

Comment author: cousin_it 09 July 2009 04:35:00PM 0 points [-]

Yes, this way is correct. I thought you implied the 1000 people were close, not far away.

Comment author: RobinZ 09 July 2009 01:20:25PM 0 points [-]

Sure, makes sense. I imagine the probability of the consequences I'm hypothesizing is much less than 1 in 3^^^3/1000, though, which makes the dust specks still worse.