Alicorn comments on Revisiting torture vs. dust specks - Less Wrong

5 [deleted] 08 July 2009 11:04AM


Comment author: Alicorn 08 July 2009 03:28:01PM 5 points [-]

I prefer dust specks because I insist on counting people one at a time. I think it's obvious that any single person presented with the opportunity to save someone else from fifty years of torture by experiencing a dust speck in the eye ought to do so. Any of those 3^^^3 people who would not voluntarily do so, I don't have enough sympathy for such individuals to step in on their behalf and spare them the dust speck.

I haven't yet worked out a good way to draw the line in the escalation scenario, since I suspect that "whatever level of discomfort I, personally, wouldn't voluntarily experience to save some random person from 50 years of torture" is unlikely to be the right answer.

Comment author: cousin_it 08 July 2009 04:52:34PM *  1 point [-]

I think it's obvious that any single person presented with the opportunity to save someone else from fifty years of torture by experiencing a dust speck in the eye ought to do so.

Woah.

How'd they end up responsible for the choice you make? You can't have it both ways, that's not how it works.

Comment author: Alicorn 08 July 2009 04:55:24PM 0 points [-]

My (unfinished, don't ask for too much detail) ethical theory is based on rights, which can be waived by the would-be victim of an act that would otherwise be a rights violation. So in principle, if I could poll the 3^^^3 people, I would expect them to waive the right not to experience the dust specks. They aren't responsible for what I do, but my expectations of their dispositions about my choice inform that choice.

Comment author: cousin_it 08 July 2009 05:08:55PM *  3 points [-]

Then the "real-world analogy" point in the post prompts me to ask a fun question: do you consider yourself entitled to rob everyone else of one penny to save one starving African child? Because if someone would have refused to pay up, you "don't have enough sympathy for such individuals" and take the penny anyway.

Comment author: Alicorn 08 July 2009 05:16:13PM 1 point [-]

Changing the example to one that involves money does wacky things to my intuitions, especially since many people live in situations where a penny is not a trivial amount of money (whereas I take it that a dust speck in the eye is pretty much commensurate for everybody), and since there are probably less expensive ways to save lives (so unlike the purely stipulated tradeoff of the dust speck/torture situation, I do not need a penny from everyone to save the starving child).

Comment author: Vladimir_Nesov 08 July 2009 07:02:09PM *  1 point [-]

You are not placing the question in the least convenient possible world.

Comment author: Alicorn 08 July 2009 07:18:36PM -1 points [-]

In the least convenient possible world: I take it that in this case, that world is the one where wealth is distributed equally enough that one penny means the same amount to everybody, and every cheaper opportunity to save a life has already been taken advantage of.

Why would a world that looked like that have a starving African child? If we all have X dollars, so a penny is worth the same to everyone, then doesn't the starving African child also have X dollars? If he does, and X dollars won't buy him dinner, then there just must not be any food in his region (because it doesn't make any sense for people to sell food at a price that literally no one can afford, and everybody only has X dollars) - so X dollars plus (population × 1¢) probably wouldn't help him either.

Perhaps you had a different inconvenient possible world in mind; can you describe it for me?

Comment author: Vladimir_Nesov 08 July 2009 07:26:34PM -1 points [-]

One where the African child really does need that cent.

Comment author: Alicorn 08 July 2009 07:32:08PM *  1 point [-]

I'm afraid that isn't enough detail for me to understand the question you'd like me to answer.

Comment author: Vladimir_Nesov 08 July 2009 07:41:03PM *  0 points [-]

How's that possible? The question is this: there are, say, a trillion people, each with exactly one cent to give away. If almost every one of them parts with their cent, one life gets saved; otherwise one life is lost. Each of these people can either give up their cent voluntarily, or you, personally, can rob them of that cent (say, you can implement some worldwide policy to do that in bulk). Do you consider it the right choice to rob every one of these people who refuses to pay up?
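For scale, the sums in this hypothetical are easy to check. The trillion-person population and one-cent contribution are stipulations from the comment above; the sketch below is just the arithmetic, worked in integer cents to avoid floating-point noise:

```python
# Scale of the hypothetical: a trillion people, one cent each (both
# figures stipulated in the comment above, not assumptions of mine).
population = 10**12   # one trillion people
cents_each = 1        # one cent per person

total_cents = population * cents_each
total_dollars = total_cents // 100
print(total_dollars)  # 10000000000 -> ten billion dollars
```

Which is part of what makes the hypothetical inconvenient: the pooled pennies come to ten billion dollars, pitted against a single life.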

Comment author: cousin_it 08 July 2009 07:53:19PM *  0 points [-]

Thanks! It seems my question wasn't very relevant to the original dilemma. I vaguely recall arguing with you about your ethical theory some months ago, so let's not go there; but when you eventually finish that stuff, please post it here so we can all take a stab.

Comment author: eirenicon 22 July 2009 07:45:30PM 0 points [-]

So you are able to choose dust specks based on your prediction that the 3^^^3 people will waive their rights. However, you "don't have sympathy" for anyone who actually doesn't. Therefore, you are willing to violate the rights of anyone who does not comply with your predicted ethical conclusion. What, then, if all 3^^^3 people refuse to waive their rights? Then you aren't just putting a dust speck into the eyes of 3^^^3 people, you're also violating their rights by your own admission. Doesn't that imply a further compounding of disutility?

I don't see how your ethical theory can possibly function if those who refuse to waive their rights have them stripped away as a consequence.

Comment author: Vladimir_Nesov 08 July 2009 04:17:32PM *  -1 points [-]

I prefer dust specks because I insist on counting people one at a time. I think it's obvious that any single person presented with the opportunity to save someone else from fifty years of torture by experiencing a dust speck in the eye ought to do so.

This is defection, a suboptimal strategy. Each person in isolation prefers to defect in the Prisoner's Dilemma.

Any of those 3^^^3 people who would not voluntarily do so, I don't have enough sympathy for such individuals to step in on their behalf and spare them the dust speck.

And this is preference for fuzzies over utility, inability to shut up and multiply.
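The "shut up and multiply" calculation can be sketched numerically. The exchange rate below is a made-up placeholder (nothing in the thread fixes one); the point is only that aggregate speck-disutility grows linearly with headcount, so any finite rate is eventually overwhelmed:

```python
# Placeholder assumption: suppose 50 years of torture is as bad as
# 10**16 dust specks. This exchange rate is invented for illustration.
TORTURE_IN_SPECKS = 10**16

def specks_outweigh_torture(headcount):
    """Linear aggregation: total disutility = headcount * (one speck)."""
    return headcount > TORTURE_IN_SPECKS

# A trillion people is not enough under this particular rate...
print(specks_outweigh_torture(10**12))  # False
# ...but 3^^^3 is a power tower of 3s whose *height* is already
# 3^^3 = 3**27, and the full number exceeds any finite exchange rate.
print(3**27)  # 7625597484987
```

Whatever finite number of specks one deems equivalent to the torture, 3^^^3 exceeds it; that is the multiplication the comment above accuses the speck-choosers of refusing to do.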

Comment author: thomblake 08 July 2009 04:30:18PM -2 points [-]

And this is preference for fuzzies over utility, inability to shut up and multiply.

If this is true, then by reductio, preference for utility is incorrect.

Comment author: Peter_de_Blanc 09 July 2009 07:17:48AM 0 points [-]

By the same argument (i.e. refusing to multiply), wouldn't it also be better to torture 100 people for 49 years than to torture one person for 50 years?

Comment author: Alicorn 09 July 2009 03:08:46PM 2 points [-]

I did say:

I haven't yet worked out a good way to draw the line in the escalation scenario, since I suspect that "whatever level of discomfort I, personally, wouldn't voluntarily experience to save some random person from 50 years of torture" is unlikely to be the right answer.

The scenario you present is among those I have no suitable answer for, for this reason. However, I lean towards preferring the 50 years of torture for one person over 49 years for 100.

Comment author: Vladimir_Nesov 09 July 2009 10:17:00AM 2 points [-]

Not if each of them considers it a wrong choice. Refusing to multiply goes both ways, and no math can debate this choice: whatever thought experiment you present, an intuitive response will just be stamped on top and given as a reply.