Taurus_Londono comments on Circular Altruism - Less Wrong

Post author: Eliezer_Yudkowsky 22 January 2008 06:00PM

Comment author: Taurus_Londono 29 November 2013 04:28:43PM 2 points

Raise your hand if you (yes you, the person reading this) will submit to 50 years of torture in order to avert a "least bad" dust speck momentarily finding its way into the eyes of an unimaginably large number of people.

Why was it not written "I, Eliezer Yudkowsky, should choose to submit to 50 years of torture in place of a googolplex people getting dust specks in their eyes"?

Why restrict yourself to the comforting distance of omniscience?

Did Miyamoto Musashi ever exhort the reader to ask his sword what he should want? Why is this not a case of using a tool as an end in and of itself rather than as a means to achieve a desired end?

Are you irrational if your "something to protect" is yourself... from torture?

Has anyone ever addressed whether this applies to the AGI Utility Monster, whose experiential capacity would presumably exceed that of the ~7 billion humans who should rationally subserve Its interests (whatever they may be)?

Comment author: TheOtherDave 29 November 2013 06:21:47PM 3 points

I would not submit to 50 years of torture to avert dust specks in the eyes of lots of people.
I suspect I also would not submit to 50 years of torture to avert a stranger being subjected to 55 years of torture.
It's not clear to me what, if anything, I should infer from this.

Comment author: hyporational 30 November 2013 07:18:38AM 1 point

Ready the tar and feathers, but I wouldn't submit myself to even 1 year of torture to avert a stranger being tortured for 50 years, if no terrible social repercussions could be expected.

Comment author: TheOtherDave 30 November 2013 03:40:42PM 0 points

Yup. I suspect that's true of the overwhelming majority of people. It's most likely true of me.

Comment author: [deleted] 30 November 2013 09:37:04AM 2 points

That you value yourself more than a stranger. (I don't think there's anything wrong with that, BTW, so long as it doesn't mean you'd defect in a Prisoner's Dilemma against them.)

Comment author: TheOtherDave 30 November 2013 03:43:47PM 0 points

Sure. Sorry; what I meant was that it's not clear what I should infer from this about the relative harmfulness of 50 years of torture, 55 years of torture, and Dust Specks.

Mostly, it seems to imply that "would I choose A over B?" doesn't necessarily have much to do with the relative harmfulness of A and B to the system as a whole.

Comment author: ArisKatsaris 29 November 2013 08:10:27PM 4 points

I suffer under no delusion that I'm a morally perfect individual.

You seem to believe that to identify the morally correct path, one must also be willing to follow it. Morality pushes our wills in that direction, but selfishness has its own role to play, and here it pushes elsewhere.

But yes, I am willing to say that I should submit to 50 years of torture in order to save 3^^^3 people from getting dust specks in their eyes. I'll also openly admit that I am not willing to submit to such. This is not contradictory: "should" is a moral judgment, but being willing to be moral at such a high cost is another thing entirely.
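
As a minimal sketch of the aggregative arithmetic behind this "should" (assuming, as the original post does, that harms add across persons; the symbols ε and H are introduced here for illustration):

\[ 3\uparrow\uparrow\uparrow 3 \cdot \varepsilon \;>\; H \quad \text{whenever} \quad 3\uparrow\uparrow\uparrow 3 > H/\varepsilon, \]

where ε is the (nonzero) disutility of one dust speck and H the total disutility of 50 years of torture. Since no humanly meaningful ratio H/ε comes anywhere near 3↑↑↑3, the aggregate speck harm dominates on this assumption.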

Comment author: hyporational 30 November 2013 07:15:25AM 0 points

Why was it not written "I, Eliezer Yudkowsky, should choose to submit to 50 years of torture in place of a googolplex people getting dust specks in their eyes"?

Because then it's clearly not the same argument anymore; it would appeal only to people who subscribe to an even narrower form of incredibly altruistic utilitarianism, people who, statistically speaking, I suspect don't even exist. If the person chosen for torture were random, it would make a bit more sense, but it would essentially be the same argument, given the ridiculously high numbers involved.
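
For scale, a rough magnitude check on the two numbers used in this thread (the up-arrow notation is Knuth's, as in the original post):

\[ \text{googolplex} = 10^{10^{100}}, \qquad 3\uparrow\uparrow\uparrow 3 = 3\uparrow\uparrow(3\uparrow\uparrow 3) = 3\uparrow\uparrow 7{,}625{,}597{,}484{,}987, \]

i.e. a power tower of 3s roughly 7.6 trillion levels high. A tower of just five 3s already exceeds a googolplex, so swapping one number for the other changes the scale of the thought experiment but not its structure.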