Salivanth comments on Circular Altruism - Less Wrong

Post author: Eliezer_Yudkowsky 22 January 2008 06:00PM


Comment author: Hul-Gil 01 May 2012 06:22:43PM (10 points)

Well, he didn't actually identify dust mote disutility as zero; he says that dust motes register as zero on his torture scale. He goes on to mention that torture isn't on his dust-mote scale, so he isn't just using "torture scale" as a synonym for "disutility scale"; rather, he is emphasizing that there is more than just a single "(dis)utility scale" involved. I believe his contention is that the events (torture and dust-mote-in-the-eye) are fundamentally different in terms of "how the mind experiences and deals with [them]", such that no amount of dust motes can add up to the experience of torture... even if they (the motes) have a nonzero amount of disutility.

I believe I am making much the same distinction with my separation of disutility into trivial and non-trivial categories, where no amount of trivial disutility across multiple people can sum to the experience of non-trivial disutility. There is a fundamental gap in the scale (or different scales altogether, à la Jones), a difference in how different amounts of disutility work for humans. For a more concrete example of how this might work, suppose I steal one cent each from one billion different people, and Eliezer steals $100,000 from one person. The total amount of money I have stolen is greater than the amount that Eliezer has stolen; yet my victims will probably never even realize their loss, whereas the loss of $100,000 for one individual is significant. A cent does have a nonzero amount of purchasing power, but none of my victims has actually lost the ability to purchase anything, whereas Eliezer's victim has lost the ability to purchase many, many things.
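
To put rough numbers on that, here is a throwaway calculation; the figures are just the ones given above, and the only point is that the aggregate loss and the per-victim loss come apart:

```python
# Illustrative arithmetic for the theft example; figures are taken from the text above.
victims = 1_000_000_000
loss_per_victim = 0.01           # one cent each

my_total = victims * loss_per_victim    # $10,000,000 in aggregate
eliezers_total = 100_000                # $100,000 taken from a single person

print(my_total > eliezers_total)        # True: I have stolen more in total...
print(loss_per_victim, eliezers_total)  # ...but each of my victims is out $0.01,
                                        # while Eliezer's victim is out $100,000
```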

I believe utility for humans works in the same manner. Another thought experiment I found helpful is to imagine a certain amount of disutility, x, being experienced by one person. Let's suppose x is "being brutally tortured for a week straight". Call this situation A. Now divide this disutility among people until we have y people all experiencing (1/y)*x disutility - say, a dust speck in the eye each. Call this situation B. If we can add up disutility like Eliezer supposes in the main article, the total amount of disutility in either situation is the same. But now, ask yourself: which situation would you choose to bring about, if you were forced to pick one?
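
Here is a minimal sketch of the naive additive view at issue, using made-up numbers and exact rationals so the equality is not a rounding artifact:

```python
from fractions import Fraction

x = Fraction(1)       # disutility of situation A (one person tortured); units are arbitrary
y = 1_000_000         # number of people in situation B

per_person = x / y    # each person experiences (1/y)*x disutility, e.g. one dust speck
total_A = x
total_B = y * per_person

print(total_A == total_B)   # True: simple addition calls the two situations equivalent,
                            # which is exactly the conclusion being disputed here
```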

Would you just flip a coin?

I believe few, if any, would choose situation A. This brings me to a final point I've been wanting to make about this article, but have never gotten around to making until now. Mr. Yudkowsky often defines rationality as winning - a reasonable definition, I think. But with this dust speck scenario, if we accept Mr. Yudkowsky's reasoning and choose the one-person-being-tortured option, we end up with a situation in which every participant would rather that the other option had been chosen! Certainly the individual being tortured would prefer that, and each potentially dust-specked individual* would gladly agree to experience an instant of dust-speckiness in order to save the former individual.

I don't think this is winning; no one is happier with this situation. Like Eliezer says in reference to Newcomb's problem, if rationality seems to be telling us to go with the choice that results in losing, perhaps we need to take another look at what we're calling rationality.


*Well, assuming a population like our own, not every single individual would agree to experience a dust speck in the eye to save the to-be-tortured individual; but I think it is clear that the vast majority would.

Comment author: Salivanth 25 May 2012 12:50:53PM (1 point)

You might be right. I'll have to think about this, and reconsider my stance. One billion is obviously far less than 3^^^3, but you are right that I would prefer the 10 million dollars stolen by you to the 100,000 dollars stolen by Eliezer. I also consider losing 100,000 dollars to be at most 100,000 times as bad as losing one dollar. This indicates one of two things:

A) My utility system is deeply flawed.

B) My utility system includes some sort of 'diffusion factor' wherein a disutility of X becomes <X when divided among several people, and the disutility becomes lower the more people it is divided among. In essence, there is some disutility in one person suffering a lot of disutility that isn't there when it is divided among a lot of people. (A toy sketch of what this could look like is given at the end of this comment.)

Of these, B seems more likely, and I didn't take it into account when considering torture vs. dust specks. In any case, some introspection on this should help me further define my utility function, so thanks for giving me something to think about.
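
For what it's worth, option B can be given a toy mathematical shape. The following is only a hypothetical sketch: the function name, the exponent form, and the numbers are all assumptions made for illustration, since nothing in the thread pins down how a 'diffusion factor' would actually behave.

```python
# Hypothetical "diffusion factor" model (option B above); purely illustrative.
def diffused_disutility(raw_total: float, num_people: int, diffusion: float = 0.5) -> float:
    """Weight given to a raw disutility `raw_total` spread evenly over `num_people`.

    diffusion = 0 recovers plain additive summation; larger values discount harm
    more steeply as it is spread thinner. The functional form is an arbitrary choice.
    """
    return raw_total / (num_people ** diffusion)

print(diffused_disutility(100.0, 1))           # 100.0 -- one person bears it all
print(diffused_disutility(100.0, 1_000_000))   # 0.1   -- same raw total, far less weight
```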