Eliezer_Yudkowsky comments on The "Intuitions" Behind "Utilitarianism" - Less Wrong

Post author: Eliezer_Yudkowsky 28 January 2008 04:29PM


Comment author: Eliezer_Yudkowsky 28 January 2008 06:02:24PM 13 points [-]

Eliezer, to be clear, do you still think that 3^^^3 people having momentary eye irritations from dust specks is worth torturing a single person for 50 years, or is there a possibility that you did the math incorrectly for that example?

No. I used a number large enough to make math unnecessary.

I specified the dust specks had no distant consequences (no car crashes etc.) in the original puzzle.

Unless the torture somehow causes Vast consequences larger than the observable universe, or the suicide of someone who otherwise would have been literally immortal, it doesn't matter whether the torture has distant consequences or not.

I confess I didn't think of the suicide one, but I was very careful to choose an example that didn't involve actually killing anyone, because there someone was bound to point out that there was a greater-than-tiny probability that literal immortality is possible and would otherwise be available to that person.

So I will specify only that the torture does not have any lasting consequences larger than a moderately sized galaxy, and then I'm done. Nothing bound by lightspeed limits in our material universe can morally outweigh 3^^^3 of anything noticeable. You'd have to leave our physics to do it.

You know how some people's brains toss out the numbers? Well, when you're dealing with a number like 3^^^3 in a thought experiment, you can toss out the event descriptions. If the thing being multiplied by 3^^^3 is good, it wins. If the thing being multiplied by 3^^^3 is bad, it loses. Period. End of discussion. There are no natural utility differences that large.
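The scale Eliezer is leaning on here comes from Knuth's up-arrow notation, where 3^^^3 = 3^^(3^^3). A minimal sketch of the recursion, which can only evaluate tiny cases since 3^^^3 itself is far beyond any computer:

```python
# Knuth's up-arrow notation: a ↑^n b. One arrow is ordinary exponentiation;
# each extra arrow iterates the previous operation. Only small inputs are
# computable; the point is how fast the recursion explodes.

def up_arrow(a, n, b):
    """Compute a ↑^n b (n up-arrows)."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 1, 3))  # 3^3 = 27
print(up_arrow(3, 2, 2))  # 3^^2 = 3^3 = 27
print(up_arrow(3, 2, 3))  # 3^^3 = 3^27 = 7625597484987
# 3^^^3 = 3^^(3^^3): a power tower of 3s roughly 7.6 trillion levels high.
```

Even 3^^4, one step up the tower, already has trillions of digits, which is why any finite per-person utility multiplied by 3^^^3 swamps every ordinary quantity.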

Comment author: bgaesop 09 January 2011 08:16:31AM *  3 points [-]

I really don't see why I can't say "the negative utility of a dust speck is 1 over Graham's Number." or "I am not obligated to have my utility function make sense in contexts like those involving 3^^^^3 participants, because my utility function is intended to be used in This World, and that number is a physical impossibility in This World."

As a separate response, what's wrong with this calculation: I base my judgments largely on the duration of the disutility. After 1 second, the dust specks disappear and are forgotten, and so their disutility also disappears. The same is not true of the torture; the torture is therefore worse. I can foresee some possible problems with this line of thought, but it's 2:30 am in New Orleans and I just got done with a long evening of drinking and Joint Mathematics Meeting, so please forgive me if I don't attempt to formalize it now.

An addendum: 2 more things. The difference between a life with n dust specks hitting your eye and n+1 dust specks is not worth considering, given how large n is in any real life. Furthermore, if we allow for possible immortality, n could literally be infinity, so the difference would be literally 0.

Secondly, by virtue of your asserting that there exists an action with minimal disutility, you've shown that the Field of Utility is very different from the field of, say, the Real numbers, and so I am incredulous that we can simply "multiply" in the usual sense.

Comment author: kaz 19 August 2011 01:00:48AM *  8 points [-]

I really don't see why I can't say "the negative utility of a dust speck is 1 over Graham's Number."

You can say anything, but Graham's number is very large; if the disutility of an air molecule slamming into your eye were 1 over Graham's number, enough air pressure to kill you would have negligible disutility.
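kaz's reductio can be checked in logarithms. The numbers below are illustrative assumptions: Graham's number itself is far too large to represent, so 3^^4 = 3^(3^27), which is itself unimaginably smaller than Graham's number, stands in as a lower bound, and ~10^44 is a rough guess at the number of molecular impacts in a lethal blast of air:

```python
import math

# Log-scale sketch: if each molecular impact carried disutility 1/G, even a
# lethal number of impacts would sum to essentially zero disutility.
# Stand-in lower bound for G: 3^^4 = 3^(3^27).
log10_tower = 3**27 * math.log10(3)   # log10(3^^4), about 3.6e12
log10_molecules = 44                  # assumed ~1e44 impacts (illustrative)
log10_total = log10_molecules - log10_tower
# Total disutility would be 10**log10_total: ten to the minus 3.6 trillion.
print(log10_total)
```

Since the true Graham's number dwarfs 3^^4, the real total would be smaller still, which is the absurdity kaz is pointing at.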

or "I am not obligated to have my utility function make sense in contexts like those involving 3^^^^3 participants, because my utility function is intended to be used in This World, and that number is a physical impossibility in This World."

If your utility function ceases to correspond to utility at extreme values, isn't it more of an approximation of utility than actual utility? Sure, you don't need a model that works at the extremes - but when a model does hold for extreme values, that's generally a good sign for the accuracy of the model.

An addendum: 2 more things. The difference between a life with n dust specks hitting your eye and n+1 dust specks is not worth considering, given how large n is in any real life. Furthermore, if we allow for possible immortality, n could literally be infinity, so the difference would be literally 0.

If utility is to be compared relative to lifetime utility, i.e. as ((LifetimeUtility + x) / LifetimeUtility), doesn't that assign higher impact to five seconds of pain for a twenty-year-old who will die at 40 than to a twenty-year-old who will die at 120? Does that make sense?
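The objection can be made concrete with made-up numbers, reading the comparison as (LifetimeUtility + x) / LifetimeUtility and assuming one utility unit per year of life:

```python
# Illustrative sketch of the lifetime-relative comparison kaz questions.
# Assumptions: 1 utility unit per year lived; the same five seconds of
# pain costs x = -0.1 in either life.
def relative_impact(x, lifespan_years):
    lifetime_utility = float(lifespan_years)          # assumption: 1 unit/year
    return (lifetime_utility + x) / lifetime_utility  # pain scaled to the whole life

dies_at_40 = relative_impact(-0.1, 40)    # 39.9 / 40 = 0.99750
dies_at_120 = relative_impact(-0.1, 120)  # 119.9 / 120 ≈ 0.99917
print(dies_at_40, dies_at_120)
```

The shorter life deviates from 1.0 three times as much, so the identical pain "counts" three times more for the person who dies at 40, which is the oddity being flagged.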

Secondly, by virtue of your asserting that there exists an action with minimal disutility, you've shown that the Field of Utility is very different from the field of, say, the Real numbers, and so I am incredulous that we can simply "multiply" in the usual sense.

Eliezer's point does not seem to me predicated on the existence of such a value; I see no need to assume multiplication has been broken.

Comment author: bgaesop 22 August 2011 07:32:51AM 2 points [-]

if the disutility of an air molecule slamming into your eye were 1 over Graham's number, enough air pressure to kill you would have negligible disutility.

Yes, this seems like a good argument that we can't add up disutility for things like "being bumped into by particle type X" linearly. In fact, it seems like having 1, or even (whatever large number I breathe in a day) molecules of air bumping into me is a good thing, and so we can't just talk about things like "the disutility of being bumped into by kinds of particles".

If your utility function ceases to correspond to utility at extreme values, isn't it more of an approximation of utility than actual utility?

Yeah, of course. Why, do you know of some way to accurately access someone's actually-existing Utility Function in a way that doesn't just produce an approximation of an idealization of how ape brains work? Because me, I'm sitting over here using an ape brain to model itself, and this particular ape doesn't even really expect to leave this planet or encounter or affect more than a few billion people, much less 3^^^3. So it's totally fine using something accurate to a few significant figures, trying to minimize errors that would have noticeable effects on these scales.

Sure, you don't need a model that works at the extremes - but when a model does hold for extreme values, that's generally a good sign for the accuracy of the model.

Yes, I agree. Given that your model is failing at these extreme values and telling you to torture people instead of blink, I think that's a bad sign for your model.

doesn't that assign higher impact to five seconds of pain for a twenty-year old who will die at 40 than to a twenty-year old who will die at 120? Does that make sense?

Yeah, absolutely, I definitely agree with that.

Comment author: kaz 26 August 2011 01:58:46AM 0 points [-]

Given that your model is failing at these extreme values and telling you to torture people instead of blink, I think that's a bad sign for your model.

That would be failing, but 3^^^3 people blinking != you blinking. You just don't comprehend the size of 3^^^3.

Yeah, absolutely, I definitely agree with that.

Well, it's self-evident that that's silly. So, there's that.

Comment author: Douglas_Reay 24 February 2012 01:51:08AM 4 points [-]

Unless the torture somehow causes Vast consequences larger than the observable universe, or the suicide of someone who otherwise would have been literally immortal, it doesn't matter whether the torture has distant consequences or not.

What about the consequences of the precedent set by the person making the decision that it is ok to torture an innocent person, in such circumstances? If such actions get officially endorsed as being moral, isn't that going to have consequences which mean the torture won't be a one-off event?

There's a rather good short story about this, by Ursula K LeGuin:

The Ones Who Walk Away From Omelas

Comment author: gwern 24 February 2012 02:32:24AM 4 points [-]

If such actions get officially endorsed as being moral, isn't that going to have consequences which mean the torture won't be a one-off event?

Why would it?

And I don't think LeGuin's story is good - it's classic LeGuin, by which I mean enthymematic, question-begging, emotive substitution for thought, which annoyed me so much that I wrote my own reply.

Comment author: Alicorn 24 February 2012 03:46:42AM 6 points [-]

I've read your story three times now and still don't know what's going on in it. Can I have it in the form of an explanation instead of a story?

Comment author: gwern 24 February 2012 03:58:59AM 0 points [-]

Sure, but you'll first have to provide an explanation of LeGuin's.

Comment author: Alicorn 24 February 2012 04:16:34AM *  2 points [-]

There is this habitation called Omelas in which things are pretty swell for everybody except one kid who is kept in lousy conditions; by unspecified mechanism this is necessary for things to be pretty swell for everybody else in Omelas. Residents are told about the kid when they are old enough. Some of them do not approve of the arrangement and emigrate.

Something of this form about your story will do.

Comment author: gwern 24 February 2012 04:21:52AM 1 point [-]

There is this city called Acre where things are pretty swell except for this one guy who has a lousy job; by a well-specified mechanism, his job makes him an accessory to murders which preserve the swell conditions. He understands all this and accepts the overwhelmingly valid moral considerations, but still feels guilty - in any human paradise, there will be a flaw.

Comment author: Alicorn 24 February 2012 04:37:42AM 2 points [-]

Since the mechanism is well-specified, can you specify it?

Comment author: gwern 24 February 2012 05:02:53AM 0 points [-]

I thought it was pretty clear in the story. It's not easy coming up with analogues to crypto, and there are probably holes in my lock scheme, but it's good enough for a story.

Comment author: Alicorn 24 February 2012 05:11:02AM *  4 points [-]

I thought it was pretty clear in the story.

Please explain it anyway.

(It never goes well for me when I reply to this sort of thing with snark. So I edited away a couple of drafts of snark.)

Comment author: hairyfigment 24 February 2012 06:11:41AM 8 points [-]

"Omelas" contrasts the happiness of the citizens with the misery of the child. I couldn't tell from your story that the tradesman felt unusually miserable, nor that the other people of his city felt unusually happy. Nor do I know how this affects your reply to LeGuin, since I can't detect the reply.

Comment author: NancyLebovitz 24 February 2012 06:17:59PM 1 point [-]

For what it's worth, some people read "Omelas" as being about a superstition that torturing a child is necessary (see the bit about good weather) rather than a situation where torturing a child is actually contributing to public welfare.

Comment author: gwern 24 February 2012 06:46:13PM 0 points [-]

And the 'wisdom of their scholars' depends on the torture as well? 'terms' implies this is a magical contract of some sort. No mechanism, of course, like most magic and all of LeGuin's magic that I've read (Earthsea especially).

Comment author: MileyCyrus 24 February 2012 07:54:07AM -2 points [-]

America kills 20,000 people/yr via air pollution. Are you ready to walk away?

Comment author: thomblake 24 February 2012 04:29:55PM 3 points [-]

It's worth noting, for 'number of people killed' statistics, that all of those people were going to die anyway, and many of them might have been about to die for some other reason.

Society kills about 56 million people each year from spending resources on things other than solving the 'death' problem.

Comment author: [deleted] 24 February 2012 06:01:38PM 4 points [-]

that all of those people were going to die anyway

Some of whom several decades later. (Loss of QALYs would be a better statistic, and I think it would be non-negligible.)