kaz comments on The "Intuitions" Behind "Utilitarianism" - Less Wrong

Post author: Eliezer_Yudkowsky 28 January 2008 04:29PM





Comment author: bgaesop 22 August 2011 07:32:51AM 2 points

if the disutility of an air molecule slamming into your eye were 1 over Graham's number, enough air pressure to kill you would have negligible disutility.

Yes, this seems like a good argument that we can't add up disutility linearly for things like "being bumped into by particle type X". In fact, it seems like having one molecule of air bump into me (or even however many molecules I breathe in a day) is a good thing, and so we can't just talk about things like "the disutility of being bumped into by a given kind of particle".
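To make the arithmetic behind this concrete: here is a minimal sketch of why a linear sum of per-molecule disutilities says a lethal amount of air pressure is "negligible", while a model where harm depends on the aggregate state does not. The specific numbers are stand-ins (1/Graham's number underflows any float, so a tiny placeholder of 1e-60 is used, and 10^27 is only a rough order of magnitude for a lethal molecule count); nothing here is from the original comment.

```python
# Stand-in values (hypothetical): 1/Graham's number is far too small
# to represent as a float, so we use 1e-60 per molecule instead.
PER_MOLECULE_DISUTILITY = 1e-60      # placeholder for 1/Graham's number
LETHAL_MOLECULE_COUNT = 10**27       # rough order-of-magnitude stand-in

# Linear model: total harm is just the sum of independent per-molecule harms.
linear_harm = PER_MOLECULE_DISUTILITY * LETHAL_MOLECULE_COUNT
print(linear_harm)  # ~1e-33: "negligible", even though you would be dead

# Nonlinear model: harm is a function of the aggregate state,
# not a sum of independent per-molecule contributions.
def nonlinear_harm(n_molecules, lethal=LETHAL_MOLECULE_COUNT):
    if n_molecules >= lethal:
        return 1.0  # death: maximal disutility, regardless of the per-unit sum
    return PER_MOLECULE_DISUTILITY * n_molecules

print(nonlinear_harm(LETHAL_MOLECULE_COUNT))  # 1.0
```

The point is only structural: under linear aggregation the total never escapes the scale of the per-unit term, so any per-unit disutility small enough to make one molecule harmless also makes a lethal quantity "harmless".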

If your utility function ceases to correspond to utility at extreme values, isn't it more of an approximation of utility than actual utility?

Yeah, of course. Why, do you know of some way to accurately access someone's actually-existing Utility Function in a way that doesn't just produce an approximation of an idealization of how ape brains work? Because me, I'm sitting over here using an ape brain to model itself, and this particular ape doesn't even really expect to leave this planet or encounter or affect more than a few billion people, much less 3^^^3. So it's totally fine using something accurate to a few significant figures, trying to minimize errors that would have noticeable effects on these scales.

Sure, you don't need a model that works at the extremes - but when a model does hold for extreme values, that's generally a good sign for the accuracy of the model.

Yes, I agree. Given that your model is failing at these extreme values and telling you to torture people instead of blink, I think that's a bad sign for your model.

doesn't that assign higher impact to five seconds of pain for a twenty-year old who will die at 40 than to a twenty-year old who will die at 120? Does that make sense?

Yeah, absolutely, I definitely agree with that.

Comment author: kaz 26 August 2011 01:58:46AM 0 points

Given that your model is failing at these extreme values and telling you to torture people instead of blink, I think that's a bad sign for your model.

That would be a failure, but 3^^^3 people blinking != you blinking. You just don't comprehend the size of 3^^^3.

Yeah, absolutely, I definitely agree with that.

Well, it's self-evident that that's silly. So, there's that.