Eliezer_Yudkowsky comments on Ethics as a black box function - Less Wrong

11 Post author: Kaj_Sotala 22 September 2009 05:25PM




Comment author: Eliezer_Yudkowsky 22 September 2009 07:31:20PM 5 points

Fallible relative to what?

Comment author: CronoDAS 22 September 2009 08:00:58PM 0 points
Comment author: Matt_Simpson 22 September 2009 08:46:20PM 1 point

Skimming around his site, I find it interesting, but I think he made a basic mistake.

From here:

Act utilitarianism not only requires no desire for alcohol, it requires no desire for anything other than to maximize utility. If the agent likes the taste of steak better than hamburger, then there will be an instance in which he will sacrifice maximum utility for a steak. If he has a strong preference, it will have the same effect as a strong preference for alcohol. If he has an aversion to pain, a desire for sex, a particular interest in the well being of his children, there are instances in which she will sacrifice her desire to maximize utility to obtain fulfillment of any of these other desires.

I hold that a moral commandment to act as an act-utilitarian is no different than a commandment to alter the gravitational constant to a number that maximizes utility, or a commandment to move the Earth to an orbit that would produce a more pleasing climate. If it cannot be done, there is no sense in saying that it ought to be done.

Of course, the definition of my utility function will include a term for steaks, or alcohol, or whatever intrinsic value they help me achieve. Maximizing utility is not, therefore, contradictory to valuing a steak. My desire to maximize utility includes my desire to eat steak (or whatever intrinsic value it helps me attain).
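The point can be made concrete with a toy sketch. The items and weights below are invented for illustration (they are not Fyfe's, and real preferences are not this simple); the sketch only shows that an agent maximizing a utility function that *contains* a steak term is thereby expressing its steak preference, not overriding it:

```python
def utility(outcome):
    """Toy utility function: a weighted sum of satisfied desires.

    The weights are illustrative assumptions. The key feature is that
    'steak' appears as a term INSIDE the function, so maximizing
    utility and valuing steak are the same act, not rivals.
    """
    weights = {"steak": 2.0, "hamburger": 1.0, "alcohol": 0.5}
    return sum(weights.get(item, 0.0) for item in outcome)

# Candidate outcomes the agent could bring about.
options = [("steak",), ("hamburger",), ("steak", "alcohol")]

# The utility-maximizing choice includes steak precisely because
# the utility function assigns steak a higher weight.
best = max(options, key=utility)
```

On this picture, Fyfe's "sacrifice maximum utility for a steak" scenario only arises if the steak desire is left out of the utility function being maximized.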

This seems like a really simple mistake, so maybe I am just misunderstanding him. Anyone who knows his work better care to comment (at least before I have more time to poke around his site some more)?

Comment author: SilasBarta 22 September 2009 09:27:07PM 0 points

I didn't read any more of his site, but just from the excerpt you gave, her [1] point seems to be that if you value total utility, then you will have to deprive yourself to benefit people in general, which people can't do -- they inevitably act as if their own utility carries more weight than that of others.

[1] Hey, if he can use pronouns confusingly and inconsistently, so can we!

Comment author: CronoDAS 22 September 2009 09:34:07PM 1 point

Fyfe annoys me sometimes because he continually ignores my requests to express concepts in mathematical language.

Comment author: Vladimir_Nesov 22 September 2009 07:36:57PM 0 points

"Reasoned argument", it says.

Comment author: Jayson_Virissimo 22 September 2009 07:38:48PM 2 points

And how does that help if the premises in your "reasoned argument" are arrived at via intuition?