Eliezer_Yudkowsky comments on Ethics as a black box function - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Fallible relative to what?
Full context here.
Skimming around his site, I find it interesting, but I think he made a basic mistake.
From here:
Of course, the definition of my utility function will include a term for steaks, or alcohol, or whatever intrinsic value they help me achieve. Maximizing utility is not, therefore, contradictory to valuing a steak. My desire to maximize utility includes my desire to eat steak (or whatever intrinsic value it helps me attain).
This seems like a really simple mistake, so maybe I am simply misunderstanding him. Would anyone who knows his work better care to comment (at least before I have more time to poke around his site)?
I didn't read any more of his site, but just from the excerpt you gave, her [1] point seems to be that if you value total utility, then you will have to deprive yourself to benefit people in general, which people can't do -- they inevitably act as if their own utility carries more weight than that of others.
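That point can be made concrete with a toy model. This is entirely my own illustration, not Fyfe's; the square-root utility, the group size, and the weights are all invented for the sake of the example. It shows that an agent maximizing everyone's utility with equal weights keeps only a small share of a resource for itself, while an agent that over-weights its own utility, as the comment above says people inevitably do, keeps much more.

```python
from math import sqrt

# Hypothetical sketch: an agent splits 1 unit of a resource between
# itself and 9 others. Utility of a share is sqrt(share), so there are
# diminishing returns. w_self and w_others are the weights the agent
# puts on its own vs. others' utility; an impartial total-utility
# maximizer uses w_self == w_others.

def weighted_utility(self_share, w_self=1.0, w_others=1.0, others=9):
    """Weighted utility of keeping self_share and splitting the rest
    evenly among the others."""
    other_share = (1 - self_share) / others
    return w_self * sqrt(self_share) + w_others * others * sqrt(other_share)

def best_share(w_self, w_others, steps=1000):
    """Grid-search the share of the resource the agent keeps."""
    return max((i / steps for i in range(steps + 1)),
               key=lambda s: weighted_utility(s, w_self, w_others))

# Impartial weights: the optimum keeps only 1/10 for oneself --
# exactly the self-deprivation the comment describes.
impartial = best_share(1.0, 1.0)   # -> 0.1

# Over-weighting one's own utility shifts the optimum sharply
# toward keeping most of the resource.
egoist = best_share(5.0, 1.0)      # -> roughly 0.735
```

With equal weights the math works out so the agent keeps exactly an equal per-person share (1/10 here); any extra weight on the self moves the optimum away from impartiality, which is the gap between "maximize total utility" and how people actually behave.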
[1] Hey, if he can use pronouns confusingly and inconsistently, so can we!
Fyfe annoys me sometimes because he continually ignores my requests to express concepts in mathematical language.
"Reasoned argument", it says.
And how does that help if the premises in your "reasoned argument" are arrived at via intuition?