Eugine_Nier comments on Morality is not about willpower - Less Wrong

Post author: PhilGoetz 08 October 2011 01:33AM




Comment author: PhilGoetz 06 October 2011 02:56:05PM -1 points

This is all true. But humans do not have utility functions... Humans are not the coherent, consistent agents you make them out to be.

If you think that's relevant, you should also go write the same comment on Eliezer's post on utilons and fuzzies. Having two coherent, consistent utility functions is no more realistic than having one.

If you want to be rational, you need to try to figure out what your values are, and what your utility function is. Humans don't act consistently. Whether their preferences can be described by a utility function is a more subtle question whose answer is unknown. But in either case, in order to be more rational, you need to be able to approximate your preferences with a utility function.

Fighting for willpower is basically your far-self trying to wrest control of your behavior from your near-self.

You can alternatively describe this as the place where the part of your utility function that you call your far self, and the part of your utility function that you call your near self, sum to zero and provide no net information on what to do. You can choose to describe the resultant emotional confusion as "fighting for willpower". But this leads to the erroneous conclusions I described under the "ethics as willpower" section.

Comment author: Eugine_Nier 07 October 2011 12:48:46AM 2 points

Having two coherent, consistent utility functions is no more realistic than having one.

He never said these "utility functions" are coherent. In fact, a large part of the problem is that the "fuzzies" utility function is extremely incoherent.

Comment author: PhilGoetz 07 October 2011 04:02:17AM -1 points

You keep using that word. I do not think it means what you think it means. A utility function that is incoherent is not a utility function.

If it is acceptable for Eliezer to talk about having two utility functions, one that measures utilons and one that measures fuzzies, then it is equally acceptable to talk about having a single utility function, with respect to the question of whether humans are capable of having utility functions.
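The point that "a utility function that is incoherent is not a utility function" has a precise core: a preference relation can only be represented by a utility function if it contains no cycles. A minimal sketch of this, using hypothetical option names and a simple topological ranking (not anything from the original thread):

```python
def utility_function(options, prefers):
    """Return a {option: utility} dict consistent with `prefers`
    (u(better) > u(worse) for every (better, worse) pair), or None
    if the preferences are cyclic and no such function exists."""
    # For each option, the set of options strictly preferred over it.
    preferred_over = {o: {b for (b, w) in prefers if w == o} for o in options}
    remaining = set(options)
    utilities = {}
    rank = len(options)
    while remaining:
        # Options that nothing remaining beats get the next-highest utility.
        top = {o for o in remaining if not (preferred_over[o] & remaining)}
        if not top:
            return None  # every remaining option is beaten: a preference cycle
        for o in top:
            utilities[o] = rank
        remaining -= top
        rank -= 1
    return utilities

# Transitive preferences: a utility function exists.
coherent = utility_function(
    {"cake", "fruit", "nothing"},
    {("cake", "fruit"), ("fruit", "nothing")},
)

# Cyclic (incoherent) preferences A > B > C > A: none does.
incoherent = utility_function(
    {"A", "B", "C"},
    {("A", "B"), ("B", "C"), ("C", "A")},
)
```

Here `coherent` assigns `cake` a higher utility than `fruit`, and `fruit` higher than `nothing`, while `incoherent` is `None`: no assignment of numbers can satisfy a preference cycle, which is one concrete sense in which "incoherent utility function" is a contradiction in terms.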

Comment author: Eugine_Nier 08 October 2011 04:32:30AM 1 point

A utility function that is incoherent is not a utility function.

I was using the same not-quite-strict definition of "utility function" that you seemed to be using in your post. In any case, I don't believe Eliezer ever called fuzzies a utility function.