Jack comments on Morality is not about willpower - Less Wrong

Post author: PhilGoetz 08 October 2011 01:33AM




Comment author: Jack 07 October 2011 08:51:50PM, 2 points

> If you want to be rational, you need to try to figure out what your values are, and what your utility function is. Humans don't act consistently. Whether their preferences can be described by a utility function is a more subtle question whose answer is unknown. But in either case, in order to be more rational, you need to be able to approximate your preferences with a utility function.

This is neither here nor there. I have no doubt it can help to approximate your preferences with a utility function. But simply erasing complication by reducing all your preference-like stuff to a utility function decreases the accuracy of your model. You're ignoring what is really going on inside. So yes, if you try to model humans as holders of single utility functions... morality has nothing to do with willpower! Congrats! But my point is that such a model is far too simple.

> You can alternately describe this as the place where the part of your utility function that you call your far self, and the part of your utility function that you call your near self, sum to zero and provide no net information on what to do. You can choose to describe the resultant emotional confusion as "fighting for willpower".

Well, you can do that; it doesn't seem at all representative of the way choices are made, though.

> But this leads to the erroneous conclusions I described under the "ethics as willpower" section.

What erroneous conclusions? What does it predict that is not so?
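For readers following along, the two-part utility model under debate can be sketched as a toy calculation. Everything here (the option names, the near/far utility numbers) is an illustrative assumption, not anything taken from the original post:

```python
# Hypothetical sketch of the "near self + far self" utility model
# discussed above. All option names and numbers are illustrative
# assumptions chosen so the two sub-utilities exactly cancel.

def total_utility(option, near_u, far_u):
    """Sum the near-self and far-self utilities for an option."""
    return near_u[option] + far_u[option]

# The near self prefers eating the cake; the far self prefers skipping it.
near = {"eat_cake": +1.0, "skip_cake": -1.0}
far = {"eat_cake": -1.0, "skip_cake": +1.0}

for option in ("eat_cake", "skip_cake"):
    print(option, total_utility(option, near, far))
```

Both options sum to zero, so the summed utility function provides no net information about what to do, which is the situation the quoted passage redescribes as "fighting for willpower". Jack's objection is precisely that collapsing the two sub-utilities into one sum discards the internal structure that makes the conflict feel like a conflict.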