PhilGoetz comments on Morality is not about willpower - Less Wrong

Post author: PhilGoetz 08 October 2011 01:33AM




Comment author: shminux 06 October 2011 05:30:34AM 2 points

It helps to define your terms before philosophizing. I assume that you mean morality (a collection of beliefs as to what constitutes a good life) when you write ethics.

I can't speak for you, but my moral views were originally based on what I was taught by my family and society in general, explicitly and implicitly, and then developed based on my reasoning and experience. Thus, my personal moral subsystem is compatible with, but not identical to, what other people around me have. The differences might be minor (is torrenting copyrighted movies immoral?) or major (is hit-and-run immoral?).

Abiding by my personal morality is sometimes natural (like your "taste"), and at other times requires immense willpower. I have noticed that there is also a certain innate component to it.

Sometimes I change my moral views when convincing new information comes along. I do not think of them in terms of some abstract utility function, but rather as a set of rules for how to be good in my own eyes, though they would probably contribute to one number when properly weighted. I don't bother doing that, though, and I suspect it is the same for other people.

I have yet to see anyone proclaim "My moral utility function spiked from last week's average of 117 to 125 today, when I helped an old lady cross the street." Sure sounds like something two AIs talking to each other would boast about, though.

Comment author: PhilGoetz 06 October 2011 05:43:07AM 2 points

It helps to define your terms before philosophizing. I assume that you mean morality (a collection of beliefs as to what constitutes a good life) when you write ethics.

"Morality" is cognate with "mores", and has connotations of being a cultural construct (what I called social ethics) that an individual is not bound to (e.g., "When in Rome, do as the Romans do"). But my real answer is that neither of these terms is defined clearly enough for me to worry much over which one I used. I hope you found all terms sufficiently defined by the time you reached the end.

When you say you developed your morals based on reasoning and experience, how did reason help? Reasoning requires a goal. I don't think you can reason your way to a new terminal goal; so what do you mean when you say reasoning helped develop your morals? That it helped you know how better to achieve your existing goals, without giving you any new ones?

If you say something like, "Reason taught me that I should value the wants of others as I value my own wants", I won't believe you. Reason can't do that. It might teach you that your own wants will be better-satisfied if you help other people with their wants. But that's different.

As for myself, everything I call "moral" comes directly out of my wants. I want things for myself, and I also want other people to be happy, and to like me. Everything else follows from that. I may have had to be conditioned to care about other people. I don't know. That's a nature/nurture argument.

Comment author: Eugine_Nier 06 October 2011 06:20:04AM 1 point

If you say something like, "Reason taught me that I should value the wants of others as I value my own wants", I won't believe you. Reason can't do that. It might teach you that your own wants will be better-satisfied if you help other people with their wants. But that's different.

You may have an overly narrow view of what is usually meant by the word "reason".

Comment author: PhilGoetz 06 October 2011 02:26:06PM 2 points

No. Saying that reason taught you a new value is exactly the same as saying that you logically concluded you should change your utility function.