PhilGoetz comments on Open Thread: April 2009 - Less Wrong
I have a question for Eliezer. I went back and reread your sequence on metaethics, and the amount of confusion in the comments struck me, so now I want to make sure that I understood you correctly. After rereading, my interpretation didn't change, but I'm still unsure. So, does this summarize your position accurately:
A simple mind has a bunch of terminal values (or maybe just one), summarized in a utility function. Morality for it (or rather, not morality, but whatever this mind has that is analogous to morality in humans, depending on how you define "morality") is summed up in this utility function. That utility function is the only source of shouldness for the simple mind.
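To make the "simple mind" picture concrete, here is a minimal sketch of my own (a toy illustration, not anything from the sequence itself): an agent whose entire source of shouldness is one fixed utility function, acted on by straightforward expected-utility maximization. The world, actions, and payoffs are all hypothetical.

```python
def expected_utility(action, outcomes, utility):
    """outcomes maps each action to a list of (probability, outcome) pairs."""
    return sum(p * utility(o) for p, o in outcomes[action])

def choose(actions, outcomes, utility):
    # The simple mind just picks whichever action maximizes expected utility;
    # there is nothing else it "should" do.
    return max(actions, key=lambda a: expected_utility(a, outcomes, utility))

# Hypothetical toy world: outcomes are numeric payoffs, utility is the payoff itself.
utility = lambda outcome: outcome
outcomes = {
    "safe":   [(1.0, 1)],            # guaranteed payoff of 1
    "gamble": [(0.5, 0), (0.5, 4)],  # expected payoff of 2
}
print(choose(["safe", "gamble"], outcomes, utility))  # -> gamble
```

The point of the sketch is only that for such an agent, "what it should do" and "what maximizes its utility function" are the same question by construction; the human case described below is what breaks that identity.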
For humans, the situation is more complex. We have preferences that are like a utility function, but aren't one, because we aren't expected utility maximizers. Moreover, these preferences shift depending on a number of factors. But this isn't the source of shouldness we are looking for. Buried deep in the human mind is a legitimate utility function, or at least something like one, which summarizes that human's terminal values and thus provides that source of shouldness. This utility function is very hard to discover, owing to human psychology, but it exists. The preference set of any given human is an approximation of that human's utility function (though not necessarily a good one), subject, of course, to the many biases humans are fraught with.
The final essential point is that, due to the psychological unity of humankind, the utility functions of different people are likely to be very similar, if not identical, so when we call something "right" or "moral" we are all referring to (nearly) the same thing.
Does that sound right?
There is a large complication in that we call something "moral" when we want other people to do it. So there are probably things that we call "moral" that are actually "sins" according to our internal utility functions.