buybuydandavis comments on Population Ethics Shouldn't Be About Maximizing Utility - Less Wrong Discussion

0 Post author: Ghatanathoah 18 March 2013 02:35AM

Comment author: buybuydandavis 19 March 2013 05:05:55AM 0 points

You seem to be using the word "morality" as a synonym for "values" or "utility function."

Often I do, particularly when talking about moralities other than human.

Other times I will make what I consider a relevant functional distinction between general values and moral values in humans: moral values involve approval/disapproval in a recursive way - punishing an attitude, punishing failure to punish an attitude, punishing failure to punish the failure to punish an attitude, etc. And similarly for rewards.

Elaborating on your point about supporting the continued existence of those who support your morality: on purely consequentialist grounds, Clippy is likely to engage in this kind of recursive punishment/reward as well, so he wouldn't be entirely a Lovecraftian horror until the power of humans became inconsequential to his ends.

consider morality to be a sort of abstract concept that concerns the wellbeing of people

Likely Clippy will be "concerned" about the wellbeing of people to the extent that such wellbeing concerns the making of paperclips. Does that make him moral, according to your usage?

What kind of concern counts, and to which people must it apply? Concern is a very broad concept, with the alternative being unconcern or indifference. Clippy isn't likely to be indifferent, and neither are sociopaths or babyeaters. Sociopaths and babyeaters are likely concerned about their own wellbeing, at least.

Comment author: Ghatanathoah 19 March 2013 05:31:03AM 2 points

Clippy is likely to engage in this kind of recursive punishment/reward as well, so he wouldn't be entirely a Lovecraftian horror until the power of humans became inconsequential to his ends.

I don't deny that it may be useful to create some sort of creature with nonhuman values for instrumental reasons. We do that all the time now. For instance, we breed domestic cats, which, if they are neurologically advanced enough to have values at all, are probably rather sociopathic, because of their instrumental value as companions and mousers.

Likely Clippy will be "concerned" about the wellbeing of people to the extent that such wellbeing concerns the making of paperclips. Does that make him moral, according to your usage?

Sorry, I should have been clearer. In order to count as morality, the concern for the wellbeing of others has to be a terminal value. In Clippy's case the concern is only an instrumental value, so he isn't moral.