Doug_S. comments on Moral Error and Moral Disagreement - Less Wrong

Post author: Eliezer_Yudkowsky 10 August 2008 11:32PM


Comment author: Doug_S. 12 August 2008 12:32:17AM 2 points

I, too, wonder if the "psychological unity of humankind" has been a bit overstated. All [insert brand and model here] computers have identical hardware, but you can install different software on them. We're running different software.

Consider the case of a "something-maximizer". It's given an object, and then maximizes the number of copies of that object.

You give one something-maximizer a paperclip, and it becomes a paperclip maximizer. You give another a pencil, and that one becomes a pencil maximizer.

There's no particular reason to expect the paperclip maximizer and the pencil maximizer to agree on values, even though they are both something-maximizers, implemented on identical hardware. The extrapolated volition of a something-maximizer is not well-defined.
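The thought experiment above can be sketched in a few lines of Python. This is purely illustrative (the class and method names are my own, not from the comment): every instance runs the same code, i.e. identical "hardware", but whichever object is imprinted first becomes that agent's terminal value, so two agents built from the same class can rank the same world differently.

```python
class SomethingMaximizer:
    """Identical 'hardware': every instance runs exactly the same code."""

    def __init__(self):
        # The value slot is blank until experience writes to it.
        self.target = None

    def imprint(self, obj):
        """The first object given becomes the terminal value, irrevocably."""
        if self.target is None:
            self.target = obj

    def utility(self, world):
        """Utility = number of copies of the imprinted target in the world."""
        return world.count(self.target)


a = SomethingMaximizer()
b = SomethingMaximizer()
a.imprint("paperclip")  # a is now a paperclip maximizer
b.imprint("pencil")     # b is now a pencil maximizer

world = ["paperclip", "paperclip", "pencil"]
# Same code, same "hardware", yet the agents disagree on this world's value:
print(a.utility(world))  # 2
print(b.utility(world))  # 1
```

Nothing in the shared class determines which agent is "right"; the disagreement lives entirely in the imprinted contents, which is the point of the analogy.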

There are probably blank spaces in human Eliezer::morality that are written on by experience, and such writing may well be irrevocable, or mostly so. If humans' terminal values are, in fact, contingent on experience, then you could have two people disagreeing at the same level at which the two something-maximizers do.

As a practical matter, people generally do seem to have similar sets of terminal values, but different people rank the values differently. Consider the case of "honor". Is it better to die with honor or to live on in disgrace? People raised in different cultures will give different answers to this question.