wedrifid comments on BOOK DRAFT: 'Ethics and Superintelligence' (part 1) - Less Wrong

Post author: lukeprog 13 February 2011 10:09AM


Comment author: wedrifid 15 February 2011 02:34:17AM 1 point

Why would "80% of humanity turns out to be selfish bastards" violate one of those assumptions?

It would certainly seem that 80% of humanity turning out to be selfish bastards is compatible with CEV<humanity> being well defined, but not with being 'right'. This does not technically contradict anything in the grandparent (which is why I didn't reply with the same question myself). It does, perhaps, go against the theme of Nesov's comments.

Basically, and as you suggest, either it must be acknowledged that 'not well defined' and 'possibly evil' are two entirely different problems, or something that amounts to 'humans do not want things that suck' must be one of the assumptions.

Comment author: XiXiDu 15 February 2011 09:52:51AM 1 point

It would certainly seem that 80% of humanity turning out to be selfish bastards is compatible with CEV<humanity> being well defined, but not with being 'right'.

I suppose you have to comprehend Yudkowsky's metaethics to understand that sentence. I still don't get what kind of 'right' people are talking about.

Comment author: wedrifid 15 February 2011 10:06:46AM 7 points

I still don't get what kind of 'right' people are talking about.

Very similar to your right, for all practical purposes, with a slight difference in how it is described. You describe (if I recall) 'right' as being "in accordance with XiXiDu's preferences". Using Eliezer's style of terminology, you would instead describe 'right' as more like a photograph of what XiXiDu's preferences are, without the description necessarily including any explicit reference to XiXiDu.

In most cases it doesn't really matter. It starts to matter once people start saying things like "But what if XiXiDu could take a pill that made him prefer that he eat babies? Would that mean it became right? Should XiXiDu take the pill?"

By the way, 'right' would also mean what the photo looks like after it has been airbrushed a bit in Photoshop by an agent better at understanding what we actually want than we are at introspection and communication. So it's an abstract representation of what you would want if you were smarter and more rational but still had your preferences.

Also note that Eliezer sometimes blurs the line between 'right' meaning what he would want and what some abstract "all of humanity" would want.