Wei_Dai comments on The Meaning of Right - Less Wrong

Post author: Eliezer_Yudkowsky 29 July 2008 01:28AM


Comment author: Eliezer_Yudkowsky 11 September 2009 01:37:22AM 4 points

What makes you think that any coherence exists in the first place?

Most people wouldn't want to be turned into paperclips?

Comment author: Wei_Dai 11 September 2009 04:17:27AM 27 points

Most people wouldn't want to be turned into paperclips?

Of course not, since they haven't yet heard the argument that would make them want to. All the moral arguments we've heard so far have been invented by humans, and we just aren't that inventive. Even so, we have the Voluntary Human Extinction Movement.

Comment author: Eliezer_Yudkowsky 11 September 2009 10:05:46PM 13 points

Wei, suppose I want to help someone. How ought I to do so?

Is the idea here that humans end up anywhere depending on what arguments they hear in what order, without the overall map of all possible argument orders displaying any sort of concentration in one or more clusters where lots of endpoints would light up, or any sort of coherency that could be extracted out of it?

Comment author: Wei_Dai 11 September 2009 10:29:02PM 27 points

Wei, suppose I want to help someone. How ought I to do so?

I don't know. (I mean I don't know how to do it in general. There are some specific situations where I do know how to help, but many more where I don't.)

Is the idea here that humans end up anywhere depending on what arguments they hear in what order, without the overall map of all possible argument orders displaying any sort of concentration in one or more clusters where lots of endpoints would light up, or any sort of coherency that could be extracted out of it?

Yes. Or another possibility is that the overall map of all possible argument orders does display some sort of concentration, but that concentration is morally irrelevant. Human minds were never "designed" to hear all possible moral arguments, so where the concentration occurs is accidental, and perhaps horrifying from our current perspective. (If the concentration turns out to be voluntary extinction or something worse, would you bite the bullet and let the FAI run with it?)