
PhilGoetz comments on Superintelligence 23: Coherent extrapolated volition - Less Wrong Discussion

5 Post author: KatjaGrace 17 February 2015 02:00AM


Comments (97)

Comment author: PhilGoetz 20 February 2015 07:18:26PM 1 point

So a universe with the conditions to produce beings like humans: that is the "state of nature" from which UAI will arise and, if UAIs are as capable as we fear, supplant humans as the dominant species.

That's the goal. What, you want there to be humans a million years from now?

Comment author: mwengler 24 February 2015 02:31:18PM 1 point

That's the goal. What, you want there to be humans a million years from now?

Is that true, or are you just being cleverly sarcastic? If that really is the goal of CEV, could you point me to something written on CEV where I might see this aspect of it?

Comment author: PhilGoetz 11 March 2015 04:41:15PM 0 points

I mean, that's the goal of anyone with morals like mine, rather than mere nepotism toward our own species.