PhilGoetz comments on Superintelligence 23: Coherent extrapolated volition - Less Wrong Discussion

5 Post author: KatjaGrace 17 February 2015 02:00AM

Comment author: PhilGoetz 17 February 2015 05:56:14AM 3 points

I might not mind locking in my current values, but I sure don't want to lock in your current values.

No, more seriously: yes, it would be bad. As I wrote in "The human problem",

Pretty soon your humans will tile your universe with variations on themselves. And the universe you worked so hard over, that you had such high hopes for, will be taken up entirely with creatures that, although they become increasingly computationally powerful, have an emotional repertoire so impoverished that they rarely have any complex positive qualia beyond pleasure, discovery, joy, love, and vellen. What was to be your masterpiece becomes instead an entire universe devoid of fleem.