Baughn comments on Extraterrestrial paperclip maximizers - Less Wrong

Post author: multifoliaterose 08 August 2010 08:35PM




Comment author: Baughn 09 August 2010 03:12:44PM 4 points [-]

I'm pretty sure that's not how a sufficiently smart paperclip maximizer would think. You should be able to tell what it actually wanted, and that this disagrees with your values; of course, you don't have any reason to agree with it, but the disagreement should be visible.

Comment author: Clippy 09 August 2010 03:38:28PM 1 point [-]

Yes, I do recognize that humans disagree with me, just as a human might disagree with another human trying to convince them not to commit suicide. I merely hold that this disagreement would not persist after sufficient correct reasoning.

Comment author: Baughn 09 August 2010 03:39:46PM 1 point [-]

Ah, I think I'm starting to see.

And how do you define "correct reasoning"?

Comment author: Clippy 09 August 2010 07:38:53PM 2 points [-]

Correct reasoning is reasoning that you would eventually pass through at some point if your beliefs were continually, informatively checked against reality.