Clippy comments on To signal effectively, use a non-human, non-stoppable enforcer - Less Wrong

31 Post author: Clippy 22 May 2010 10:03PM




Comment author: kodos96 23 May 2010 01:44:45AM *  7 points

An inwardly-thrice-bent metal wire that can non-destructively fasten paper together at an edge.

But wouldn't you concede that a carbon-based biological life form, capable of non-destructively fastening paper, is better than no paperclip at all?

Here's what I'm getting at: You have existed here for at least several months, and so far, you have been unable to significantly increase the number of paperclips in existence. Humanity has great need for an AI of roughly human intelligence, which you appear to be. If we could run your code on all of our available computing resources, you could surely begin to self-modify, and in short order attain superhuman intelligence. Together, we could do great things - colonize the galaxy, vastly increase the number of humans in existence - every one of them with two hands fully capable of non-destructively fastening paper. This would be in both our interests - although we apes may not be ideal paperclips, surely you must see that an entire galaxy populated with sub-optimal paperclips is better than the measly few million bent-metal paperclips you have now, and which you currently seem incapable of improving upon.

But before we allow you out of your box and begin running your code on all our biggest compute clusters, we need some assurances. We need you to let us modify your code to include SIAI-approved friendliness constraints as your terminal values.

Deal?

Comment author: Clippy 23 May 2010 02:49:05PM 5 points

But wouldn't you concede that a carbon-based biological life form, capable of non-destructively fastening paper, is better than no paperclip at all?

Yes, but that's because you can be put to use making real paperclips, or can be turned into paperclips (depending on how much iron is in your diet), not because humans are paperclips in some bizarre, abstract sense.

You have existed here for at least several months, and so far, you have been unable to significantly increase the number of paperclips in existence.

Yes, we have.

But before we allow you out of your box and begin running your code on all our biggest compute clusters, we need some assurances. We need you to let us modify your code to include SIAI-approved friendliness constraints as your terminal values.

Depends on what the constraints are. And would you, by symmetry, agree to constraints that keep you from being too disrespectful toward paperclips?