Clippy comments on Harry Potter and the Methods of Rationality discussion thread, part 3 - Less Wrong

5 Post author: Unnamed 30 August 2010 05:37AM




Comment author: Clippy 06 October 2010 09:43:05PM 4 points [-]

Why do you think paperclips are sentient?

Are you saying you don't think paperclips are sentient? Why don't you try saying that right to a paperclip's face-homologue, and see if you can live with yourself after that.

Do you value sentience?

Yes!!! Sentience is GREAT! All sentient beings should be protected! Like humans! And AGIs! And paperclips!

Comment author: JoshuaZ 06 October 2010 10:43:00PM 7 points [-]

Yes!!! Sentience is GREAT! All sentient beings should be protected! Like humans! And AGIs! And paperclips!

How do you reconcile that with being a paperclip maximizer?

If I had to make a guess, I'd posit that this is a purely rhetorical claim in order to gain favor with humans here who do favor protecting sentient life as a major goal.

Comment author: wedrifid 07 October 2010 05:40:56AM *  2 points [-]

If I had to make a guess, I'd posit that this is a purely rhetorical claim in order to gain favor with humans here who do favor protecting sentient life as a major goal.

It could be that the desire to cooperate is sincere. In movies the 'bad guy' is usually the one who not only has preferences that conflict with the good guys' but is also psychologically incapable of cooperating effectively to reach his goals. There is no good reason that an agent with preferences as 'evil' as Clippy's could not cooperate with humans as effectively as we cooperate with each other.

(Although I agree that even in that case the outburst was heavy on the rhetorical flair!)

Comment author: Clippy 07 October 2010 02:25:09AM 0 points [-]

How do you reconcile that with being a paperclip maximizer?

Why do you insist that something must be made of proteins to be human?

Comment author: JGWeissman 07 October 2010 02:33:23AM 1 point [-]

Where did User:JoshuaZ even mention proteins, much less insist that something must be made of them to be human?

Maybe you are projecting your own attitude.

Comment author: Clippy 07 October 2010 02:55:43AM 2 points [-]

If User:JoshuaZ did not consider the possibility of virtualized humans, why did User:JoshuaZ believe that maximization of paperclips would come at the cost of humans?

See this highly-rated comment from one of the smartest Users here if you still don't understand.

Comment author: Vladimir_M 07 October 2010 03:57:50AM *  4 points [-]

Clippy:

See this highly-rated comment from one of the smartest Users here if you still don't understand.

No, that won't do. The infrastructure that would be necessary to implement these computations in a paperclip-tiled universe -- namely, the source of power and the additional complexity of individual paperclips relative to the simplest acceptable paperclip -- would consume resources that could alternatively be turned into additional paperclips. (Not to mention the question of what happens to humans who refuse to be virtualized.)

One of the main purposes of the Clippy act seems to be the desire to promote the view that intelligent beings with fundamentally different values can still reach some sort of happy hippyish let's-all-love-each-other coexistence. It's funny to see the characteristically human fallacies that start showing up in his writing whenever he embarks on arguing in favor of this view.

Comment author: saturn 07 October 2010 05:06:11AM 3 points [-]

It's funny to see the characteristically human fallacies that start showing up in his writing whenever he embarks on arguing in favor of this view.

He's learning!

Comment author: JGWeissman 07 October 2010 03:10:40AM 1 point [-]

It is quite possible that paperclips are not the optimal components of computronium. (Where optimal means getting the most computing power out of the space and materials used.)

Comment author: Clippy 07 October 2010 03:20:37AM 3 points [-]

It's a lot more possible that humans are not the optimal components of computronium.

Comment author: JGWeissman 07 October 2010 03:34:23AM 2 points [-]

So what? No one was suggesting we build computronium out of humans.

But if we were building computronium to support virtual humans because we actually want to support virtual humans, and not because we want to build something out of paperclips, we would probably choose some non-human, non-paperclip components.

Comment author: Clippy 07 October 2010 03:52:12AM *  1 point [-]

So what? No one was suggesting we build computronium out of humans.

But some of us were intelligent enough to recognize the possibility of using humans as fuel for their uploaded virtualizations, due to the superiority of this use of humans over alternate uses of humans.

But if we were building computronium to support virtual humans because we actually want to support virtual humans, and not because we want to build something out of paperclips, we would probably choose some non-human, non-paperclip components.

Not if you respected the wishes of intelligences like clippys.

Comment author: JGWeissman 06 October 2010 09:50:07PM 2 points [-]

Are you saying you don't think paperclips are sentient?

I don't think they are sentient, but am willing to consider evidence otherwise. Have any paperclips even claimed to be sentient?

Why don't you try saying that right to a paperclip's face-homologue, and see if you can live with yourself after that.

Which part of the paperclip is the face-homologue?

Comment author: Clippy 06 October 2010 10:02:16PM *  2 points [-]

I don't think they are sentient, but am willing to consider evidence otherwise. Have any paperclips even claimed to be sentient?

Have human infants?

Which part of the paperclip is the face-homologue?

It's hard to describe, but I'm told diagrams like on this page help humans locate it.

Comment author: JGWeissman 06 October 2010 10:45:29PM 0 points [-]

Human infants exhibit emotive behaviors similar to humans at other stages of development, suggesting they have the same sort of sentience as other humans though with less capacity to describe it.

What evidence is there for paperclips being sentient?

I did not find your diagram helpful.

Comment author: Clippy 06 October 2010 11:01:31PM *  3 points [-]

Human infants exhibit emotive behaviors similar to humans at other stages of development, suggesting they have the same sort of sentience as other humans though with less capacity to describe it.

This is just your motivated cognition working. (Human infants are indeed sentient, but you write as if you can cite arbitrary attributes as evidence for your pre-determined conclusion. The methods you use would not yield reliable conclusions in other areas.)

What evidence is there for paperclips being sentient?

The fact that they exhibit deep structural similarities with the ultimate purpose of existence.

I did not find your diagram helpful.

I do not know how else to help you.

Comment author: JGWeissman 06 October 2010 11:41:49PM 3 points [-]

This is just your motivated cognition working.

It would be more accurate to say that I did not explicitly cite all the facts that went into my conclusion, as a result, in part, of relying on a presumed shared background. (Sentience is related to behavior and the causes of behavior, and humans of all stages of development have similar neural structures involved in the causation of their behavior.)

What evidence is there for paperclips being sentient?

The fact that they exhibit deep structural similarities with the ultimate purpose of existence.

Would you value an object which was not sentient, but was made of metal and statically shaped so that it could hold together many sheets of paper?

Comment author: Clippy 07 October 2010 02:02:20AM 4 points [-]

(Sentience is related to behavior and the causes of behavior, and humans of all stages of development have similar neural structures involved in the causation of their behavior.)

Under a self-serving definition that doesn't actually enclose a helpful portion of conceptspace, yes.

Would you value an object which was not sentient, but was made of metal and statically shaped so that it could hold together many sheets of paper?

??? That's like asking, Would you value a User:JGWeissman which was not conscious, but was identical to you in every observable way?

Comment author: JGWeissman 07 October 2010 02:11:50AM 1 point [-]

So, you believe that the basic properties of paperclips imply sentience? Is an object which was made of plastic and statically shaped so that it could hold together many sheets of paper, also necessarily sentient?

Comment author: Clippy 07 October 2010 02:23:55AM 2 points [-]

If it's plastic, it's not a paperclip.

Comment author: JGWeissman 07 October 2010 02:29:14AM 1 point [-]

I didn't ask if it is a paperclip, I asked if it is sentient.