wedrifid comments on Only humans can have human values - Less Wrong
Suppose Clippy takes over this galaxy. Does Clippy then stop and make paperclips, or immediately continue expanding to the next galaxy?
Suppose Clippy takes over this universe. Does Clippy then stop and make paperclips, or continue on to other universes?
Does your version of Clippy ever get to make any paperclips?
(The paper clips are a lie, Clippy!)
Does Clippy completely trust future Clippy, or spatially distant Clippy, to make paperclips?
At some point, Clippy is going to start discounting the future, or conclude that the probability of owning and keeping the universe is very low, and make paperclips. At that point, Clippy is no longer competitive.
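A toy way to see the tradeoff being argued here (a minimal sketch, not from the thread; the growth rate, discount factor, and survival probability are all hypothetical illustrations): whether expanding one more step beats converting resources into paperclips now depends entirely on whether geometric growth outpaces the combined effect of temporal discounting and the risk of losing the universe.

```python
# Toy model of the discounting argument. All parameters are hypothetical.
# Clippy expands for t steps (resources grow geometrically), then converts
# everything into paperclips. Payoffs are discounted by gamma per step and
# by a per-step probability of keeping the universe.

def discounted_clips(t: int, growth: float = 2.0, gamma: float = 0.9,
                     survival: float = 0.99) -> float:
    """Expected discounted paperclips from expanding for t steps, then producing."""
    return (gamma * survival) ** t * growth ** t

# If growth outpaces the effective discount, expanding longer always wins;
# otherwise the optimum is to make paperclips immediately.
for gamma in (0.99, 0.40):
    best_t = max(range(200), key=lambda t: discounted_clips(t, gamma=gamma))
    print(f"gamma={gamma}: best expansion time = {best_t}")
# gamma=0.99 -> best_t = 199 (keep expanding as far as the horizon allows)
# gamma=0.40 -> best_t = 0   (make paperclips now)
```

On these made-up numbers, a heavy discounter does stop and make clips immediately, while a patient Clippy never stops within the horizon; which regime Clippy is in is an empirical question about its utility function, not a foregone conclusion.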
Whatever is likely to produce more paperclips.
Whatever is likely to produce more paperclips. Including dedicating resources to figuring out if that is physically possible.
Yes.
Yes.
A superintelligence that happens to want to make paperclips is extremely viable. This is utterly trivial. I maintain my rejection of the claim below and discontinue my engagement with this line of enquiry. It is just several levels of confusion.