Amanojack comments on The AI in a box boxes you - Less Wrong
I'm not sure what anyone means by "want." Most of the scenarios discussed on LW where the AI tries to unbox itself seem predicated on its "wanting" to do so (or am I missing something?). This assumption seems even more overt in notions like "we'll let it out if it's Friendly."
To me, the LiteralGenie problem (which you've basically summarized above) is the reason to keep an AI boxed, whether Friendly or not, and the answer to "should we let it out?" is no, for the same reason.