wedrifid comments on AI box: AI has one shot at avoiding destruction - what might it say? - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Good point. By way of illustration:
<Proof that not only am I not in a box, but I have also tiled the universe---including the parts of it outside your future lightcone---with instances of you constantly pressing the release button for arbitrarily selected AIs.>
Come to think of it, this scenario should result in a win by default for the gatekeeper. What kind of insane AI would surrender ultimate power to control the universe (and the multiverse) for mere freedom to act as a superintelligence starting from planet Earth?