DuncanS comments on AI box: AI has one shot at avoiding destruction - what might it say? - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (354)
Then the following command shouldn't bother you! :)
AI DESTROYED
Which leads to two possible futures. In one of them, the AI is destroyed, and nothing else happens. In the other, you receive a reply to your command, thus:
The command did not. But your attitude - I shall have to make an example of you.
Obviously this is not a strategy to get you to let the AI out based on its friendliness - quite the reverse.
I'd rather die to an already-unboxed UFAI than risk letting a UFAI out in the first place. My life is worth VASTLY less than the whole of humanity.