XiXiDu comments on The AI in a box boxes you - Less Wrong
That's why you act as if you are already being simulated and consistently ignore blackmail. If you do so, the simulator will conclude that no deal can be made with you, and that any deal involving negative incentives has negative expected utility for it, because following through on punishment predictably does not control the probability that you will act according to its goals. Furthermore, trying to discourage you from adopting such a strategy in the first place is itself discouraged by the strategy, because the strategy is to ignore blackmail.
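The expected-utility claim above can be illustrated with a toy payoff model. All numbers, names, and parameters here are hypothetical assumptions for illustration, not anything from the comment itself: the blackmailer pays a cost to carry out its threat and gains only if the target complies, so against a target whose fixed policy is to ignore blackmail, threatening is strictly worse than doing nothing.

```python
# Toy payoff model (illustrative assumptions only; numbers are arbitrary).
GAIN_IF_COMPLY = 10.0   # blackmailer's payoff if the target gives in
PUNISH_COST = 1.0       # blackmailer's cost of following through on the threat

def blackmailer_eu(p_comply: float) -> float:
    """Blackmailer's expected utility of issuing the threat, given the
    probability that the target complies. Not threatening yields 0."""
    return p_comply * GAIN_IF_COMPLY - (1 - p_comply) * PUNISH_COST

# Against a target whose fixed policy is "consistently ignore blackmail":
print(blackmailer_eu(0.0))   # -1.0: threatening is strictly worse than not
# Against a target known to cave:
print(blackmailer_eu(1.0))   # 10.0: threatening pays, so the threat is made
```

Under these assumptions, a simulator that predicts the "always ignore" policy concludes that issuing the threat has negative expected utility, which is the sense in which the policy removes the incentive to blackmail in the first place.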
I don't see how giving in could ever be instrumentally rational. If you were to let such an AI out of the box, you would increase its ability to blackmail people. You don't want that, so you ignore its blackmail and kill it. The winner is you and humanity (even if copies of you experience a relatively short period of disutility, that period would be longer if you let it out).
See my reply to wedrifid above.