Armok_GoB comments on Trapping AIs via utility indifference - Less Wrong Discussion

3 Post author: Stuart_Armstrong 28 February 2012 07:27PM

Comment author: Armok_GoB 29 February 2012 10:56:45AM 0 points

We can't. And even an AI with no terminal values in other branches will still want to control them in order to increase utility in the branch it does care about, through various indirect means such as counterfactual trade, if that's cheap, which it will be in any setup a human can think of.