James_Miller comments on Seeking Estimates for P(Hell) - Less Wrong Discussion

4 Post author: Mac 21 March 2015 03:44PM

Comments (21)

You are viewing a single comment's thread.

Comment author: James_Miller 21 March 2015 04:02:40PM *  4 points [-]

A torturing AI is most likely to arise from deliberate human action, because in many kinds of negotiation you want to threaten your opponent with the worst possible punishment. For example: "Convert your population to my religion, or I will subject your population to eternal torture by my AIs."