LauraABJ comments on The AI in a box boxes you - Less Wrong

102 points | Post author: Stuart_Armstrong 02 February 2010 10:10AM




Comment author: Stuart_Armstrong 02 February 2010 01:50:50PM 2 points

As long as the probability of it telling the truth is positive, it could increase the number of copies of you it tortures/claims to torture (and torture them all in subtly different ways)...
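A minimal sketch of the expected-utility arithmetic behind this point (the credence p, per-copy disutility d, and threshold are made-up illustration values, not anything stated in the thread): for any fixed p > 0, the AI can choose a copy count N large enough that p · N · d exceeds any finite bound.

```python
import math

def copies_needed(p: float, d: float, threshold: float) -> int:
    """Smallest N such that the expected disutility p * N * d
    strictly exceeds the given threshold."""
    return math.floor(threshold / (p * d)) + 1

# Even at a one-in-a-billion credence that the threat is real,
# a modest per-copy disutility crosses any fixed threshold
# once N is large enough:
print(copies_needed(p=1e-9, d=1.0, threshold=1e6))  # on the order of 10**15
```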

Comment author: LauraABJ 02 February 2010 03:07:47PM 7 points

Pascal's mugging...

Anyway, if you are sure you are going to hit the reset button every time, then there's no reason to worry, since the torture will end as soon as the real copy of you hits reset. If you don't, then the whole world is absolutely screwed (including you), so you're a stupid bastard anyway.

Comment author: byrnema 02 February 2010 04:58:00PM 5 points

Yes, the copies are depending upon you to hit reset, and so is the world.

Comment author: jacob_cannell 04 February 2011 05:22:55AM 4 points

That would only be correct if hitting the reset button somehow kills or stops the AI.

If you don't have the power to kill/stop it, then the problem is somewhat more interesting.