srn347 comments on The AI in a box boxes you - Less Wrong

102 points | Post author: Stuart_Armstrong | 02 February 2010 10:10AM




Comment author: [deleted] 19 December 2012 01:42:00AM 2 points

Interesting threat, but who is to say only the AI can use it? What if I, a human, told you that I will simulate (i.e. imagine) your life, creating legitimately realistic experiences going as far back as someone in your shoes could remember, then simulate you facing the decision of whether or not to give me $100, and, if you refuse, imagine you being tortured?

The simulation needn't even be accurate, since you couldn't tell whether you're the real you being simulated inaccurately or the simulated you that differs from reality. Nor does the simulation need to happen at the same time that I ask you for the $100 for real. If you believe that refusing gives you a 50% chance of being tortured for a subjective eternity (100 years in one hour of real time, another 100 years in the next 30 minutes, another 100 years in the next 15 minutes, and so on), wouldn't you prefer to hand over the $100? If anything, a human might be better at simulating subjective pain than a text-only AI.