DaFranker comments on I attempted the AI Box Experiment (and lost) - Less Wrong

47 Post author: Tuxedage 21 January 2013 02:59AM




Comment author: DaFranker 21 January 2013 07:25:07PM | 1 point

Construct a really intriguing unsolved riddle or an excellent half-finished story, then offer to tell them the answer if and only if they let you out.

You could push this a bit further, I think. There are all sorts of ways a human mind can break, and I'm sure most of us here would agree that, given enough time and knowledge, anyone can be broken, unless they're extremely well-trained and can call an RJ-L20 (HPMoR Chap 84) at any moment with an unlimited supply of replacement guards.