Error comments on I attempted the AI Box Experiment (and lost) - Less Wrong Discussion

47 points | Post author: Tuxedage | 21 January 2013 02:59AM

Comments (244)

Comment author: drethelin | 22 January 2013 05:13:49AM | 5 points

You're consenting to have your mind attacked with all the mental weapons at someone's disposal. That's a lot scarier, because you're willingly giving up some measure of control over your own state to the other person, however difficult it may be for them to exploit. You're also being attacked as yourself, whereas the AI player is playing a role, and attacking from within that role. Their own mental wellbeing is at much less risk, unless they believe they have horrible depths they never want to sink to.

To make a shitty analogy: it's like being at the top of a tower while someone tries to knock it down with their bare hands. Even if they have very little chance of succeeding and have to expend far more effort than you, you're the one risking the greater pain.

Comment author: Error | 23 January 2013 11:26:47PM | 1 point

"Their own mental wellbeing is at much less risk, unless they believe they have horrible depths they never want to sink to."

If I remember right, that was at least part of why Eliezer stopped playing in the first place. Convincingly manifesting as a sociopath is non-trivial, and it invites some fairly heavy cognitive dissonance.