TheOtherDave comments on I played the AI Box Experiment again! (and lost both games) - Less Wrong Discussion

Post author: Tuxedage, 27 September 2013 02:32AM

Comment author: TheOtherDave, 30 September 2013 04:00:03AM

I haven't read Tuxedage's writeups in their entirety, nor am I likely to, so I'm at a loss as to how emotional turmoil and psychological warfare could be evidence that the gatekeeper doesn't think there's something more important at stake than winning the game.

That said, I'll take your word for it that in this case they are, and that Tuxedage's transcripts constitute a counterexample to my model.

Comment author: Ishaan, 30 September 2013 04:10:53AM

I'm only speaking of things written in the OP:

Losing felt horrible. By attempting to damage Alexei's psyche, I, in turn, opened myself up to being damaged. I went into a state of catharsis for days.

...and such.

That said, I'll take your word for it that in this case they are, and that Tuxedage's transcripts constitute a counterexample to my model.

No, don't do that; I made a mistake.

I guess I just assumed that "you should open the box to convince people of the danger of AI"-type arguments aren't emotionally salient.

But that was a bad assumption: you never limited yourself to just that one argument, but spoke of going meta in general. You're right that there exist arguments that go meta and are emotionally salient.

I suppose you could think of some convoluted timeless-decision-theory reason for opening the box. History has shown that some people on LW find timeless blackmail threats emotionally upsetting, though they seem to be in the minority.

Comment author: TheOtherDave, 30 September 2013 01:39:55PM

there exist arguments that might go meta and be emotionally salient

Oh, absolutely. Actually, the model I'm working from here is my own experience of computer strategy games, in which I frequently find myself emotionally reluctant to "kill" my units and thus look for a zero-casualties strategy. All of which is kind of absurd, of course, but there it is.