Sly comments on I played the AI Box Experiment again! (and lost both games) - Less Wrong
Yeah, winning is trivial - you just don't open the damn box. It can't get more trivial than that. (Although you didn't say whether your opponent had already proven themselves by winning as the AI against others a few times?)
It's still worth thinking about though, because something about my model of humans is off.
I didn't expect so many people to lose. I just don't know how to update my model of people to one where there are so many people who could lose the AI box game. The only other major thing I can think of that persists in challenging my model this way (and continues to invite my skepticism despite seemingly trustworthy sources) is hypnosis.
It's possible the two have a common root, and I can explain both observations with one update.
I think a lot of gatekeepers go into it not actually wanting to win. If you go in just trying to have fun and roleplay, that is different from trying to win a game.
Possibly, but what about the descriptions of emotional turmoil? I'm assuming the report of the game isn't all part of the role-play.
I know that I personally go into competitive games with a different mindset than the mindset I have when roleplaying.
If they went into it trying to roleplay, emotions should be expected. Describing that turmoil in the report is just accurate reporting.
Both my gatekeepers from this game went in with the intent to win. Granted, I did lose these games, so you might have a point, but I'm not sure it makes as large a difference as you think it does.
That wasn't true of the original game.