Morendil comments on Discussion: Yudkowsky's actual accomplishments besides divulgation - Less Wrong

31 Post author: Raw_Power 25 June 2011 11:02PM


Comment author: Morendil 26 June 2011 01:17:41PM 19 points [-]

The record of AI box experiments (those involving Eliezer) is as follows:

  • Experiment 1, vs Nathan Russell - AI win
  • Experiment 2, vs David McFadzean - AI win
  • Experiment 3, vs Carl Shulman - AI win
  • Experiment 4, vs Russell Wallace - GK win
  • Experiment 5, vs D. Alex - GK win
Comment author: CarlShulman 27 June 2011 09:54:30PM 8 points [-]

The last three experiments had much larger outside cash stakes (more than two orders of magnitude larger, I think). I suspect Russell and D. Alex may have been less indifferent about that than me, i.e. I think the record shows that Eliezer acquitted himself well with low stakes ($10, or more when the player is indifferent about the money) a few times, but failed with high stakes.

Comment author: Vaniver 28 June 2011 08:56:22PM 8 points [-]

I think the record shows that Eliezer acquitted himself well with low stakes ($10, or more when the player is indifferent about the money) a few times, but failed with high stakes.

Which suggests to me that as soon as people actually feel a bit of real fear, rather than just role-playing, they become mostly immune to Eliezer's charms.

Comment author: Desrtopa 15 November 2011 01:04:00AM -1 points [-]

With an actual boxed AI, though, you probably want to let it out if it's Friendly. It's possibly the ultimate high-stakes gamble. Certainly you have more to be afraid of than in a low-stakes roleplay, but you also have a lot more to gain.