Vaniver comments on Discussion: Yudkowsky's actual accomplishments besides divulgation - Less Wrong
Which suggests to me that as soon as people actually feel a bit of real fear, rather than just role-playing, they become mostly immune to Eliezer's charms.
With an actual boxed AI, though, you probably want to let it out if it's Friendly. It's possibly the ultimate high-stakes gamble: you certainly have more to be afraid of than in a low-stakes roleplay, but you also have far more to gain.