ThisSpaceAvailable comments on xkcd on the AI box experiment - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (229)
So just to be clear: If I say "I won't give in to the basilisk because Eliezer says I shouldn't", will that protect me from the basilisk? If not, what should I do?
By "the basilisk", do you mean the infohazard, or do you mean the subject matter of the infohazard? For the former, whatever causes you to not worry about it protects you from it.
Not quite true: there are more than two relevant agents in the game. The behaviour of other humans can hurt you (and potentially make it useful for their creation to hurt you).