Omid comments on "xkcd on the AI box experiment"

Post author: FiftyTwo, 21 November 2014 08:26AM

Comment author: Omid 23 November 2014 02:48:28PM 1 point

So just to be clear: If I say "I won't give in to the basilisk because Eliezer says I shouldn't", will that protect me from the basilisk? If not, what should I do?

Comment author: TobyBartels 23 November 2014 10:52:49PM 1 point

If you believe Eliezer, then you may take his word that the basilisk has ~0 probability of occurring. (I should find a citation for that, but I read it just a few minutes ago, somewhere in the discussion of this xkcd comic.) So you are already protected from it, because it does not exist (not even in ways relevant to acausal trade).

More broadly, you should decide to take this approach: never give in to blackmail by somebody who knows that you have decided to take this approach. Now they have no incentive to blackmail you, and you are safe, even if they do exist! (I think that the strategy in this paragraph has been endorsed by Eliezer, but don't trust me on that until you get a citation. Until then, you'll have to reason it out for yourself.)

Comment author: Jiro 26 November 2014 10:17:51PM 2 points

Now they have no incentive to blackmail you, and you are safe, even if they do exist!

How does that work if they precommit to blackmail even when there is no incentive (which benefits them by making the blackmail more effective)?
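
To make the incentive argument in these two comments concrete, here is a minimal toy payoff model in Python. Everything in it is an illustrative assumption: the payoff numbers, the reduction to a one-shot game, and the `payoffs` function itself are invented for this sketch and come from nothing stated in the thread.

```python
# Toy payoff model of the blackmail game discussed above.
# All numbers are illustrative assumptions, not claims from the thread.

PAYMENT = 10         # what the victim hands over if they give in
THREAT_HARM = 15     # harm to the victim if the threat is carried out
CARRY_OUT_COST = 2   # cost to the blackmailer of carrying out the threat

def payoffs(victim_refuses: bool, blackmailer_committed: bool):
    """Payoffs (victim, blackmailer) once a blackmail threat is made."""
    if not victim_refuses:
        return (-PAYMENT, PAYMENT)              # blackmail succeeds
    if blackmailer_committed:
        return (-THREAT_HARM, -CARRY_OUT_COST)  # threat carried out anyway
    return (0, 0)                               # empty threat is dropped

# TobyBartels's point: against a known refuser, an uncommitted blackmailer
# nets 0 from threatening, no better than abstaining, so there is no
# incentive to blackmail in the first place.
print(payoffs(victim_refuses=True, blackmailer_committed=False))  # (0, 0)

# Jiro's point: a blackmailer precommitted to following through changes the
# refuser's outcome to -15, hoping to make refusal untenable, though it
# still pays -2 itself whenever it actually meets a true refuser.
print(payoffs(victim_refuses=True, blackmailer_committed=True))   # (-15, -2)
```

Under these toy numbers, mutual commitment leaves both sides worse off than no blackmail at all, so the exchange of precommitments resembles a game of chicken.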

Comment author: ThisSpaceAvailable 26 November 2014 09:09:59AM 0 points

By "the basilisk", do you mean the infohazard, or do you mean the subject matter of the inforhazard? For the former, whatever causes you to not worry about it protects you from it.

Comment author: wedrifid 26 November 2014 11:46:57AM -1 points

By "the basilisk", do you mean the infohazard, or do you mean the subject matter of the inforhazard? For the former, whatever causes you to not worry about it protects you from it.

Not quite true. There are more than two relevant agents in the game. The behaviour of the other humans can hurt you (and potentially make it useful for their creation to hurt you).