
Desrtopa comments on Can anyone explain to me why CDT two-boxes? - Less Wrong Discussion

-12 Post author: Andreas_Giger 02 July 2012 06:06AM


Comment author: Desrtopa 04 July 2012 02:10:47PM · 6 points

Omniscient Omega doesn't entail backwards causality, it only entails omniscience. If Omega can extrapolate how you would choose boxes from complete information about your present, you're not going to fool it no matter how many times you play the game.

Imagine a machine that sorts red balls from green balls. If you put in a red ball, it spits it out of Terminal A, and if you put in a green ball, it spits it out of Terminal B. If you showed a completely colorblind person how you could predict which terminal a ball would come out of before putting it into the machine, it might look to them like backwards causality, but only forwards causality is involved.

If you know that Omega can predict your actions, you should condition your decisions on the knowledge that Omega will have predicted you correctly.

Humans are predictable enough in real life to make this sort of reasoning salient. For instance, I have a friend who, when I ask her an open-ended question such as "You know what happened to me?" or "You know what I think is pretty cool?", will, more often than not, answer "Monkeys?" as a complete non sequitur (it's functionally her way of saying "no, go on"). Sometimes, however, she will instead say something like "No, what?" A number of times, I have contrived situations where the correct answer is "monkeys," but only asked the question when I predicted that she would not say "monkeys." So far, I have predicted correctly every time; she has never correctly guessed "monkeys."

Comment author: Andreas_Giger 04 July 2012 02:39:12PM · 0 points

Omniscient Omega doesn't entail backwards causality, it only entails omniscience. If Omega can extrapolate how you would choose boxes from complete information about your present, you're not going to fool it no matter how many times you play the game.

I agree if you say that a more accurate statement would have been "omniscient Omega entails either backwards causality or the absence of free will."

I actually assign a rather high probability to free will not existing; however, discussing decision theory under that assumption is not interesting at all.

Regardless of the issue of free will (which I don't want to discuss because it is obviously getting us nowhere), if Omega makes its prediction solely based on your past, then your past suddenly becomes an inherent part of the problem. This means that two-boxing-You either has a different past than one-boxing-You and therefore plays a different game, or that Omega makes the same prediction for both versions of you, in which case two-boxing-You wins.

Comment author: Desrtopa 04 July 2012 03:04:41PM · 1 point

Two-boxing-you is a different you than one-boxing-you. They make different decisions in the same scenario, so something about them must not be the same.

Omega doesn't make its decision solely based on your past, it makes the decision based on all information salient to the question. Omega is an omniscient perfect reasoner. If there's anything that will affect your decision, Omega knows about it.

If you know that Omega will correctly predict your actions, then you can draw a decision tree which crosses off the outcomes "I choose to two box and both boxes contain money," and "I choose to one box and the other box contains no money," because you can rule out any outcome that entails Omega having mispredicted you.
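That pruning step can be sketched as a filter over outcome pairs. This is a hypothetical illustration (the payoff values are the standard Newcomb amounts, and the `consistent` helper is my own naming), not anything from the original comment:

```python
# Enumerate the four naive outcomes of Newcomb's problem, then rule out
# those that would require Omega to have mispredicted.

outcomes = [
    ("one-box", "full", 1_000_000),
    ("one-box", "empty", 0),
    ("two-box", "full", 1_001_000),
    ("two-box", "empty", 1_000),
]

def consistent(choice, box):
    # A perfect predictor fills the opaque box iff it predicted one-boxing.
    return (choice == "one-box") == (box == "full")

feasible = [(c, b, pay) for c, b, pay in outcomes if consistent(c, b)]
print(feasible)  # only (one-box, full) and (two-box, empty) survive
```

Of the two outcomes that survive, one-boxing pays $1,000,000 and two-boxing pays $1,000, which is the whole argument in miniature.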

Probability is in the mind. The reality is that either one or both boxes already contain money, and you are already going to choose one box or both, in accordance with Omega's prediction. Your role is to run through the algorithm to determine what is the best choice given what you know. And given what you know, one boxing has higher expected returns than two boxing.
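To make that expected-returns comparison concrete, here is a minimal sketch. The payoffs are the standard Newcomb amounts; the predictor accuracy `p` is a hypothetical parameter I've added to cover imperfect predictors as well (the thread assumes Omega is perfect, i.e. p = 1):

```python
# Expected value of each strategy in Newcomb's problem, given a
# predictor that is correct with probability p.

def one_box_ev(p):
    # Predicted correctly (one-box): opaque box holds $1,000,000.
    # Mispredicted: opaque box is empty.
    return p * 1_000_000 + (1 - p) * 0

def two_box_ev(p):
    # Predicted correctly (two-box): opaque box is empty, keep $1,000.
    # Mispredicted: both boxes pay out.
    return p * 1_000 + (1 - p) * 1_001_000

# One-boxing has the higher expected value whenever p > 0.5005,
# so even a predictor barely better than chance favors one-boxing.
p = 0.99
print(one_box_ev(p), two_box_ev(p))
```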