Eliezer_Yudkowsky comments on My Fundamental Question About Omega - Less Wrong

6 Post author: MrHen 10 February 2010 05:26PM



Comment author: MrHen 10 February 2010 08:30:57PM 5 points

I don't know how to respond to this or to Morendil's second comment. I feel like I am missing something obvious to everyone else, but when I read the explanations they seem to be about a completely unrelated topic.

Things like this:

You seem to be confused about free will. Keep reading the Sequences and you won't be.

confuse me, because as far as I can tell this has nothing to do with free will. I don't care about free will. I care about what happens when a perfect predictor enters the room.

Is such a thing just completely impossible? I wouldn't have expected the answer to this to be Yes.

If you do know what the prediction is, then the way in which you react to that prediction determines which prediction you'll hear. For example, if I walk up to someone and say, "I'm good at predicting people in simple problems, I'm truthful, and I predict you'll give me $5," they won't give me anything. Since I know this, I won't make that prediction. If people did decide to give me $5 in this sort of situation, I might well go around making such predictions.

Okay, yeah, so restrict yourself only to the situations where people will give you the $5 even though you told them the prediction. This is a good example of my frustration. I feel like your response is completely irrelevant. Experience tells me this is highly unlikely. So what am I missing? Some key component to free will? A bad definition of "perfect predictor"? What?

To me the scenario seems to be as simple as: If Omega predicts X, X will happen. If X wouldn't have happened, Omega wouldn't predict X.

I don't see how including "knowledge of the prediction" into X makes any difference. I don't see how whatever definition of free will you are using makes any difference.

"Go read the Sequences" is fair enough, but I wouldn't mind a hint as to what I am supposed to be looking for. "Free will" doesn't satiate my curiosity. Can you at least tell me why Free Will matters here? Is it something as simple as, "You cannot predict past a free will choice?"

As it is right now, I haven't learned anything other than, "You're wrong."

Comment author: Eliezer_Yudkowsky 10 February 2010 08:49:52PM 1 point

To me the scenario seems to be as simple as: If Omega predicts X, X will happen. If X wouldn't have happened, Omega wouldn't predict X.

Sounds like you might be having confusion resulting from circular mental causal models. You've got an arrow from Omega to X. Wrong direction. You want to reason, "If X is likely to happen, Omega will predict X."

Comment author: Cyan 10 February 2010 09:17:32PM 6 points

I believe the text you quote is intended to be interpreted as material implication, not causal arrows.

Comment author: MrHen 10 February 2010 09:08:05PM 3 points

Sure. So, X implies that Omega will predict X. The four possible states of the universe:

Where
X is "You will give Omega $5 if Y happens" and
Y is "Omega appears, tells you it predicted X, and asks you for $5":

1) X is true; Omega does Y
2) X is false; Omega does Y
3) X is true; Omega does not do Y
4) X is false; Omega does not do Y

Number two will not happen because Omega will not predict X when X is false. Omega doesn't even appear in options 3 and 4, so they aren't relevant. The last remaining option is:

X is true; Omega does Y. Filling it out:

X is "You will give Omega $5 if Omega appears, tells you it predicted X, and asks you for $5."
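The case elimination above can be sketched as a tiny consistency check. This is just an illustrative toy model (the variable names and the consistency rule are my own framing, not anything from the thread): a world is a pair (X, Y), and Omega's perfect prediction rules out any world where it does Y while X is false.

```python
# Toy model of MrHen's four cases. X = "you will give Omega $5 if Y happens",
# Y = "Omega appears, tells you it predicted X, and asks for $5".
# A perfect predictor never does Y in a world where X is false,
# so case 2 (X false, Y done) is inconsistent and gets filtered out.
consistent = [(x, omega_does_y)
              for x in (True, False)
              for omega_does_y in (True, False)
              if not (omega_does_y and not x)]

# Cases 3 and 4 (Omega never appears) are consistent but irrelevant;
# among the worlds where Omega actually shows up, only one survives.
observed = [case for case in consistent if case[1]]
print(observed)
```

Running this prints `[(True, True)]`: the only world in which Omega appears is the one where X is true, matching the conclusion "X is true; Omega does Y."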

Hmm... that is interesting. X includes a reference to X, which isn't a problem in language, but could be a problem with the math. The problem is not as simple as putting "you will give Omega $5" in for X because that isn't strictly what Omega is asking.

The easiest simplification is to take out the part about Omega telling you it predicted X... but that is a significant enough change that I consider it a different puzzle entirely.

Is this your objection?

Comment author: pengvado 11 February 2010 12:17:06AM 3 points

X is "You will give Omega $5 if Omega appears, tells you it predicted X, and asks you for $5."

That is an interesting math problem. And the math problem has a solution, which is called a quine. So the self-reference in the prediction is not by itself a sufficient objection to your scenario.

Comment author: MrHen 11 February 2010 12:21:28AM 1 point

Nice, thanks.