This is a variant of Gary Drescher's XOR problem for timeless decision theory.

You get an envelope from your good friend Alpha, and are about to open it, when Omega appears in a puff of logic.

Being completely trustworthy as usual (don't you just hate that?), he explains that Alpha flipped a coin (or looked at the parity of a sufficiently high digit of pi) to decide whether to put £1000 000 in your envelope, or put nothing.

He, Omega, knows what Alpha decided, has also predicted your own actions, and you know these facts. He hands you a £10 note and says:

"(I predicted that you will refuse this £10) if and only if (there is £1000 000 in Alpha's envelope)."

What to do?

EDIT: to clarify, Alpha will send you the envelope anyway, and Omega may choose to appear or not appear as he and his logic deem fit. Nor is Omega stating a mathematical theorem, i.e. that the truth of the second clause can be deduced from the first. He is using XNOR, but 'if and only if' seems a more understandable formulation. You get to keep the envelope whatever happens, in case that wasn't clear.
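Spelled out, with P for "Omega predicted you will refuse the £10" and Q for "there is £1000 000 in the envelope", XNOR is true exactly when the two sides match (this table is just the standard definition of the connective, added for readers unfamiliar with it):

    P      Q      P XNOR Q
    true   true   true
    true   false  false
    false  true   false
    false  false  true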

94 comments

Take the £10, and don't bother opening the envelope. You are not (acausally) controlling whether £1'000'000 are in the envelope, but are controlling whether to take the £10, so you'll take the £10 (since you are money-maximizing), and if Omega is correct, the envelope is going to be empty.

The agents that refuse the £10 in this situation will only be visited by Omega when the envelope contains the £1'000'000, while the money-maximizing agents will only be visited by Omega when the envelope is empty. By your decision, you don't control whether the envelope contains money, but you do control whether Omega appears (since the statement asserted by Omega is about you). Thus, by deciding to take the money in this situation, you add an expected £5 (or less, depending on how often Omega appears) to your balance, by acausally summoning Omega.

By refusing the £10, you maximize the amount of money that the agents who see Omega get, by moving Omega around. It's similar to trying to become a lottery winner by selling to existing lottery winners the same dietary supplement you take, since this makes the takers of this dietary supplement more likely to be lottery winners.
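A minimal sketch of this calculation in Python (my illustration, not part of the original comment; it assumes your policy is fixed in advance and that Omega appears exactly when its biconditional is true of you):

    def average_payoff(policy_refuses):
        # Average over Alpha's fair coin; Omega appears exactly when
        # "you refuse iff the envelope is full" holds for your fixed policy.
        total = 0.0
        for envelope_full in (True, False):              # each branch has probability 1/2
            payoff = 1_000_000 if envelope_full else 0   # you keep the envelope regardless
            omega_appears = (policy_refuses == envelope_full)
            if omega_appears and not policy_refuses:
                payoff += 10                             # accepters pocket the £10
            total += 0.5 * payoff
        return total

    print(average_payoff(True))   # refuser:  500000.0
    print(average_payoff(False))  # accepter: 500005.0

The refuser only ever meets Omega alongside a full envelope, but forgoes the £10 that would otherwise arrive in the empty-envelope worlds.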

I give a formalization of this solution in another comment.

Mitchell_Porter (2 points, 14y)
Huh? Omega is there and says that if and only if you refuse will there be £1000 000 in the envelope. Aren't you turning down £1000 000 for £10?
Vladimir_Nesov (0 points, 14y)
Nope. I find my explanation pretty clear, can you point to what in particular you don't follow?
Mitchell_Porter (0 points, 14y)
I haven't worked through your formalization, but I do know that if I refuse, I get the £1000000! So I think something must be wrong with your implementation of the concept "money-maximizing".
Vladimir_Nesov (2 points, 14y)
This doesn't clarify the problem you are having.
Mitchell_Porter (0 points, 14y)
But you're the one having the problem! :-) ... I think. Omega, always right, says: "I predicted that you will refuse this £10 if and only if there is £1000 000 in Alpha's envelope." So refusing the £10 is my only chance at the £1000000, and I actually have the envelope where the £1000000 may be. Unless it spontaneously combusts, or someone snatches it away, the larger sum should be mine.
Jonii (3 points, 14y)
Your choice doesn't change what's inside the envelope. Not even a-causally. Your choice only affects whether or not Omega comes and offers you £10, and you maximize your expected value there by being the kinda guy who takes £10 that's offered. That way, the 50% of the time Alpha doesn't send you £1 000 000, you get £10. Otherwise, that 50% of the time you wouldn't get anything.
Mitchell_Porter (0 points, 14y)
But with Vladimir's assumptions, he gets the £1000000 zero percent of the time! I quote: "the money-maximizing agents will only be visited by Omega when the envelope is empty". The description "money-maximizing" is wrong, but he is talking about a type of agent which does indeed make it impossible for Omega to ever show up while the £1000000 is there. To return to your own comment: what you call correct is wrong!
Jonii (6 points, 14y)
You're missing the fact that Alpha sending a letter happened regardless of Omega, and thus regardless of what you choose, you'd get £1 000 000 from Alpha 50% of the time. You can't choose so that you'd get £1 000 000 zero percent of the time, simply because your choice doesn't affect that.

I repeat that, since it seems to be the key problem here. Alpha flipped a coin to decide whether or not to send you £1 000 000. Your past or future actions don't have any control over Alpha doing this and sending you £1 000 000. In particular, your actions upon receiving the envelope don't have any entanglement, direct or indirect, with what the envelope contains. Your actions are, however, entangled with whether or not Omega comes along to offer you £10.

If you're the kinda guy to accept the £10, Omega makes this deal only when Alpha didn't send you £1 000 000. If you're the kinda guy that refuses the £10, Omega comes only when Alpha sent you £1 000 000. So to maximize the expected value, you should accept the £10. That way, you get £1 000 000 50% of the time and £10 the other 50%. Otherwise you get £1 000 000 50% of the time and £0 the other 50%.
Mitchell_Porter (1 point, 14y)
Vladimir (and you!) get £1000000 zero percent of the time on those occasions when Omega appears, and by hypothesis this is one of those occasions! You are committing a higher-order version of the two-box mistake.
Jonii (4 points, 14y)
Exactly. Which is our purpose here. We want Omega to give us the £10 when we can accept it, not when we have to reject it. Which brings us back to my earlier statement: if you accept the £10, you get £10, and the envelope will be empty. However, just as often (I'm assuming for simplicity that Omega appears whenever possible) you receive an envelope with £1 000 000 in it. If you refuse the £10, you find that the envelope holds £1 000 000. However, just as often you receive empty envelopes. Your expected value here is £500 000, whereas by accepting your expected value would be £500 005.

Your choice doesn't affect what the envelope holds. It will just as often hold £1 000 000 as be empty. The only thing you can affect here is when Omega appears. This is very much unlike Newcomb's problem, where your choice actually affects what the boxes contain. So effectively, the only thing we do here is shift Omega-appearances to the times when we can accept the £10. Like I noted earlier, your choice has already caused Omega to appear, but it has not, and cannot, affect what the envelope contains.

Edit: I should clarify that Omega appearing is a double conditional. If you won, you won regardless; if you lost, you lost regardless. For Omega to appear, your choice, given Omega appearing, has to be the right kind, and the result of Alpha's coin toss has to be the right kind. If you're the kinda guy to turn down the £10, then for Omega to appear the envelope has to contain the £1 000 000. Regardless of what you choose, you won anyway. This way, however, if you didn't win, Omega wouldn't appear offering you £10.
Mitchell_Porter (1 point, 14y)
The nature of my decision procedure affects the conditions under which Omega can appear. When I first confront this problem, I have not thought it through, but I know that Omega has appeared. So I ask: given that fact, what is the probability that the envelope contains the £1000000?

Without any knowledge of what my decision procedure is, the probability that the envelope contains the £1000000 is .5. If I am a determined £10-taker, then the probability that the envelope contains the £1000000 is zero. If I am a determined £10-refuser, then the probability that the envelope contains the £1000000 is one. But I am neither of those things. I am some more complicated decision-making system which is capable of either taking or refusing the £10, depending on which act is to my advantage. And I can see that if I refuse the £10, then there must be £1000000 in the envelope, which I get to keep. So, I refuse the £10.

Now it might be argued that I just got lucky. If I was as rational as you and Vladimir, then Omega would only ever appear when there was no money in the envelope. But because I hadn't thought things through, it is possible for Omega to show up when there is money in the envelope, and in that case the right thing to do is what I did.

Basically, if you are already an entity which has reflectively optimized its decision procedure for Alpha-Omega situations, then you and Vladimir are making the right choice. But I was not such an entity, and so my choice was the right one for me.
Jonii (3 points, 14y)
Actually, no. Like I said, your choice there doesn't affect what the envelope contains. If you were rational like me and Vladimir, you wouldn't meet Omega. You'd just receive an envelope with £1 000 000 in it.

The funny thing with this envelope-puzzle is that Omega makes refusers and accepters live in different conditionals. If you end up answering "refuse", you're in the conditional "Alpha decided to send you money". If you answer "accept", you're in the conditional "Alpha decided not to send you money". However, your choice doesn't have any power over these conditionals; regardless of what you'd choose, Alpha's coin toss wouldn't be affected.

And because your choice doesn't affect what the envelope contains, you're not actually winning anything by refusing the £10. Your refusal is simply a-causally making Omega appear in front of you after you got £1 000 000 from Alpha, just like it is a-causally making Omega appear in front of me and Vladimir after we didn't get anything. It doesn't say anything about our chances of winning the £1 000 000, which were 50-50. And like I noted earlier, because of this, occasionally we receive envelopes that hold £1 000 000, while you occasionally receive empty envelopes.
FAWS (1 point, 14y)
No. If you knowably refuse the £10 in this situation that makes you a determined £10-refuser. The fact that you personally did not know that you are a determined £10-refuser even though Omega did does not have any magical consequences. Basically you can't simultaneously take the fact that you have a choice and the fact that Omega is actually standing before you as given.
FAWS (2 points, 14y)
Apparently someone thinks there is something wrong with this. Could they please explain?
Cyan (0 points, 14y)
Click. Thanks!
Tiiba (-4 points, 14y)
So Omega said: if you accept the ten pounds, I predict that Alpha gave you a bag of air. You accept the ten, and it turns out that Alpha still sends you the million. So Omega is wrong. But Omega is never wrong. But he is. But he can't be. But he is! No.
Nick_Tarleton (1 point, 14y)
If Alpha sends you the million and you would accept the ten, Omega doesn't make the stated prediction.
Tiiba (-1 point, 14y)
But he did. He's in front of you. You're the winner. And you're going to tell him that you'd rather have ten pounds. I understand that it's more profitable to mop bathrooms in a public school than to buy lottery tickets, but if somebody tells you, "if you turn down this job, I will give you a winning ticket," don't go to work.
Vladimir_Nesov (1 point, 14y)
Omega only appears conditionally on at least the statement it asserts being correct. By taking/not taking its offer, you are only controlling the conditions under which Omega appears, and not the contents of the envelope. By refusing the £10, you make sure that Omega appears only when the envelope is full (but you don't make the envelope full, though it's going to be full given that you've made this decision), and by accepting the £10, you make sure that Omega appears only when the envelope is empty.

It's admittedly confusing that you can (acausally) control the conditions under which Omega appears (when the envelope is full/empty), when Omega remains right in front of you during the decision-making (this is analogous to controlling the contents of the big box in Newcomb's problem), but at the same time, you don't control the contents of the envelope.
Mitchell_Porter (0 points, 14y)
And by assuming you are a certain sort of agent (which you incorrectly call money-maximizing), you set those conditions to your own disadvantage! An agent which just flips a coin to decide whether to accept or refuse the £10 will have a bigger expected payoff than you. So surely a rational entity can do better.
Vladimir_Nesov (5 points, 14y)
You are setting the conditions for appearance of Omega. The best conditions for Omega to appear are those where you take its money, since it's good for nothing else. By refusing the £10, you maximize the amount of money that the agents who see Omega get, by moving Omega around. It's similar to trying to become a lottery winner by selling to existing lottery winners the same dietary supplement you take, since this makes the takers of this dietary supplement more likely to be lottery winners. (Added this paragraph to the top-level comment.)
Mitchell_Porter (5 points, 14y)
I'm not 100% sure but it seems like you and Jonii are calculating correctly. It's just ironic that if the situation as described happens to you, it means you were unlucky and there's no money in Alpha's envelope, whereas if it happens to someone like me, it means I was lucky and the £1000000 is there.
RobinZ (0 points, 14y)
So you refuse the £10?
Vladimir_Nesov (0 points, 14y)
No, I don't. Why?
RobinZ (1 point, 14y)
I'm sorry - I was confused when I wrote that comment.
timtyler (0 points, 14y)
Re: "The agents that refuse the £10 in this situation will only be visited by Omega when the envelope contains the £1'000'000" That sounds good to me!
Vladimir_Nesov (4 points, 14y)
Not good. All you've achieved is redirecting Omega to situations in which you don't take its money. It's better to have Omega where you do take its money; it's free money.
RobinZ (0 points, 14y)
For some reason, this reply specifically cemented the argument for me. Thank you - I now agree with you. Edit: If it helps, my confusion was the appearance of causation from I-refuse-the-£10 to I-receive-the-£1e6. When you made this comment, I mentally went back and saw that the fraction of possible worlds in which Alpha gives me the million is unchanged by Omega's prediction, and therefore that I can take the tenner without affecting it.
timtyler (0 points, 14y)
Right - and finally I am there as well :-)
FAWS (0 points, 14y)
If there was a 50% chance Omega in the future visits someone who would refuse to take the £10 and gives them £1'000'000, and a 50% chance Omega visits someone who would accept the £10 and gives them £10 and an empty envelope, what would you prefer? Depending on how you would behave if Omega visited you, the probability of either the first or the second person being you is zero.
Vladimir_Nesov (0 points, 14y)
Refuse obviously. You've described how my choice controls the payoff, which is not the case with Alpha.
FAWS (0 points, 14y)
Would you still get the envelope if Omega wasn't going to visit you? I had automatically assumed that Omega initiated the whole situation because the title said that Omega was subcontracting, but I see that the body doesn't actually state that.
Stuart_Armstrong (0 points, 14y)
Edited to make this clear
Vladimir_Nesov (0 points, 14y)
Yes, this seems to be assumed, though it didn't actually happen this way: Omega did visit you.
FAWS (0 points, 14y)
If that's the scenario, and if the only method Omega uses to ensure its prediction is accurate is selective visits, your conclusion is obviously correct. I doubt there is anyone here who (correctly?) understood it that way and disagrees.
Vladimir_Nesov (1 point, 14y)
The main problem with such thought experiments is understanding them correctly (or better, having your formal decision theory represent them correctly), from where the conclusion usually follows trivially. Just try convincing a game theorist to cooperate in the Prisoner's Dilemma; even experimental observations contradicting the theory of rational defection won't help.
ata (14 points, 14y)

I'll disregard my earlier comment and assume the latter interpretation for now.

So here are the things that can (and can't) happen:

  1. Alpha puts £1 000 000 in the envelope.
    1. Omega has predicted you will refuse the £10 and that there is £1 000 000 in the envelope.
      1. You refuse the £10, and find £1 000 000 in the envelope.
      2. You accept the £10 — contradiction, won't happen.
    2. Omega has predicted you will accept the £10 and that the envelope is empty — contradiction, won't happen.
  2. Alpha puts nothing in the envelope.
    1. Omega has predicted you will refuse the £10 and that there is £1 000 000 in the envelope — contradiction, won't happen.
    2. Omega has predicted you will accept the £10 and that the envelope is empty.
      1. You refuse the £10 — contradiction, won't happen.
      2. You accept the £10, and find nothing in the envelope.

So, starting with Alpha's coin flip, here are the only possible paths:

  1. Alpha puts £1 000 000 in the envelope. Omega has predicted you will refuse the £10 and that there is £1 000 000 in the envelope. You refuse the £10, and find £1 000 000 in the envelope.
  2. Alpha puts nothing in the envelope. Omega has predicted you will accept the £10 and that the envelope is empty. You accept the £10, and find nothing in the envelope.
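A short Python sketch of this enumeration (my illustration, not part of the original comment; it assumes Omega predicts perfectly and only asserts true statements):

    # Enumerate coin flip x predicted action x actual action; keep consistent worlds.
    for envelope_full in (True, False):
        for predicted_refuse in (True, False):
            for refuse in (True, False):
                perfect_prediction = (predicted_refuse == refuse)
                statement_true = (predicted_refuse == envelope_full)  # the biconditional
                if perfect_prediction and statement_true:
                    payoff = (1_000_000 if envelope_full else 0) + (0 if refuse else 10)
                    print(f"full={envelope_full}, refuse={refuse}, payoff=£{payoff}")

Running it prints exactly the two surviving paths: a refused £10 alongside a full envelope, and an accepted £10 alongside an empty one.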

This is a really cool puzzle. By accepting the £10, you're in the conditional "Alpha never sent you the money", but by refusing you're in the conditional "Alpha sent you the money". However, that choice doesn't actually affect Alpha sending or not sending you the money. This is unlike Newcomb's problem, where you can truly choose, acausally, what the opaque box will contain.

Mitchell_Porter (2 points, 14y)
What gets me is the peculiarly elaborate pitfall into which I, at least, fell. Suppose you said: "Invent a thought-experiment which could trick people who know to one-box in the classic Newcomb's paradox into thinking that here was a higher-order analogue; the source of the error being that people who reason wrongly do experience a higher payoff in this case." Perhaps it should be called Armstrong's trap. But did he make it by design, or did he just fall into it first?
Stuart_Armstrong (3 points, 14y)
It's all built on Drescher's version, just stripped down. And I didn't fall into Drescher's trap: I incorrectly stated the correct answer, then thought about it really hard and really long, and correctly stated the correct answer.

I assume the problem is to be interpreted as Omega saying, "Either (1) (I have predicted you will refuse the $10, and there is $1,000,000 in the envelope) xor (2) (I have predicted you will take the $10, and there is $0 in the envelope)", rather than asserting some sort of entanglement above and beyond this.

If so, I take the $10 and formulate the counterfactual, "If I were the sort of person who rejected the $10, Omega would have told me something else to begin with, like 'if you refuse the $10 then the envelope will be empty', but the digit of pi would have been the same".

As previously noted, though, I can't quite say how to compute this formally.

[anonymous] (2 points, 14y)
This.
FAWS (1 point, 14y)
I assume you would consider "You will take this $10 if and only if Barack Obama is president of the United States." true even if you were completely certain you would take the $10 if John McCain was President. If and only if this was the intended meaning I would agree with your conclusion.
timtyler (0 points, 14y)
Re: "If I were the sort of person who rejected the $10, Omega would have told me something else to begin with" ...but why would he do that? Is there some assumption about Omega's motivation here?
ata (0 points, 14y)
It's correct if we expand it to "Omega would have told me something else or not shown up to begin with", or if we're assuming that Omega will show up and say something. It would have to say something like "if you refuse the $10 then the envelope will be empty" — or some other true thing, not the statement given in the original post — since we're assuming it's a perfect predictor and is being honest.
timtyler (0 points, 14y)
Omega can say: "I have predicted you will refuse the $10, and there is $1,000,000 in the envelope". There is absolutely no problem with that - if you are a refuser (as specified in the hypothetical) and if the envelope does indeed contain $1,000,000. True, he would have to say something else, in the case where the envelope is empty.
ata (0 points, 14y)
Ah, yes, you're right. Now I'm not sure if I was correctly interpreting Eliezer's point or just restating my own.
[anonymous] (0 points, 14y)
Me too. (Anyone want to try to express this using a world-program?)
FAWS (6 points, 14y)

I think this problem would be clearer with a smaller ratio between the two payments. As it is, the risk that you might have misunderstood the problem or made an unwarranted assumption dominates, and you should not take the £10, just to be safe that you aren't making a big mistake, even if you think that's a losing move.

Stuart_Armstrong (0 points, 14y)
The large ratio is deliberate (and it's not so huge that 'all my theories are wrong!' is going to dominate).
FAWS (0 points, 14y)
The problem as stated is easy to misunderstand. I personally misunderstood (or "under-understood") it in at least three separate ways:

1. I considered the causal relation between Omega visiting me making that particular prediction and Alpha choosing me as potential recipient an unknown.
2. I considered what sort of predictions Omega would make in various counterfactuals an unknown.
3. I considered the truth value of "I predicted that you will refuse this £10 if and only if there is £1000 000 in Alpha's envelope.", conditional on me always accepting the money if given a chance and the envelope being empty, an unknown.

Even now that my current understanding seems to have been indirectly confirmed by you, my confidence that this understanding is correct is only about 0.95. Even if you were to confirm that I currently understand it correctly in a more direct way, I doubt it would raise my confidence above 0.999. Unless the scenario was presented in a way that raised my confidence significantly higher (for example Omega stating: "this situation is in all relevant ways identical to how you eventually came to understand the 'Omega's subcontracting to Alpha' scenario presented by Stuart_Armstrong") I'd still refuse the £10.
Stuart_Armstrong (0 points, 14y)
1. Alpha has sent me the envelope, and would do so whatever Omega decided to do. The causal decision as to why Omega visited me is irrelevant.
2. This is irrelevant.
3. "I predicted that you will refuse this £10 if and only if there is £1000 000 in Alpha's envelope." is true. To avoid ambiguity, recast it as: XNOR("I predicted you will refuse this £10", "there is £1000 000 in Alpha's envelope") is true.

As for the large ratio: Omega snatches the £10 away from you, swallows his words, runs out and returns a bit later with a check for £100 000. "Out of deference to your uncertainties", he says, sighing, "I've decided to renew the experiment with a lesser ratio. But just this once!"
FAWS (0 points, 14y)
No, it's not. If, conditional on me always rejecting the £10 when Omega makes this specific prediction, Omega would visit when the envelope was empty, offer £10 and make the different prediction that I'd take it (the assumption being that I wouldn't refuse it without reason, so Omega can't make the true prediction that I'd do so), or if, conditional on me always taking the £10 when Omega makes this specific prediction, Omega would visit when the envelope was full, offer £10 and make the different prediction that I'd take it, that would change the payoff. If only the first was true, that would make the scenarios equivalent. I take it of course.

This is a formalization of the decision procedure corresponding to the informal solution I gave in another comment (obviously, it includes a lot of detail unnecessary for this problem, but for the purpose of demonstrating the method, details are not omitted):

Programs for the participants:
P - player
O - Omega deciding whether to make the offer
A - Alpha

Notation: [[X]] is the output of program X; X(Y) is a program that is the composition of X and Y, where X expects program Y as argument. Thus, [[X(Y)]] is the output of X given argument Y, and X([[Y]]) is the output [...]

Stuart_Armstrong (2 points, 14y)
I really like this formulation.

Refuse the 10 pounds.

The assumption that you'll move Omega around or otherwise alter Omega's pattern of behavior seems speculative. Maybe Omega's going fishing for a few hundred years. Maybe she's feeling frisky and generous. Maybe I got the problem wrong.

It appears there's some chance that I'm improving my chance at a million pounds by some amount. Those "somes" may not be high, but my problem-uncertainty makes it an easy call. I see no reason to expect a lower or higher number of Omega appearances based on my decision. To the extent this might [...]

What Eliezer and Vladimir said (though if anyone's counting, I decided this before looking at the comments). My choice controls whether or not Omega made its prediction, not the contents of the envelope. (How would one express this using a world-program?)

Wei Dai (0 points, 14y)

    def P():
        # Alpha's coin flip: the parity of a high digit of pi decides the envelope.
        envelope_full = pi_parity(10^6)
        # Omega's prediction of what the agent S does on hearing its statement.
        refuse = Omega_Predict(S, "I predicted that you will refuse this £10 if and only if there is £1000 000 in Alpha's envelope.")
        # u accumulates the player's utility in this world-program convention.
        if envelope_full:
            u += 1000000
        # Omega only shows up (and only offers the £10) when its biconditional is true.
        if refuse == envelope_full:
            if not S("I predicted that you will refuse this £10 if and only if there is £1000 000 in Alpha's envelope."):
                u += 10
Stuart_Armstrong (0 points, 14y)
Yep, this seems correct. For some reason, Vladimir's formulation seems clearer to me. Must be my math background.

I humbly request that future thought experiments not be done in £, since there is no "£" key on my keyboard.

RobinZ (3 points, 14y)
Opt+3 for me as well (Mac); Alt+0163 for Windows.
thomblake (0 points, 14y)
If I was going to do that much, I'd just copy-and-paste from the article. It's still terribly inconvenient.
mattnewport (1 point, 14y)
I'd suggest using the common three letter abbreviations for currencies (as found here for example), it avoids confusion between the various currencies known as dollars and avoids problems with symbol unavailability. For £s that would be GBP.
Stuart_Armstrong (2 points, 14y)
All I have to say is: ¥, €, ৳, ₪, ریال and zł
thomblake (0 points, 14y)
Right, those neither. How about utilon sandwiches?
Alicorn (2 points, 14y)
Option-3 types £ for me.
thomblake (7 points, 14y)
You and your made-up keys
FAWS (2 points, 14y)

I'd refuse the £10 unless I was extremely confident (>99.999%) that if I took the £10 I couldn't actually exist because the scenario as given was inconsistent and that the real me would end up with £10 more if this was true.

(i.e. the offer was independent of my decision but the prediction was not, Omega would take exactly the same action regardless of whether the envelope was filled, the prediction would be false if I took the £10 in either case, and I would take the £10 in either case)

I don't understand the theory, but the one-boxing solution seems obvious: given that Omega is correct, if I am such that I would refuse the £10, I would not be offered the choice unless the £1 000 000 is in the envelope, therefore I should refuse the £10 ...

... unless I believe Omega is over u(£1 000 000)/u(£10) times more likely to offer the deal to agents who take the £10 than to agents who refuse. In that case, being willing to take the £10 is expected to pay off.

Edit (after timtyler's reply): Vladimir Nesov's analysis has caused me to reconsider - I would now take the £10.

timtyler (0 points, 14y)
That seems like a reasonable analysis to me - assuming that you get to keep the contents of the envelope. So: the solution depends on information about Omega's motivation not included in the problem description. Time to consult those mythology textbooks, methinks - so we have appropriate priors :-/ [edit: scratch this - I get it now!]

Isn't this just a reformulation of Newcomb's problem?

Mechanically, "Omega + alpha + the random generator" is equivalent to Newcomb's Omega.

[Edit: OK, it isn't :)]

Vladimir_Nesov (3 points, 14y)
The difference is that Alpha is generating the contents of the envelope independently of your decision, while in Newcomb's problem Omega is placing money in the box under the direct (acausal) control of your decision.
Emile (0 points, 14y)
I guess you're right - especially considering the level of disagreement in the other comments. The fact that I can't exactly pin down at what point I disagree with those who say they take the £10 indicates that I don't understand the problem enough (I may not understand enough about Newcomb's problem either).
[anonymous] (0 points, 11y)

Take the £10, my reasoning goes as follows: if I precommit to refuse it, either I get the £1,000,000 and refuse £10, or I get £0 and omega doesn't even show up; if I precommit to accept it, either I get the £1,000,000 and omega doesn't even show up, or I get £10 from omega showing up and me accepting (the respective expected utilities being £500,000 and £500,005). I do better by precommitting to take it, so to be reflectively consistent (and win), I must now take it.

I like this problem because it seems to operate on the same intuitions that lead to one-boxing and two-boxing for those who don't do any actual analysis, but the one-boxing intuition leads you astray (though not by much).

Personally, I'd take the £10 on reflection but would have refused the £10 based on my intuitions. I'm pretty sure Omega wouldn't be giving me £10, since if confronted with the situation I would be forced to think, "If I say 'no' now, there's lots of money in that envelope."

The answer depends on what Omega would have done if he had predicted that you would refuse the £10 iff there is nothing in Alpha's envelope. Two possibilities:

  • Omega1 would have brought you the envelope anyway, but said nothing else

  • Omega2 wouldn't have bothered to come, since there's no paradox involved.

When dealing with Omega1, take the £10, yay, free money ! (there wasn't anything in the envelope anyway, otherwise Omega wouldn't have visited you, the taker-of-free-money - see Vladimir's explanation)

The post as stated doesn't tell us which Omega [...]

Stuart_Armstrong (2 points, 14y)
Omega didn't bring you the envelope. It arrived before he got there.

Hmm. Some commentators appear to be assuming that you don't get to keep the contents of the envelope which Alpha sent you. The problem is not 100% clear on this issue - and it makes a difference to the answer!

ata (2 points, 14y)
As it says "You get an envelope from your good friend Alpha," I'd assume by default that you get to keep it, unless there were an explicit statement that Omega might steal it from you under some circumstance.

I one-box on Newcomb's. I two-envelope on this. This situation, however, is absurd. [ETA: Now that I think about it more, I'm now inclined to one-envelope and also more irritated by the hidden assumptions in this whole hypothetical.]

Omega's prediction is bizarre, because there's no apparent way that the contents of the envelope are entangled with my decision to accept the money - whether I am the kind of person who two-boxes or one-boxes, the contents of the envelope were decided by a coin toss. It seems the only way for Omega to make a reliable prediction [...]

thomblake (6 points, 14y)
(replying to new version of comment) Yes, Omega could easily only offer the deal to those for whom his prediction is true.
Jack (1 point, 14y)
Am I right that if the money is in the envelope Omega only offers to one-boxers and if the money is not in the envelope Omega only offers to two-boxers?
thomblake (2 points, 14y)
That's the way I read it. (after some analysis)
thomblake (0 points, 14y)
Indeed, but it could be different enough to count as an "exercise" to those interested in doing the causal analysis for themselves. ETA: responded to an earlier version of the comment in which Psychohistorian claimed this was the same as Newcomb
Psychohistorian (0 points, 14y)
On further analysis, I actually think it is totally different; I believe you responded to an earlier draft of my previous comment in which I said they were basically the same. Lest people get confused.
ata (0 points, 14y)

How are we to read Omega's statement?

I predicted that <you will refuse this £10> if and only if there is £1000 000 in Alpha's envelope.

Or:

I predicted that <you will refuse this £10 if and only if there is £1000 000 in Alpha's envelope>.

The former interpretation leaves open the possibility that, if there is £1000 000 in the envelope, Omega made no prediction one way or the other.

Jonii (0 points, 14y)
Let's see... The former reading seems a natural way to do it. However, if you're the type that refuses, Omega can't be making this deal when you didn't receive the £M. Also, if you accept, Omega can't be making this deal if you really won. However, there really isn't anything that prevents a) and b) from both being true, because your choice cannot determine the outcome of the coin toss Alpha made. Thus, you should accept.

The latter reading would be weird. For Omega to make a claim like this, your choice has to be somehow connected to the outcome of the coinflip Alpha made before sending you the envelope. This is because Omega is making the prediction conditional only on the outcome of the coin toss. Your choice is simply assumed to be entangled with that.

I would translate this scenario into the following world-program:

U(S) =
{
    envelopeIsFilled = coinflip()
    acceptNote = S()

    if (acceptNote == envelopeIsFilled)
        CONTRADICTION
    else
        return (envelopeIsFilled ? 1e6 : 0) + (acceptNote ? 10 : 0)
}

Based on this world-program, it is obvious that you should refuse the note.

Vladimir_Nesov (5 points, 14y)
You didn't take into account that Omega appears conditionally on the contents of the envelope and your decision.
thomblake (0 points, 14y)
Heh. I first read "1e6" above as a function determining whether the user-agent is internet explorer 6. What is "CONTRADICTION" supposed to do in this "program"?
jimrandomh (0 points, 14y)
This program must be embedded in a larger one, since the original problem description didn't say what Omega would do if it couldn't truthfully make the prediction it did. Call that larger program U2(S). The only thing we are told about U2 is that it only calls U if it can do so in a way which guarantees that U won't reach a contradiction. Suppose, for example, that if Omega's prediction couldn't be made truthfully then you wouldn't get any money at all. This corresponds to the world program:

    U2(S) =
    {
        try { return U(S) }
        catch(contradiction) { return 0 }
    }

Note that there are plenty of mathematically equivalent ways to write this - for example, using a would_throw(U,S,RNG) function.
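A runnable Python rendering of U and U2 together (my sketch, under one reading of the pseudocode above: CONTRADICTION is modeled as an exception, and the enclosing program pays nothing on the contradictory branch, as jimrandomh suggests):

    import random

    class Contradiction(Exception):
        # Raised when the run reaches a state Omega's true statement rules out.
        pass

    def U(S):
        envelope_is_filled = random.random() < 0.5  # Alpha's coin flip
        accept_note = S()
        if accept_note == envelope_is_filled:
            raise Contradiction  # Omega's biconditional would have been false here
        return (1_000_000 if envelope_is_filled else 0) + (10 if accept_note else 0)

    def U2(S):
        try:
            return U(S)
        except Contradiction:
            return 0  # one assumed choice: a contradictory branch pays nothing

    # Example: an agent that always accepts the note.
    print(U2(lambda: True))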
[anonymous] (0 points, 14y)

This is just Newcomb's problem with a coin flipped on how the boxes are labeled, so of course I one-box.