datadataeverywhere comments on You're in Newcomb's Box - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Either I don't get it, or you are misapplying a cached thought. Please explain to me where my reasoning is wrong (or perhaps where I misunderstand the problem):
When answering Newcomb's problem, we believe Omega is a reliable predictor of what we will do, and based on that prediction places money accordingly.
In this problem, Prometheus always believes (by construction!) that we will one-box, and so will always place money according to that belief. In that case, the allocation of money will be the same for the people who one-box (most people, since Prometheus is a good predictor) as for the people who two-box.
You could make an alternate argument that even if you want to two-box, Prometheus' near-infallibility means you are unlikely to (after all, if everyone did, he would be a terrible predictor), but that's different from answering what you should do in this situation.
It's not about the money this time - but the implications for utility are the same. The 'million dollars' in Newcomb's problem is allocated in the same way that life is allocated in this problem. In this problem the money is basically irrelevant because it is never part of Prometheus' decision. But existence in the world is part of the stakes.
The problem feels different from Newcomb's because the traditional problem was constructed to prompt the intuition 'but one-boxers get the money!'. That intuition then dredges up reasoning strategies (TDT, for example) that are able to win the $1,000,000 rather than the $1,000. But people's intuitions are notoriously baffled by anthropic-like situations. No intuition of "um, for some reason making the 'rational choice' is making me worse off" is prompted, and so they merrily revert to CDT and fail.
Another way to look at it, which many people find helpful when considering standard Newcomb's, is that you don't know whether you are the actual person or the simulated person (or reasoning) that occurs while Omega/Prometheus is allocating the $1,000,000/life.
If a consistent decision-making strategy is applied to both Newcomb's problem and this one, then those who one-box Newcomb's but two-box this problem are making the same intuitive mistake as those who think quantum suicide is a good idea based on MWI assumptions.
Well, I definitely am confused. What utility are you gaining or losing?
Is this an issue about your belief that you are created by Prometheus? Is this an issue about your belief in Omega or Prometheus' honesty? I'm very unclear what I can possibly stand to gain or lose by being in a universe where Prometheus is wrong versus one where he is right.