wedrifid comments on You're in Newcomb's Box - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
It's not about the money this time - but the implications for utility are the same. The 'million dollars' in Newcomb's problem is allocated in the same way that life is allocated in this problem. In this problem the money is basically irrelevant because it is never part of Prometheus' decision. But existence in the world is part of the stakes.
The problem feels different to Newcomb's because the traditional problem was constructed to prompt the intuition 'but one-boxers get the money!'. Then the intuition goes ahead and dredges up reasoning strategies (TDT for example) that are able to win the $1,000,000 rather than the $1,000. But people's intuitions are notoriously baffled by anthropic-like situations. No intuition "um, for some reason making the 'rational choice' is making me worse off" is prompted, and so they merrily revert to CDT and fail.
Another way to look at it, which many people find helpful when considering standard Newcomb's, is that you don't know whether you are the actual person or the simulated person (or reasoning) that occurs while Omega/Prometheus is allocating the $1,000,000/life.
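That framing can be made concrete with a toy payoff calculation. This is only an illustrative sketch (all function names are hypothetical, and "simulation" here just means the predictor calls the agent's own decision procedure, as the comment below also notes):

```python
# Toy model of the "you might be the simulation" framing of Newcomb's problem.
# The predictor decides whether to fill the opaque box by running the agent's
# own decision procedure. From inside `strategy`, there is no way to tell
# whether a given call is the predictor's simulation or the real choice.

def predictor_fills_opaque_box(strategy):
    # The predictor "simulates" the agent: it calls the same decision
    # function the real agent will later use.
    return strategy() == "one-box"

def play_newcomb(strategy):
    opaque_full = predictor_fills_opaque_box(strategy)
    choice = strategy()  # the "real" decision, identical to the simulated one
    if choice == "one-box":
        return 1_000_000 if opaque_full else 0
    else:  # two-box: transparent $1,000 plus whatever is in the opaque box
        return (1_000_000 if opaque_full else 0) + 1_000

one_boxer = lambda: "one-box"
two_boxer = lambda: "two-box"

print(play_newcomb(one_boxer))  # 1000000
print(play_newcomb(two_boxer))  # 1000
```

Because the simulated and real calls cannot diverge, the agent effectively chooses both the box contents and the final pick at once, which is why one-boxing dominates under this model.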
If a consistent decision-making strategy is applied to both Newcomb's and this problem, then those who one-box Newcomb's but two-box in this problem are making the same intuitive mistake as those who think Quantum Suicide is a good idea based on MWI assumptions.
I didn't get it until I read this line:
So the question is: is Prometheus running this simulation? If so, he will create you only if you one-box.
So it's not that you were created by Prometheus; it's that you might currently be being created by Prometheus, in which case you want to get Prometheus to keep on creating you.
Or, more generally: if I enter into a situation that involves an acausal negotiation with my creator, I want to agree with my creator so as to be created. This type of decision is likely to increase my measure.
Due to my current beliefs about metaverses I would still two-box, but I now understand how different metaverse theories would lead me to one-box; because I assign a nontrivial chance that I will later be convinced of other theories, I'm wondering if a mixed strategy would be best... I don't really know.
Lest my words be a source of confusion note that I use 'simulation' as an example or 'proof of concept' for how the superintelligence may be doing the deciding. He may be using some other rule of inference that accurately models my decision making. But that doesn't matter to me.
I agree with you here I believe. I didn't mean to imply that Prometheus was literally running the simulation, just that phrasing it in this way made the whole thing "click" for me.
I think my phrasing is the potential source of confusion.