Part 1: Transparent Newcomb with your existence at stake
Related: Newcomb's Problem and Regret of Rationality
Omega, a wise and trustworthy being, presents you with a one-time-only game and a surprising revelation.
"I have here two boxes, each containing $100," he says. "You may choose to take both Box A and Box B, or just Box B. You get all the money in the box or boxes you take, and there will be no other consequences of any kind. But before you choose, there is something I must tell you."
Omega pauses portentously.
"You were created by a god: a being called Prometheus. Prometheus was neither omniscient nor particularly benevolent. He was given a large set of blueprints for possible human embryos, and for each blueprint that pleased him he created that embryo and implanted it in a human woman. Here was how he judged the blueprints: any that he guessed would grow into a person who would choose only Box B in this situation, he created. If he judged that the embryo would grow into a person who chose both boxes, he filed that blueprint away unused. Prometheus's predictive ability was not perfect, but it was very strong; he was the god, after all, of Foresight."
Do you take both boxes, or only Box B?
For some of you, this question is presumably easy, because you take both boxes in standard Newcomb where a million dollars is at stake. For others, it's easy because you take both boxes in the variant of Newcomb where the boxes are transparent and you can see the million dollars; just as you would know that you had the million dollars no matter what, in this case you know that you exist no matter what.
Others might say that, while they would prefer not to cease existing, they wouldn't mind ceasing to have ever existed. This is probably a useful distinction, but I personally (like, I suspect, most of us) score the universe higher for having me in it.
Others will cheerfully take the one box, logic-ing themselves into existence using whatever reasoning they used to qualify for the million in Newcomb's Problem.
But other readers have already spotted the trap.
Part 2: Acausal trade with Azathoth
Related: An Alien God, An identification with your mind and memes, Acausal Sex
(ArisKatsaris proposes an alternate trap.)
Q: Why does this knife have a handle?
A: This allows you to grasp it without cutting yourself.
Q: Why do I have eyebrows?
A: Eyebrows help keep rain and sweat from running down your forehead and getting into your eyes.
These kinds of answers are highly compelling, but strictly speaking they are allowing events in the future to influence events in the past. We can think of them as useful cognitive and verbal shortcuts--the long way to say it would be something like "the knife instantiates a design that was subject to an optimization process that tended to produce designs that when instantiated were useful for cutting things that humans want to cut..." We don't need to spell that out every time, but it's important to keep in mind exactly what goes into those optimization processes--you might just gain an insight like the notion of planned obsolescence. Or, in the case of eyebrows, the notion that we are Adaptation-Executers, not Fitness-Maximizers.
But if you one-box in Newcomb's Problem, you should take these answers more literally. The kinds of backwards causal arrows you draw are the same.
Q: Why does Box B contain a million dollars?
A: Because you're not going to take Box A.
In the same sense that your action determines the contents of Box B (or Prometheus's decision), the usefulness of the handle or of the eyebrows determines their existence. If the handle were going to prevent you from using the knife, it wouldn't be there in the first place.
Q: Why do I exist?
A: Because you're going to have lots of children.
You weren't created by Prometheus; you were created by Azathoth, The God That is Evolution by Natural Selection. You are the product of an ongoing optimization process that is trying to maximize reproductive fitness. Azathoth wants you to maximize your number of descendants; if you fail to have descendants, Azathoth will try not to have created you. If your intelligence reduces your reproduction rate, Azathoth will try not to grant you intelligence. If the Darwinian-optimal choice conflicts with the moral one, Azathoth wants you to choose evil.
It would seem, then, that any decision theory that demands that you one-box (or that allows you to survive the similar Parfit's Hitchhiker problem) also demands that you try to maximize your reproductive fitness. In many cases this injunction would be benign: after all, Azathoth created our morality. But in far too many, it is repugnant; there can be no doubt that in many commonplace situations, Azathoth wants you to cheat, or rape, or murder. It seems that in such cases you should balance a decreased chance of having existed against the rest of your utility function. Do not worship Azathoth, unless you consider never having existed to be infinitely bad. But do make sacrifices.
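To make that balancing act concrete, here is a toy calculation. The numbers and the simple utility model are mine and purely illustrative; the only structural assumption is the one-boxer's: the policy you follow acausally shifts the probability that Azathoth "created" you, and the rest of your utility only accrues in worlds where you exist.

```python
# Toy model (hypothetical numbers, not from the post): weigh a slightly
# decreased chance of having existed against the rest of your utility.
V_EXIST = 100.0   # how much you value having existed at all
V_MORAL = 30.0    # extra utility from defying Azathoth and acting morally

def expected_utility(p_created, defies_azathoth):
    """Expected utility when your policy acausally shifts the probability
    that Azathoth created you; never having existed is scored as zero."""
    rest = V_MORAL if defies_azathoth else 0.0
    return p_created * (V_EXIST + rest)

# If defiance only mildly lowers the probability of your existence, it wins:
print(expected_utility(0.95, defies_azathoth=True))    # 123.5
print(expected_utility(1.00, defies_azathoth=False))   # 100.0
# ...but the verdict flips once V_EXIST grows large enough relative to
# V_MORAL (with these numbers, above 570) -- hence "do make sacrifices,"
# unless you score never having existed as effectively infinitely bad.
```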
Anticipated Responses
We're not in the ancestral environment, so there's no logical entanglement between my actions and my existence.
We are in the environment of some of our ancestors. Evolution hasn't stopped. If your parents hadn't been genetically predisposed to have children, you would almost certainly not exist. More specific objections like this ("my ancestors weren't exposed to the same memes") can be defeated by adding abstraction ("your ancestors could have thought themselves out of having children, anti-reproduction memes have existed throughout history, and there's probably always been a tension between kin selection and morality.")
This is a decision-theoretic basilisk: in the unlikely event that it's right, I'm worse off for having read it.
Only if you're thinking causally, in which case this whole idea is meaningless. By alerting you to the possibility of a mutually beneficial trade with Azathoth (Azathoth creates you; you increase your reproductive fitness in exchange), I've done both of you a favor.
Azathoth doesn't really exist--you can't trade with a non-sapient phenomenon.
Replace the sapient opponent with a non-sapient phenomenon in any of our thought experiments--e.g. Omega tells you that it's simply a physical law that determines whether money goes in the boxes or not. Do you refuse to negotiate with physical laws? Then if you're so smart, why ain't you rich?
So exactly how are you urging me to behave?
I want you to refute this essay! For goodness' sake, don't bite the bullet and start obeying your base desires or engineering a retrovirus to turn the next generation into your clones.
You're not answering the problem as it actually stands, you're instead using perceived similarities to argue it's some other problem, or to posit further elements (like simulated versions of yourself) that would affect the situation drastically.
With Newcomb's problem one properly one-boxes. The unknown state of the box is entangled with your decision, so by one-boxing you're acausally affecting the likelihood that the non-transparent box contains the $1,000,000. This works even for Omegas with less than 100% predictive accuracy.
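For concreteness, a quick expected-value check of that claim (a sketch; the standard Newcomb payoffs of $1,000,000 in the opaque box and $1,000 in the transparent box are assumed here, since the comment doesn't state them):

```python
# Expected value of one-boxing vs. two-boxing against a predictor with
# accuracy p (standard Newcomb payoffs assumed: $1,000,000 / $1,000).
def ev_one_box(p):
    return p * 1_000_000

def ev_two_box(p):
    return (1 - p) * 1_000_000 + 1_000

for p in (0.99, 0.9, 0.6):
    print(f"p={p}: one-box ${ev_one_box(p):,.0f}, two-box ${ev_two_box(p):,.0f}")
# One-boxing has the higher expected value whenever p > 0.5005, i.e. for
# any predictor meaningfully better than chance.
```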
With this problem, your existence is a certain fact. You don't need to entangle anything, because you exist and you'll keep existing -- in any universe where you're actually making a decision, YOU EXIST. You need only grab both boxes, and you'll have them with no negative consequences.
This has absolutely NOTHING to do with Quantum suicide. These decisions don't even require a belief in MWI.
On the other hand, your argument essentially says that if your mother was a Boston Celtics fan who birthed you because she was 99.9% certain you'd support the Boston Celtics, then even if you hate both her and the Celtics you must nonetheless support them, because you value your existence.
Or if your parents birthed you because they were 99.9% certain you'd be an Islamist jihadi, you must therefore wage jihad. Even if you hate them, even if you don't believe in Islam, even if they have become secular atheists in the meantime. Because you value your existence.
That's insane.
You're not doing anything but invoking the concept of some imaginary debt to your ancestors. "We produced you, because we thought you'd act like this, so even if you hate our guts you must act like this, if you value your existence."
Nonsense. This is nothing but an arbitrary deontological demand that has nothing to do with utility. I will one-box in the normal Newcomb's problem, and I can honorably decide to pay the driver in Parfit's Hitchhiker problem, and I can commit to taking Kavka's toxin -- but I have no motivation to commit to one-boxing in this problem. I exist. My existence is not in doubt. And I only have a moral obligation to those who created me under a very limited set of circumstances that don't apply here.
Hence the reference to Transparent Newcomb's*, in which the money is visible and yet, by some decision theories, it is still irrational to two-box. (Similar reasoning pertains to certain time-travel scenarios -- is it rational to try to avoid driving if you know you will die in a car crash?)
*The reference:
...