I was recently reading about the Transparent Newcomb with Your Existence at Stake problem, which, to make a long story short, states that you were created by Prometheus, who foresaw that you would one-box on Newcomb's problem and wouldn't have created you if he had foreseen otherwise. The implication is that you might need to one-box just to exist. It's a disturbing problem, and as I read it an even more disturbing one started to form in my head. However, I'm not sure it's logically coherent (I'm really hoping it's not) and wanted to know what the rest of you thought. The problem goes:
One day you start thinking about a hypothetical nonexistent person named Bob, who is a real jerk. If he existed he would make your life utterly miserable. However, if he existed he would also want to make a deal with you: if he ever found himself existing in a universe where you had never existed, he would create you, on the condition that if you found yourself existing in a universe where he had never existed, you would create him. Hypothetical Bob is very good at predicting the behavior of other people, not quite Omega quality, but pretty darn good. Assume for the sake of the argument that you like your life and enjoy existing.
At first you dismiss the problem because of technical difficulties. Science hasn't advanced to the point where we can make people with such precision. Plus, there is a near-infinite number of far nicer hypothetical people who would make the same deal; when science reaches that point, you should give creating them priority.
But then you see Omega drive by in its pickup truck. A large complicated machine falls off the back of the truck as it passes you by. Written on it, in Omega's handwriting, is a note that says "This is the machine that will create Bob the Jerk, a hypothetical person that [insert your name here] has been thinking about recently, if one presses the big red button on the side." You know Omega never lies, not even in notes to itself.
Do Timeless Decision Theory and Updateless Decision Theory say you have a counterfactual obligation to create Bob the Jerk, the same way you have an obligation to pay Omega in the Counterfactual Mugging, and the same way you might (I'm still not sure about this) have an obligation to one-box when dealing with Prometheus? Does this in turn mean that when we develop the ability to create people from scratch we should tile the universe with people who would make the counterfactual deal? Obviously it's that last implication that disturbs me.
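To make my confusion concrete, here's a toy version of the policy comparison I imagine an updateless reasoner running. The probabilities and utilities are completely made up, and I'm not claiming this is what UDT or TDT actually prescribes; it's just a sketch of the bookkeeping:

```python
# Hypothetical numbers only -- a sketch of the UDT-style bookkeeping,
# not a claim about what TDT/UDT actually prescribe here.

# Two "branches" the updateless agent weighs before knowing which one it is in:
#   branch_you: you were created (by your parents); Bob was not.
#   branch_bob: Bob was created; you were not -- unless Bob keeps the deal.
P_BRANCH_YOU = 0.5    # assumed prior weight on each branch
P_BRANCH_BOB = 0.5

U_EXIST_ALONE = 10    # utility of existing without Bob around
U_EXIST_WITH_BOB = 2  # Bob makes your life miserable, but you still exist
U_NOT_EXIST = 0       # utility of never existing

def policy_value(create_counterpart: bool) -> float:
    """Expected utility of committing to the policy before knowing your branch.

    Bob is assumed to be a near-perfect predictor who keeps the deal
    exactly when you would keep it (the counterfactual-mugging analogy).
    """
    if create_counterpart:
        # branch_you: you press the button, so you exist alongside Bob.
        # branch_bob: Bob predicts you'd have pressed it, so he creates you.
        return (P_BRANCH_YOU * U_EXIST_WITH_BOB
                + P_BRANCH_BOB * U_EXIST_WITH_BOB)
    # branch_you: you exist alone; branch_bob: Bob never creates you.
    return (P_BRANCH_YOU * U_EXIST_ALONE
            + P_BRANCH_BOB * U_NOT_EXIST)

print("press the button:", policy_value(True))   # 2.0 with these numbers
print("walk away:       ", policy_value(False))  # 5.0 with these numbers
```

With these particular numbers walking away wins, but that's an artifact of my guesses: the comparison seems to hinge entirely on how badly Bob degrades your life and how much weight the prior puts on the branch where you were never created, which is part of what I'm hoping someone can untangle.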
I can think of multiple reasons why it might not be rational to create Bob the Jerk:
- It might not be logically coherent to refuse to update on the fact of your own existence, even in UDT (this would also imply that one should two-box when dealing with Prometheus).
- An essential part of who you are is the fact that you were created by your parents, not by Bob the Jerk, so the counterfactual deal isn't logically coherent. Someone he creates wouldn't be you; it would be someone else. At his very best he could create someone with a very similar personality who has falsified memories, which would be rather horrifying.
- An essential part of who Bob the Jerk is is that he was created by you, with some help from Omega. He can't exist in a universe where you don't, so the hypothetical bargain he offered you isn't logically coherent.
- Prometheus will exist no matter what you do in his problem, Bob the Jerk won't. This makes these two problems qualitatively different in some way I don't quite understand.
- You have a moral duty to not inflict Bob the Jerk on others, even if it means you don't exist in some other possibility.
- You have a moral duty to not overpopulate the world, even if it means you might not exist in some other possibility, and the end result of the logic of this problem implies overpopulating the world.
- Bob the Jerk already exists because we live in a Big World, so you have no need to fulfill your part of the bargain because he's already out there somewhere.
- Making these sorts of counterfactual deals is individually rational, but collectively harmful in the same way that paying a ransom is. If you create Bob the Jerk some civic-minded vigilante decision theorist might see the implications and find some way to punish you.
- While it is possible to want to keep on existing if you already exist, it isn't logically possible to "want to exist" if you don't already; this defeats the problem in some way.
- You also spend some time thinking about a hypothetical individual called Bizarro-Bob. Bizarro-Bob doesn't want Bob the Jerk to be created and is just as good at modeling your behavior as Bob the Jerk is. He has vowed that if he ends up existing in a universe where you'll end up creating Bob the Jerk, he'll kill you. As you stand by Omega's machine you start looking around anxiously for the glint of light off a gun barrel.
- I don't understand UDT or TDT properly, and they don't actually imply I should create Bob the Jerk, for some other reason I haven't thought of because of my lack of understanding.
Are any of these objections valid, or am I just grasping at straws? I find the problem extremely disturbing because of its wider implications, so I'd appreciate it if someone with a better grasp of UDT and TDT analyzed it. I'd very much like to be refuted.
I think one-boxing is the right response to Newcomb's problem, but I don't see any reason to one-box as a creation of Prometheus or to create Bob the Jerk. I would two-box in the Prometheus problem if I understand correctly that it would net me an extra hundred dollars (and that Prometheus won't hunt down "defective" creations). I'm saying this because maybe that means I just don't understand something, or because there's an implicit wrong step that I'm too inferentially removed from to make or figure out what it is.
Anyway, onto what might be wrong with your reasoning.
I'm putting what I think is the main thing you're wrong about at the front, having stumbled across it partway through writing this, but I'm still gonna leave the rest in.
The thing I think you're wrong about:
Unless blueprint generation works by models of individual actual people being psychically transferred (whole) onto blueprints across the multiverse, or into blueprint-makers' minds (e.g. Bob's or Jack's or Prometheus's), there's no reason why what exactly you, personally, choose to do should affect what blueprint Bob or Jack or Prometheus comes up with. Bob can just make a deal with you-except-will-make-deal-with-Bob, or any of the other limitless people he would make a deal with. It sounds like you think what you do changes what blueprints Bob/Prometheus choose from. This isn't shorthand. This is just backwards.
"The implication is that you might need to one-box just to exist." If you already exist you can't need to one box to exist.
Regarding Bob: why is he any more likely to exist than Jack, who will only create you if you won't create him (and is a nice guy to boot), if a Jack-creation machine falls off Omega's pickup truck? Those possibilities seem to be opposites. Are they equally likely? (And are they so low that even if being a Bob-creator gives you a better shot at existing, it's not worth making Bob?)
Quality over quantity? Is it worth the increased chance of existence (if there is any) to have Bob around?
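Here's the Bob-vs-Jack symmetry as a toy calculation, with made-up machine probabilities (I'm only illustrating the symmetry, not asserting the actual odds):

```python
# A toy symmetry argument with made-up numbers: if a Jack-machine
# (Jack creates only people who would NOT create their counterpart)
# is as likely to turn up as a Bob-machine, the existence bonus from
# being a button-presser cancels out, leaving only the disutility of
# having Bob around.
P_BOB_MACHINE = 0.5   # assumed: the machine that shows up is Bob's
P_JACK_MACHINE = 0.5  # assumed: the machine that shows up is Jack's

def p_exist(would_press: bool) -> float:
    """Probability of being created in the branch where you don't yet exist.

    would_press=True  -> you are the kind of person who presses the button.
    Bob creates button-pressers; Jack creates only non-pressers.
    """
    if would_press:
        return P_BOB_MACHINE * 1.0 + P_JACK_MACHINE * 0.0
    return P_BOB_MACHINE * 0.0 + P_JACK_MACHINE * 1.0

print("P(exist | would press):    ", p_exist(True))   # 0.5
print("P(exist | would not press):", p_exist(False))  # 0.5
```

If the two machines really are equally likely, being a Bob-creator buys you no extra chance of existing at all, and the only question left is whether you want Bob around.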
Do you really value like-you-ness? Given the chance, will you tile the universe with you-clones? Are you going to donate sperm en masse and/or have kids and raise them to be like you?
Won't Bob just make a deal with you-except-for-will-make-Bob if you won't make Bob? Will-or-won't-make-Bob is not an essential property of you-ness, right? It seems to be something that could go either way rather than a necessary consequence of the type of person you are, which is presumably what you might value.
"you might still want to create him to "guarantee" the life you had before he was around." You've already had the life you had before he was around. You won't guarantee anyone else having the life you live before you're around because he won't create you in the same circumstances. Unless you mean to increase the likelyhood of your memories existing in which case you can create a person with your memories anyway (if this was ever somehow real as opposed to omega driving by in a pickup truck.
OK, so the above, organised:
You probably don't value people like you existing, or at least not in all cases, e.g. if you are created by a jerk who makes your life a net negative. There's no way blueprints are generated by picking from actual existing people in other branches of a multiverse. You have no influence on which counterfactual "you" hypothetical Bobs might pick from design space. No information transfer. If it's an actual matter of cloning there might be other problems.
also,
"•An essential part of who you are is the fact that you were created by your parents, not by Bob the Jerk, so the counterfactual deal isn't logically coherent. Someone he creates wouldn't be you, it would be someone else."
The first sentence is an unlikely definition of "you," or can be stipulated to be false. The second is true regardless for my definition of "you." If you're talking to one of two clones, that one is "you" and the other one is "him," right? A clone of you is always someone else (that's the way I see it).
I mean it as just one part of a much larger definition that includes far more things than just that. The history of how I was created obviously affects who I am.
Of course, it's possible that Bob the Jerk really created me and then I was secretly adopted. But if that's the case then he already exists in this universe, so I have no obligation to create him regardless.
If an essential part of who Bob the Jerk is is that he was a member of the KKK in 1965, doe...