ArisKatsaris comments on You're in Newcomb's Box - Less Wrong

Post author: HonoreDB 05 February 2011 08:46PM


Comment author: ArisKatsaris 02 February 2011 09:37:50AM 8 points

You're not answering the problem as it actually stands; instead, you're using perceived similarities to argue it's some other problem, or positing further elements (like simulated versions of yourself) that would affect the situation drastically.

With Newcomb's problem one properly one-boxes. The unknown state of the box is entangled with your decision, so by one-boxing you're acausally affecting the likelihood that the non-transparent box contains $1,000,000. This works even for Omegas with less than 100% probability of predictive success.
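To make that concrete, here is a quick expected-value sketch. The $1,000,000 and $1,000 payoffs are the standard ones; the accuracy figures are illustrative assumptions, not numbers from this thread:

```python
# Expected value of one-boxing vs. two-boxing against an imperfect predictor.
# Assumed payoffs: $1,000,000 in the opaque box, $1,000 in the transparent one.

def expected_values(accuracy, big=1_000_000, small=1_000):
    # If you one-box, the opaque box is filled whenever the predictor was right.
    one_box = accuracy * big
    # If you two-box, the opaque box is filled only when the predictor was wrong,
    # but you always pocket the small box.
    two_box = (1 - accuracy) * big + small
    return one_box, two_box

for p in (0.999, 0.9, 0.6):
    ob, tb = expected_values(p)
    print(f"accuracy {p:.3f}: one-box EV ${ob:,.0f}  vs  two-box EV ${tb:,.0f}")
```

Under this (evidential) reading, one-boxing comes out ahead for any accuracy meaningfully above chance, not just for a perfect Omega.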

With this problem, your existence is a certain fact. You don't need to entangle anything, because you exist and you'll keep existing -- in any universe where you're actually making a decision, YOU EXIST. You only need to grab both boxes, and you'll have them both with no negative consequences.

This has absolutely NOTHING to do with Quantum suicide. These decisions don't even require a belief in MWI.

On the other hand, your argument essentially says that if your mother was a Boston Celtics fan who birthed you because she was 99.9% certain you'd support the Boston Celtics, then even if you hate both her and the Celtics you must nonetheless support them, because you value your existence.

Or if your parents birthed you because they were 99.9% certain you'd be an Islamist jihadi, you must therefore go jihad. Even if you hate them, even if you don't believe in Islam, even if they have become secular atheists in the meantime. Because you value your existence.

That's insane.

You're not doing anything but invoking the concept of some imaginary debt to your ancestors. "We produced you, because we thought you'd act like this, so even if you hate our guts you must act like this, if you value your existence."

Nonsense. This is nothing but an arbitrary deontological demand that has nothing to do with utility. I will one-box in the normal Newcomb's problem, and I can honorably decide to pay the driver in Parfit's Hitchhiker problem, and I can commit to taking Kavka's toxin -- but I have no motivation to commit to one-boxing in this problem. I exist. My existence is not in doubt. And I only have a moral obligation to those who created me under a very limited set of circumstances that don't apply here.

Comment author: MugaSofer 15 January 2013 11:24:03AM -2 points

> With Newcomb's problem one properly one-boxes. The unknown state of the box is entangled with your decision, so by one-boxing you're acausally affecting the likelihood that the non-transparent box contains $1,000,000.

Hence the reference to Transparent Newcomb's*, in which the money is visible and yet, by some decision theories, it is still irrational to two-box. (Similar reasoning pertains to certain time-travel scenarios - is it rational to try and avoid driving if you know you will die in a car crash?)

*The reference:

> For others, it's easy because you take both boxes in the variant of Newcomb where the boxes are transparent and you can see the million dollars; just as you would know that you had the million dollars no matter what, in this case you know that you exist no matter what.

EDIT: whoops, ninja'd. By almost two years.

Do you still two-box in this situation?

Comment author: ArisKatsaris 15 January 2013 11:55:41AM 2 points

> Do you still two-box in this situation?

I've since decided that one-boxing in Transparent Newcomb is the correct decision -- because being the sort of agent that one-boxes is to be the sort of agent that is more frequently given a filled first box (I think I only fully realized this after reading Eliezer's paper on TDT, which I hadn't read at the time of this thread).

So the individual "losing" decision is actually part of a decision theory which is winning *overall*, and is therefore the correct decision no matter how counterintuitive.
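As a rough illustration of "winning overall": a toy simulation of fixed dispositions, where the predictor fills the first box based on the agent's type. The 0.99 accuracy and the payoffs are illustrative assumptions, not figures from the thread:

```python
import random

def run(disposition, accuracy=0.99, big=1_000_000, small=1_000):
    # The predictor reads the agent's disposition and is right with probability `accuracy`.
    predicted_one_boxer = (random.random() < accuracy) == (disposition == "one-box")
    box_filled = predicted_one_boxer          # Omega fills the first box iff it expects one-boxing
    if disposition == "one-box":
        return big if box_filled else 0
    return (big if box_filled else 0) + small # two-boxers always take the second box as well

random.seed(0)
trials = 100_000
for disposition in ("one-box", "two-box"):
    avg = sum(run(disposition) for _ in range(trials)) / trials
    print(f"{disposition}: average winnings ${avg:,.0f}")
```

On any single run where the box turns out to be filled, grabbing both boxes would have paid more; but the one-boxing disposition is the one that gets the filled box in the first place, so it wins on average.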

Mind you, as a practical matter, I think it's significantly harder for a human to choose to one-box in the case of Transparent Newcomb. I don't know if I could manage it if I was actually presented with the situation, though I don't think I'd have a problem with the case of classical Newcomb.