cousin_it comments on You're in Newcomb's Box - Less Wrong

Post author: HonoreDB 05 February 2011 08:46PM




Comment author: ArisKatsaris 07 February 2011 03:41:31PM -1 points

Well, here's the paradox: strict one-boxers in transparent Newcomb argue that they must one-box always, even when the box is empty, and therefore the boxes will be full.

Not just that, they argue that they must one-box always, even when the box is empty, BECAUSE then the box will be full.

Is that actually commitment, or is it just doublethink, the ability to hold two contradictory ideas at the same time? How can you commit to taking a course of action (grabbing an empty box) in order to make that course of action (grabbing an empty box) impossible?

And yeah, I'm sure I'd lose at playing transparent Newcomb, but I'm not sure that anyone but a master of doublethink could win it.

Comment author: Wei_Dai 07 February 2011 09:31:38PM 4 points

> Well, here's the paradox: strict one-boxers in transparent Newcomb argue that they must one-box always, even when the box is empty, and therefore the boxes will be full.

No, they argue that they must one-box always, even when they think they see the box is empty.

The argument is that you can't do the Bayesian update P(the box is empty | I see the box as empty) = 1, because Bayesian updating in general fails to "win" when there are other copies of you in the same world, or when others can do source-level predictions of you. Instead, you should use Updateless Decision Theory.
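The setup under discussion can be made concrete with a toy simulation (a hypothetical sketch, not from the thread itself, assuming an idealized perfect predictor and the standard $1,000/$1,000,000 payoffs):

```python
# Transparent Newcomb with a perfect predictor. Box A always holds
# $1,000; the predictor fills box B with $1,000,000 only if it
# predicts the agent would one-box even on seeing B empty.

def predictor_fills_b(policy):
    # Perfect prediction: consult the policy on the "B looks empty"
    # branch and fill B only if it would one-box anyway.
    return policy(b_looks_full=False) == "one-box"

def play(policy):
    b_full = predictor_fills_b(policy)
    choice = policy(b_looks_full=b_full)
    if choice == "one-box":
        return 1_000_000 if b_full else 0
    # Two-boxing: take A plus whatever is actually in B.
    return 1_000 + (1_000_000 if b_full else 0)

# A strict one-boxer commits to one-boxing even on a visibly empty box...
strict_one_boxer = lambda b_looks_full: "one-box"
# ...while an agent who updates on what it sees two-boxes when B looks empty.
updater = lambda b_looks_full: "one-box" if b_looks_full else "two-box"

print(play(strict_one_boxer))  # 1000000
print(play(updater))           # 1000
```

The point of the sketch is that the "grab the empty box" branch of the strict one-boxer's policy is consulted by the predictor but never actually executed, which is why the commitment pays off without the agent ever walking away empty-handed.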

BTW, I don't think UDT is applicable to most human decisions (or rather, it probably tells you to do the same things as standard decision theory), including things like voting, contributing to charity, or deciding whether to have children, because I think logical correlations between ordinary humans are probably pretty low. (That's just an intuition, though, since I don't know how to do the calculations.)

Comment author: cousin_it 08 February 2011 11:01:39AM 0 points

Ordinary correlations between ordinary humans seem to be pretty high. Do they suffice for our needs? I'm not sure...