William comments on Bayesians vs. Barbarians - Less Wrong

Post author: Eliezer_Yudkowsky 14 April 2009 11:45PM




Comment author: John_Maxwell_IV 15 April 2009 11:48:07PM 0 points

You didn't mention in the Newcomb's Problem article that you're a one-boxer.

I'm a die-hard two-boxer, so perhaps someone can explain one-boxing to me. Let's say that Box A contains money to save 3 lives (if Omega predicts you'll take only it) or nothing, and Box B contains money to save 2 lives. Conditional on this being the only game Omega will ever play with you, why the hell would you take only Box A?
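The payoffs in this variant can be tabulated directly. A minimal sketch, assuming a perfect predictor (Omega fills Box A if and only if it predicts one-boxing); the function name and the "lives saved" units are illustrative, not from any canonical formulation:

```python
def lives_saved(choice, predicted):
    """Lives saved in the commenter's variant of Newcomb's Problem.

    choice, predicted: either "one-box" or "two-box".
    Box A holds funds to save 3 lives only if Omega predicted one-boxing;
    Box B always holds funds to save 2 lives.
    """
    box_a = 3 if predicted == "one-box" else 0  # filled before you choose
    box_b = 2
    if choice == "one-box":
        return box_a
    return box_a + box_b

# With a perfect predictor, your choice always matches the prediction:
print(lives_saved("one-box", "one-box"))  # 3
print(lives_saved("two-box", "two-box"))  # 2
```

Under a perfect predictor, one-boxing saves 3 lives and two-boxing saves 2, which is the one-boxer's case; the two-boxer's reply is that at decision time the boxes are already filled, and for either fixed prediction, two-boxing saves more.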

I suspect that what you one-boxers are actually doing is believing that a scenario like this will really occur, and trying to broadcast your intent to one-box so that Omega will put money in for you.

Comment author: William 16 April 2009 08:53:12PM 0 points

Through the composition of my mind, I can choose to save 3 lives by being willing to refuse the money that saves 2 lives. Or I can choose to save the 2 lives and thereby forgo the 3. Why the hell would I take both boxes?

Comment author: John_Maxwell_IV 17 April 2009 05:19:49PM 0 points

I guess that makes sense, if you have the option of choosing what the composition of your mind is.

Comment author: William 19 April 2009 03:27:44AM 0 points

"Composition of my mind" is a bad phrase for it, but what I mean is that I have a collection of neurons that say "I'm a one-boxer" or similar.