William comments on Bayesians vs. Barbarians - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (270)
You didn't mention in the Newcomb's Problem article that you're a one-boxer.
As a die-hard two-boxer, I'd like someone to explain one-boxing to me. Let's say that Box A contains money to save 3 lives (if Omega thinks you'll take it alone) or nothing, and Box B contains money to save 2 lives. Conditional on this being the only game Omega will ever play with you, why the hell would you take Box A only?
I suspect what all you one-boxers are really doing is acting as if a scenario like this one will actually occur, broadcasting your intent to one-box so that Omega will put money in for you.
Through the composition of my mind, I can choose to save 3 lives by being willing to refuse the money that saves 2. Or I can choose to save the 2 lives and thus forgo the 3. Why the hell would I take both boxes?
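The payoff arithmetic in this exchange can be made explicit. Here is a toy sketch (the function and its structure are my own illustration, using the numbers from the comment above: Box A saves 3 lives only if Omega predicted one-boxing, Box B always saves 2):

```python
def lives_saved(choice, prediction):
    """Lives saved, given your choice and Omega's prediction.

    Box A holds enough to save 3 lives if Omega predicted one-boxing,
    nothing otherwise. Box B always holds enough to save 2 lives.
    """
    box_a = 3 if prediction == "one-box" else 0
    box_b = 2
    return box_a if choice == "one-box" else box_a + box_b

# With a perfectly accurate Omega, the prediction matches the choice:
print(lives_saved("one-box", "one-box"))   # one-boxer saves 3 lives
print(lives_saved("two-box", "two-box"))   # two-boxer saves only 2

# The two-boxer's dominance argument: holding the prediction fixed,
# taking both boxes is always at least as good...
print(lives_saved("two-box", "one-box"))   # 5 lives, if Omega mispredicts
```

The disagreement is over which comparison is the right one: the one-boxer compares outcomes where prediction tracks choice (3 vs. 2), while the two-boxer holds the prediction fixed and notes that two-boxing dominates in each column.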
I guess that makes sense, if you actually have the option of choosing what the composition of your mind is.
"Composition of my mind" is a bad phrase for it, but what I mean is that I have a collection of neurons that say "I'm a one-boxer" or similar.