PhilGoetz comments on Bayesians vs. Barbarians - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
You didn't mention in the Newcomb's Problem article that you're a one-boxer.
As a die-hard two-boxer, perhaps someone can explain one-boxing to me. Let's say that Box A contains either money to save 3 lives (if Omega predicts you'll take only Box A) or nothing, and Box B contains money to save 2 lives. Conditional on this being the only game Omega will ever play with you, why the hell would you take Box A only?
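The payoff comparison above can be sketched numerically. This is only an illustrative sketch, not anything from the original comment: it assumes Omega predicts your choice with some accuracy p, and the function name and structure are made up for illustration.

```python
def expected_lives(action, p):
    """Expected lives saved, assuming Omega predicts the action with accuracy p.

    Box A holds money to save 3 lives iff Omega predicted one-boxing;
    Box B always holds money to save 2 lives.
    """
    if action == "one-box":
        # With probability p, Omega correctly predicted one-boxing and filled Box A.
        return p * 3
    elif action == "two-box":
        # With probability (1 - p), Omega wrongly predicted one-boxing,
        # so Box A is filled anyway; Box B is taken either way.
        return (1 - p) * 3 + 2
    raise ValueError(action)

# With a perfect predictor (p = 1.0), one-boxing saves 3 lives in
# expectation while two-boxing saves only 2 -- the one-boxer's case.
print(expected_lives("one-box", 1.0))   # 3.0
print(expected_lives("two-box", 1.0))   # 2.0
```

The two-boxer's objection is that once the boxes are filled, their contents are causally fixed, so taking both always adds Box B's 2 lives to whatever Box A holds; the disagreement is over whether to condition on Omega's prediction, not over this arithmetic.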
I suspect that all you one-boxers somehow believe a scenario like this one will actually occur, and are trying to broadcast your intent to one-box so that Omega will put money in the box for you.
You can find several long discussions of this on Overcoming Bias, and in earlier posts on Less Wrong.