SapientPearwood comments on Why do theists, undergrads, and Less Wrongers favor one-boxing on Newcomb? - Less Wrong Discussion

Post author: CarlShulman, 19 June 2013 01:55AM, 15 points

Comments (299)

You are viewing a single comment's thread.

Comment author: [deleted] 19 June 2013 09:46:20PM 6 points

Adding to your story, it's not just Eliezer Yudkowsky's introduction to Newcomb's problem. It's the entire Bayesian / Less Wrong mindset. Here, Eliezer wrote:

That was when I discovered that I was of the type called 'Bayesian'. As far as I can tell, I was born that way.

I felt something similar when I was reading through the sequences. Everything "clicked" for me - it just made sense. I couldn't imagine thinking another way.

Same with Newcomb's problem. I wasn't introduced to it by Eliezer, but I still thought one-boxing was obvious; it works.

Many Less Wrongers who have stuck around have probably had a similar experience: the Bayesian standpoint seems intuitive. Eliezer's support certainly helps to propagate one-boxing, but Less Wrongers seem to be a self-selecting group.

Comment author: [deleted] 25 May 2014 04:13:13PM 0 points

It also helps that most Bayesian decision algorithms actually implement the arg max_a sum_o U(o) P(o|a) reasoning of Evidential Decision Theory. So whenever you invoke your self-image as a capital-B Bayesian, you are semi-consciously invoking Evidential Decision Theory, which does get the right answer on Newcomb's problem, even if it messes up on other problems.
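The EDT calculation above can be made concrete. A minimal sketch in Python, with the usual assumed payoffs ($1,000 in the transparent box, $1,000,000 in the opaque box) and an assumed predictor accuracy of 0.99; none of these numbers come from the thread:

```python
# Hypothetical sketch: Evidential Decision Theory on Newcomb's problem.
# Assumptions (not from the thread): predictor accuracy 0.99; the opaque
# box holds $1,000,000 iff one-boxing was predicted; the transparent box
# always holds $1,000.

ACCURACY = 0.99  # assumed P(prediction matches the actual action)

def payoff(action, prediction):
    """Total money received given the agent's action and the prediction."""
    million = 1_000_000 if prediction == "one-box" else 0
    thousand = 1_000 if action == "two-box" else 0
    return million + thousand

def edt_expected_utility(action):
    """EDT conditions the predictor's state on the action actually taken."""
    return sum(
        payoff(action, prediction)
        * (ACCURACY if prediction == action else 1 - ACCURACY)
        for prediction in ("one-box", "two-box")
    )

best = max(("one-box", "two-box"), key=edt_expected_utility)
print(best)  # one-box
```

Under these numbers, one-boxing has expected utility around $990,000 versus around $11,000 for two-boxing, so the EDT arg max picks one-boxing, matching the comment's claim.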

(Commenting because I got here while looking for citations for my WIP post about another way to handle Newcomb-like problems.)