Stuart_Armstrong comments on Fundamentals of kicking anthropic butt - Less Wrong

18 Post author: Manfred 26 March 2012 06:43AM




Comment author: [deleted] 26 March 2012 10:19:23PM 4 points

I've always thought of SSA and SIA as assumptions that depend on what your goal is in trying to figure out the probability. Sleeping Beauty may want to maximize the probability that she guesses the coin correctly at least once, in which case she should use the probability 1/2. Or she may want to maximize the number of correct guesses, in which case she should use the probability 2/3.
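A quick Monte Carlo sketch (my own illustration, assuming the standard Sleeping Beauty protocol: one awakening on heads, two on tails) shows where both numbers come from. Counting per experiment, tails has probability 1/2; counting per awakening, two of every three awakenings happen under tails:

```python
import random

rng = random.Random(0)
trials = 200_000
tails_runs = 0
heads_awakenings = 0
tails_awakenings = 0

for _ in range(trials):
    heads = rng.random() < 0.5
    if heads:
        heads_awakenings += 1      # woken once (Monday)
    else:
        tails_runs += 1
        tails_awakenings += 2      # woken twice (Monday and Tuesday)

# Per-experiment view: the coin is fair.
p_tails_per_run = tails_runs / trials
# Per-awakening view: tails produces twice as many awakenings.
p_tails_per_awakening = tails_awakenings / (heads_awakenings + tails_awakenings)

print(round(p_tails_per_run, 3))        # close to 1/2
print(round(p_tails_per_awakening, 3))  # close to 2/3
```

Neither number is wrong; they answer different questions, which is the point of the comment above.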

In either case, asking "but what's the probability, really?" isn't helpful.

Edit: in the second situation, Sleeping Beauty should use the probability 2/3 to figure out how to maximize the number of correct guesses. This doesn't mean she should guess T 2/3 of the time -- her answer also depends on the payouts, and in the simplest case (she gets $1 for every correct guess) she should be guessing T 100% of the time.
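The edit's payout point can also be sketched in simulation (again my own illustration, assuming $1 per correct guess). The per-awakening probability 2/3 belongs in the expected-value calculation, but the optimal *policy* is deterministic: guessing T with probability 2/3 (probability matching) earns strictly less than guessing T every time:

```python
import random

def payout(guess_tails_prob, trials=100_000, seed=1):
    """Average dollars per experiment when guessing T with the given
    probability at each awakening ($1 per correct guess)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        heads = rng.random() < 0.5
        awakenings = 1 if heads else 2
        for _ in range(awakenings):
            guess_tails = rng.random() < guess_tails_prob
            if guess_tails != heads:   # guessed T on tails, or H on heads
                total += 1
    return total / trials

print(round(payout(1.0), 2))   # always T: about 1.00 per run
print(round(payout(2 / 3), 2)) # probability matching: about 0.83 per run
print(round(payout(0.0), 2))   # always H: about 0.50 per run
```

Analytically: always-T earns 0.5 × $2 = $1 per run, matching earns 0.5 × $1/3 + 0.5 × $4/3 = $5/6, and always-H earns $0.50, which is why the payout structure, not just the probability, fixes the strategy.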

Comment author: Stuart_Armstrong 05 April 2012 10:37:30AM 1 point

In either case, asking "but what's the probability, really?" isn't helpful.

Strongly agree. My paper here: http://arxiv.org/abs/1110.6437 takes the problem apart, considers the different components (utilities, probabilities, altruism towards other copies) that go into a decision, and shows that you can reach the correct decision without worrying about the probabilities at all.

Comment author: DanielLC 04 July 2012 03:47:34AM -1 points

You're wondering whether or not to donate to reduce existential risks. You won't donate if you're almost certain the world will end soon either way. You wake up as the 100 billionth person. Do you use this information to update on the probability that there will only be on the order of 100 billion people, and refrain from donating?