David_Gerard comments on Bayesian Adjustment Does Not Defeat Existential Risk Charity - Less Wrong

43 Post author: steven0461 17 March 2013 08:50AM




Comment author: ArisKatsaris 17 April 2013 03:39:10PM *  3 points

"Just to be clear: are we saying that a factor of 3^^^3 is a Pascal's mugging, but a factor of 10^30 isn't?"

No. The problem with Pascal's mugging doesn't lie merely in the particular hoped-for payoff; it's that in extreme combinations of small chance and large payoff, the complexity of certain hypotheses doesn't seem sufficient to penalize them adequately (as per our intuitions).

If I said "give me a dollar, and I'll use my Matrix Lord powers to have three dollars appear in your wallet", someone can simply respond that the chances of me being a Matrix Lord are less than one in three, so the expected payoff is less than the cost. But we don't yet have a clear, mathematically precise way to explain why we should also respond negatively to "give me a dollar, and I'll use my Matrix Lord powers to save 3^^^3 lives", even though our intuition says we should (and in this case we trust our intuition).
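The small-stakes version of that wager can be checked numerically. This is just a sketch; the probability estimate `p_matrix_lord` is an illustrative assumption (any value below 1/3 gives the same verdict):

```python
# Expected-value check for the $1-for-$3 "Matrix Lord" offer.
cost = 1.0    # dollars handed over
payoff = 3.0  # dollars promised

# Refusing is rational whenever the probability of the claim being true
# is below cost/payoff (here, one in three).
p_matrix_lord = 1e-6  # assumed, generously high, credence in the claim

expected_payoff = p_matrix_lord * payoff
print(expected_payoff < cost)  # True: expected gain doesn't cover the cost
```

With 3^^^3 lives in place of $3, the same arithmetic points the other way for any remotely representable probability, which is exactly why the problem resists this simple treatment.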

To put it in brief: Pascal's Mugging is an interesting problem regarding decision theory which LessWrongers should be hoping to solve (I have an idea towards that direction, which I'm writing a discussion post about, but I'd need mathematicians to tell me if it potentially leads to anything); not just a catchphrase you can use to bash someone else's calculations when their intuitions differ from yours.

Comment author: David_Gerard 17 April 2013 09:31:33PM 2 points

I certainly consider that if you multiply a very tiny probability by a huge payoff and then expect others to take your calculation seriously as a call to action, you're being silly, however it's labeled. Humans can't even consider very tiny probabilities without privileging the hypothesis.

Comment author: private_messaging 17 April 2013 09:46:38PM *  2 points

Note also that a crazy mugger could demand $10 or else 10^30 people outside the Matrix will die, and then argue that you should rationally trust him 100%, so the figure is 10^29 lives/$; or argue that it is 90% certain that those people will die, because he's a bit uncertain about the danger in the alternate worlds; or the like. It's not about the probability the mugger estimates, it's about the probability that the typical payer estimates.
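The point that only the payer's estimate matters can be made concrete. A minimal sketch, where the payer's credence `payer_probability` is an assumed illustrative value (the mugger's claimed certainty comes from the comment above):

```python
# The mugger's stated confidence never enters the payer's calculation;
# only the payer's own credence in the claim does.
lives_at_stake = 10**30
cost_dollars = 10

mugger_probability = 1.0   # mugger insists you should trust him 100%
payer_probability = 1e-40  # assumed: the payer's actual credence

# Expected lives saved per dollar, from each perspective:
mugger_figure = mugger_probability * lives_at_stake / cost_dollars  # ~10^29
payer_figure = payer_probability * lives_at_stake / cost_dollars    # ~10^-11

# The mugger's headline number is astronomically large; the payer's is
# negligible, so the payer declines regardless of what the mugger asserts.
print(mugger_figure > 1 > payer_figure)
```

The same $10 demand yields a "figure" differing by some seventy orders of magnitude depending on whose probability you plug in, which is the commenter's point.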