Jonathan_Graehl comments on Being Half-Rational About Pascal's Wager is Even Worse - Less Wrong

Post author: Eliezer_Yudkowsky | 18 April 2013 05:20AM




Comment author: Yvain | 18 April 2013 06:46:53AM | 14 points

Are you classifying 10% as a Pascal-level probability? How big does a probability have to get before you don't think Pascal-type considerations apply to it?

Are you suggesting that if there were (for example) a ten percent probability of an asteroid hitting the Earth in 2025, we should devote fewer resources to asteroid prediction/deflection than simple expected utility calculations would predict?

Comment author: Jonathan_Graehl | 18 April 2013 07:51:46AM | 0 points

I didn't like his anecdote, either.

I think you've read him wrong. He's opposed, on the basis of heroism, to the heuristic "don't pay attention to high-utility * small-probability scenarios".

Comment author: Eliezer_Yudkowsky | 18 April 2013 05:04:10PM | 5 points

I'm usually fine with dropping a one-time probability of 0.1% from my calculations. 10% is much too high to drop from a major strategic calculation, but even so I'd be uncomfortable building my life around one. If this were a very well-defined number, as in the asteroid calculation, it would be more tempting to build a big reference class of risks like that one and work on stopping them collectively. If an asteroid were genuinely en route, large enough to wipe out humanity, possibly stoppable, and nobody were doing anything about this 10% probability, I would still be working on FAI, but I would be screaming pretty loudly about the asteroid on the side. If the asteroid is just going to wipe out a country, I'll make sure I'm not in that country and then keep working on x-risk.
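The threshold reasoning above can be made concrete with a back-of-the-envelope expected-loss comparison. This is a hedged illustration, not anything computed in the thread: the catastrophe cost and both probabilities are assumed numbers, chosen only to show why a 10% risk dominates a calculation that a one-time 0.1% risk barely affects.

```python
def expected_loss(p_event: float, loss: float) -> float:
    """Expected loss from a one-time event with probability p_event."""
    return p_event * loss

# Hypothetical catastrophe cost in arbitrary utility units (an assumption).
loss = 1e12

# A 10% probability contributes a large term to any strategic calculation.
major_term = expected_loss(0.10, loss)

# A one-time 0.1% probability contributes a term 100x smaller, the kind
# of term the comment says one might be fine dropping.
minor_term = expected_loss(0.001, loss)

print(major_term, minor_term, major_term / minor_term)
```

Under these assumed numbers the 10% scenario carries 100 times the expected loss of the 0.1% scenario, which is the gap between "too high to drop" and "usually fine dropping".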