Morendil comments on Other Existential Risks - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I can see how one might balk at this, but I don't think it's an "overinterpretation".
What strikes me as fatuous is the need to assign actual numbers to propositions, such that one would say "I think there is a 4.3% probability of us getting wiped out by an asteroid".
But you can refrain from this kind of silliness even as you admit that probabilities must be real numbers, and that therefore it makes sense to think of various propositions, no matter how fuzzily defined, in terms of your ranking of their plausibilities. One consequence of the Bayesian model is that plausibilities are comparable.
So you can certainly list out the known risks, and for each pair of them ask the question: "What are my reasons for ranking this one as more or less likely than that one?" You may not end up with precise numbers, but that's not the point. The point is to think through the precise components of your background knowledge that go into your assessment, doing your best to mitigate bias wherever possible.
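A minimal sketch of what this procedure buys you, assuming made-up risk names and entirely illustrative plausibility weights (none of these are real estimates): because plausibilities map to real numbers, any two risks are comparable, and a total ranking falls out for free even if the weights themselves stay fuzzy.

```python
# Illustrative only: hypothetical risks with rough, subjective plausibility
# weights on an arbitrary scale. The numbers are placeholders, not estimates.
risks = {
    "asteroid impact": 1.0,
    "engineered pandemic": 5.0,
    "supervolcano eruption": 2.0,
}

# Real-valued plausibilities are totally ordered, so pairwise comparisons
# always yield a consistent ranking, regardless of absolute precision.
ranked = sorted(risks, key=risks.get, reverse=True)
print(ranked)
```

The useful output is the ordering and the reasons you can give for each pairwise comparison, not the weights themselves.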
The objective, and I think it's achievable, is to finish with a better reasoned position than you had on starting the procedure.
The mistake here is not the number but the way of stating it: as if it were your guess at the value of a number out there in the world. Better to say
"My subjective probability of an asteroid strike wiping us out is currently 4.3%"
though of course the spurious precision of the ".3" would be more obviously silly in such a context.