Wei_Dai comments on Selection Effects in estimates of Global Catastrophic Risk - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (64)
I think people should discount risk estimates fairly heavily when an organisation is based around doom-mongering. For instance, the Singularity Institute, the Future of Humanity Institute, and the Bulletin of the Atomic Scientists all seem pretty heavily oriented around doom. Such organisations initially attract those with high risk estimates, who then actively try to "sell" their estimates to others.
Obtaining less biased estimates seems rather challenging. The end of the world would obviously be an unprecedented event.
The usual way of eliciting probabilities is with bets. However, with an apocalypse this doesn't work too well: there is no one left to pay out to the winner, so attempts to use bets run into serious problems.
That's why I refuse to join SIAI or FHI. If I did, I'd have to discount my own risk estimates, and I value my opinions too much for that. :)