taw comments on The mind-killer - Less Wrong
You can't get away with having such extreme probabilities when a bunch of smart and rational people disagree. There are reasons why the whole Aumann agreement thing doesn't work perfectly in real life, but this is an extreme failure.
If a bunch of people on LW think it's only 50% likely and you think there's only a 0.1% chance that they're right and you're wrong (which is already ridiculously low), it still brings your probability estimate down to around 99.95%. That is roughly a 50-fold increase in the probability that the world is going to end over what you stated. Either you have some magic information that you haven't shared, or you're hugely overconfident.
http://lesswrong.com/lw/9x/metauncertainty/
http://lesswrong.com/lw/3j/rationality_cryonics_and_pascals_wager/69t#comments
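A minimal sketch of the arithmetic, assuming a stated doom probability of about 0.001% (the exact figure isn't quoted in this thread, and the variable names are illustrative):

    # Mixture update: with probability p_wrong you defer to the LW estimate,
    # otherwise you keep your own stated estimate.
    p_stated = 0.00001   # your stated probability of doom (assumed ~0.001%)
    p_lw     = 0.5       # the LW crowd's estimate
    p_wrong  = 0.001     # your 0.1% chance that they're right and you're wrong

    p_doom = (1 - p_wrong) * p_stated + p_wrong * p_lw
    print(p_doom)              # ~0.00051, i.e. about 0.05% doom
    print(p_doom / p_stated)   # ~51, the roughly 50-fold increase
    print(1 - p_doom)          # ~0.9995, the ~99.95% survival estimate

The point is that the 0.1% meta-uncertainty term dominates: once it's mixed in, your own extreme estimate barely matters.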
You cannot selectively apply Aumann agreement. If you want to count the tiny group of people who believe in AI foom, you must also take into account the 7 billion people, many of them really smart, who definitely don't.
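A sketch of the symmetry point, with entirely made-up headcounts and credences (n_believers, p_others, etc. are assumptions, not data), using equal-weight linear pooling as a crude stand-in for Aumann-style updating:

    # If every epistemic peer gets equal weight, the handful of foom
    # believers are swamped by everyone else.
    n_believers = 1_000           # hypothetical count of AI-foom believers
    n_others    = 7_000_000_000   # everyone else
    p_believers = 0.5             # credence among believers
    p_others    = 0.001           # assumed credence among everyone else

    p_pooled = (n_believers * p_believers + n_others * p_others) \
               / (n_believers + n_others)
    print(p_pooled)  # ~0.001: the pooled estimate barely moves

Equal weighting is of course contestable, but that is exactly the point: picking only the believers as your Aumann peers is a weighting choice, not a neutral application of the theorem.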
I don't have this problem, as I don't really believe Aumann agreement is useful with real humans.
Or you could count my awareness of insider overconfidence as magic information:
http://www.overcomingbias.com/2007/07/beware-the-insi.html
This is Less Wrong we're talking about. Insider overconfidence isn't "magic information".
See my top-level post for a full response.