taw comments on The mind-killer - Less Wrong

Post author: ciphergoth · 02 May 2009 04:49PM


Comment author: jimmy 04 May 2009 12:31:58AM · 5 points

You can't get away with having such extreme probabilities when a bunch of smart and rational people disagree. There are reasons why the whole Aumann agreement thing doesn't work perfectly in real life, but this is an extreme failure.

If a bunch of people on LW think it's only 50% likely, and you think there's only a 0.1% chance that they're right and you're wrong (which is already ridiculously low), it still brings your estimate that the world survives down to around 99.95%. That is a 50-fold increase in the probability that the world is going to end over what you stated. Either you have some magic information that you haven't shared, or you're hugely overconfident.
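The arithmetic behind this can be sketched as a simple probability mixture. This is a minimal sketch, assuming the originally stated doom probability was 0.001% (inferred from the "50-fold increase" figure in the comment, not stated explicitly there):

```python
# Mixture update: with probability eps the dissenters are right
# (their doom estimate: 50%), with probability 1 - eps your own
# estimate stands (assumed here to be 0.001%, i.e. 1e-5).
p_self = 0.00001   # your stated doom probability (assumption: 0.001%)
p_others = 0.5     # the dissenters' doom probability
eps = 0.001        # chance they're right and you're wrong

p_doom = (1 - eps) * p_self + eps * p_others

print(f"updated doom probability: {p_doom:.5%}")    # ≈ 0.051%
print(f"survival probability:     {1 - p_doom:.3%}")  # ≈ 99.949%
print(f"fold increase over stated: {p_doom / p_self:.1f}")  # ≈ 51
```

Even a 0.1% weight on the dissenters' 50% estimate dominates the mixture, which is why the result is roughly 50 times the stated figure.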

http://lesswrong.com/lw/9x/metauncertainty/

http://lesswrong.com/lw/3j/rationality_cryonics_and_pascals_wager/69t#comments

Comment author: taw 04 May 2009 09:45:50AM · 1 point

You cannot selectively apply Aumann agreement. If you want to count the tiny bunch of people who believe in AI foom, you must also take into account the 7 billion people, many of them really smart, who definitely don't.

I don't have this problem, as I don't really believe that using Aumann agreement is useful with real humans.

Or you could count my awareness of insider overconfidence as magic information:

http://www.overcomingbias.com/2007/07/beware-the-insi.html

Comment author: jimmy 05 May 2009 07:08:09AM · 0 points

This is Less Wrong we're talking about. Insider overconfidence isn't "magic information".

See my top level post for a full response.