jimmy comments on The mind-killer - Less Wrong

23 Post author: ciphergoth 02 May 2009 04:49PM


Comment author: jimmy 04 May 2009 12:31:58AM 5 points

You can't get away with having such extreme probabilities when a bunch of smart and rational people disagree. There are reasons why the whole Aumann agreement thing doesn't work perfectly in real life, but this is an extreme failure.

If a bunch of people on LW think it's only 50% likely and you think there's only a 0.1% chance that they're right and you're wrong (which is already ridiculously low), it still brings your estimate that the world won't end down to around 99.95%. That is a 50-fold increase in the probability that the world is going to end over what you stated. Either you have some magic information that you haven't shared, or you're hugely overconfident.

http://lesswrong.com/lw/9x/metauncertainty/
http://lesswrong.com/lw/3j/rationality_cryonics_and_pascals_wager/69t#comments
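As a sanity check, the update in the comment above can be written out as a simple mixture. The 0.001% figure for the originally stated probability is an assumption here, back-solved from the "50 fold" claim, not a number quoted in this thread:

```python
# Mixture update sketch. The stated probability of 0.001% is an assumed
# figure, back-solved from the "50 fold increase" claim in the comment.
p_stated = 1e-5  # hypothetical stated P(world ends): 0.001%
p_lw = 0.5       # probability the LW crowd assigns
p_defer = 1e-3   # 0.1% chance they're right and you're wrong

# With probability 0.001 defer to the crowd's 50%; otherwise keep your own.
p_updated = (1 - p_defer) * p_stated + p_defer * p_lw

print(p_updated)                    # ~0.00051, i.e. about 0.05%
print(round(p_updated / p_stated))  # 51: roughly the 50-fold increase
print(round(1 - p_updated, 4))      # confidence in no-doom drops to ~0.9995
```

Even a tiny weight on the crowd's estimate dominates the result, because 0.1% of 50% is fifty times larger than the stated 0.001%.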

Comment author: taw 04 May 2009 09:45:50AM 1 point

You cannot selectively apply Aumann agreement. If you want to count the tiny bunch of people who believe in AI foom, you must also take into account the 7 billion people, many of them really smart, who definitely don't.

I don't have this problem, as I don't really believe that using Aumann agreement is useful with real humans.

Or you could count my awareness of insider overconfidence as magic information:

http://www.overcomingbias.com/2007/07/beware-the-insi.html

Comment author: jimmy 05 May 2009 07:08:09AM 0 points

This is Less Wrong we're talking about. Insider overconfidence isn't "magic information".

See my top level post for a full response.

Comment author: homunq 14 May 2009 04:47:24PM 0 points

Large groups of smart people are frequently wrong about the future, and overwhelmingly so about the non-immediate future. 0.1% may be low but it's not ridiculously so.

(Also "they're right and you're wrong" is redundant. This has nothing to do with any set of scenario probabilities being "right". And any debate of "p=.9" "no, p=.1" is essentially silly because it misunderstands both the meaning of probability as a function of knowledge and our ability to create models which give meaningfully-accurate probabilities.)

Comment author: Vladimir_Nesov 14 May 2009 05:01:35PM 0 points

> And any debate of "p=.9" "no, p=.1" is essentially silly because it misunderstands both the meaning of probability as a function of knowledge and our ability to create models which give meaningfully-accurate probabilities.

Subjective probability is (in particular) a tool for elicitation of model parameters from expert human gut-feelings, which you can then use to find further probabilities and align them with other gut-feelings and decisions, gaining precision from redundancy and removing inconsistencies. The subjective probabilities don't promise to immediately align with physical frequencies, even where the notion makes sense.
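A minimal illustration of that redundancy idea, with entirely made-up elicited numbers: eliciting the same quantity by two routes exposes inconsistencies in the gut-feelings, which can then be revised.

```python
# Toy coherence check on elicited subjective probabilities (all numbers
# are hypothetical). Eliciting the same quantity two ways exposes
# inconsistencies in the expert's gut-feelings.
p_a = 0.7          # elicited P(A)
p_b_given_a = 0.4  # elicited P(B | A)
p_a_and_b = 0.5    # P(A and B), elicited directly as a gut-feeling

# The product rule gives a second, redundant route to P(A and B).
implied = round(p_a * p_b_given_a, 2)
inconsistent = abs(implied - p_a_and_b) > 0.05

print(implied)       # 0.28
print(inconsistent)  # True: the two routes disagree, so revise the inputs
```

Here the directly elicited P(A and B) = 0.5 clashes with the 0.28 implied by the product rule, so at least one of the three gut-feel numbers has to move; this is the "gaining precision from redundancy and removing inconsistencies" step.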

It is a well-studied and useful process, you'd need a substantially more constructive reference than "it's silly" (or you could just seek a reasonable interpretation).

Comment author: homunq 14 May 2009 05:58:36PM 1 point

As you explain it, it's not silly.

Do you have a link for a top-level post that puts this kind of caveat on probability assignments? Personally, I think that if most people here understood it that way, they'd use more qualified language when talking about subjective probability. I also think that developing and standardizing such qualified language would be a useful project.

Comment author: Vladimir_Nesov 14 May 2009 08:46:12PM 0 points

It is the sense in which the term "probability" is generally understood on OB/LW, with varying levels of comprehension by specific individuals. There are many posts on probability, both as an imprecise tool and an ideal (but subjective) construction. They should probably be organized in the Bayesian probability article on the wiki. In the meantime, you are welcome to look for references in the Overcoming Bias archives.

You may be interested in the following two posts, related to this discussion:
Probability is in the Mind
When (Not) To Use Probabilities