
Michael_Rooney comments on Knowing About Biases Can Hurt People - Less Wrong

70 Post author: Eliezer_Yudkowsky 04 April 2007 06:01PM


Comment author: Michael_Rooney 06 April 2007 08:35:39PM 0 points [-]

Eliezer, I agree that exactly even balances of evidence are rare. However, I would think suspending judgment to be rational in many situations where the balance of evidence is not exactly even. For example, if I roll a die, it would hardly be rational to believe "it will not come up 5 or 6", despite the balance of evidence being in favor of such a belief. If you are willing to make >50% the threshold of rational belief, you will hold numerous false and contradictory beliefs.

Also, I have some doubt about your claim that when "there is no evidence in favor of a complicated proposed belief, it is almost always correct to reject it". If you proposed a complicated belief of 20th century physics (say, Bell's theorem) to Archimedes, he would be right to say he has no evidence in its favor. Nonetheless, it would not be correct for Archimedes to conclude that Bell's theorem is therefore false.

Perhaps I am misunderstanding you.

Comment author: DanielLC 27 December 2009 07:04:34AM 2 points [-]

If you gave him almost anything else of that complexity, it actually would be false. Once something gets even moderately complex, there is a huge number of other propositions of the same complexity.

Technically, he should figure that there's just a one in 10^somethingorother chance that it's true, but you can't remember all 10^somethingorother things that are that unlikely, so you're best off to reject it.
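DanielLC's point can be made concrete with a toy model, assuming (purely for illustration) that hypotheses are encoded as binary strings: the number of distinct hypotheses of a given description length grows exponentially, so a uniform prior over that complexity class leaves each one almost no probability mass.

```python
# Toy complexity penalty: a hypothesis that takes n bits to state is one
# of 2**n equally complex alternatives, so a uniform prior over its
# complexity class assigns it probability 2**-n.
def prior_for_length(n_bits):
    return 2.0 ** -n_bits

for n in (10, 50, 100):
    print(f"{n}-bit hypothesis: prior {prior_for_length(n):.3g}")
```

The exact exponent does not matter; the point is only that a moderately complex claim starts out with an astronomically small prior absent evidence.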

Comment author: bigjeff5 21 February 2011 11:40:17PM *  4 points [-]

For example, if I roll a die, it would hardly be rational to believe "it will not come up 5 or 6", despite the balance of evidence being in favor of such a belief.

A Bayesian would not say definitively that it would not come up as 5 or 6. However, if you were to wager on whether or not the die will come up as either 5 or 6, the only rational position is to bet against it. Given enough throws of the die, you will be right 2/3 of the time.
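The 2/3 figure is easy to check with a quick Monte Carlo sketch (simulated fair die, hypothetical bet placed on every throw):

```python
import random

def bet_against_5_or_6(throws=100_000, seed=0):
    """Simulate betting against '5 or 6' on every throw of a fair die.

    Returns the fraction of throws on which the bet wins (the die
    shows 1-4).
    """
    rng = random.Random(seed)
    wins = sum(rng.randint(1, 6) <= 4 for _ in range(throws))
    return wins / throws

print(bet_against_5_or_6())  # close to 2/3
```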

At the most basic level, the difference between Bayesian reasoning and traditional rationalism is that a Bayesian only thinks in terms of likelihoods. It's not a matter of "this position is at a >50% probability, therefore it is correct", it is a matter of "this position is at a >50% probability, so I will hold it to be more likely correct than incorrect until that probability changes".

It's a difficult way of thinking, as it doesn't really allow you to definitively decide anything with perfect certainty. There are very few beliefs in this world for which a 100% probability exists (there must be zero evidence against a belief for this to occur). Math proofs, really, are the only class of beliefs that can hold such certainty. As such the possibility of being wrong pretty much always exists, and must always be considered, though by how much depends on the likelihood of the belief being incorrect.

If you proposed a complicated belief of 20th century physics (say, Bell's theorem) to Archimedes, he would be right to say he has no evidence in its favor.

If no evidence is given for the belief, of course he is right to reject it. It is the only rational position Archimedes can take. Without evidence, Archimedes must assign a 0%, or near-0%, probability to the likelihood that the 20th century position is correct. However, if he is presented with the evidence on which we now base such beliefs, his probability assignment must change, and given the amount of evidence available it would be irrational to reject it.
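The kind of revision described here is just repeated application of Bayes' rule. A minimal sketch, with entirely made-up numbers (a 0.1% prior and three independent experiments, each favoring the hypothesis at a 20:1 likelihood ratio):

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """One application of Bayes' rule: P(H|E) from P(H) and the likelihoods."""
    joint = p_e_given_h * prior
    return joint / (joint + p_e_given_not_h * (1 - prior))

p = 0.001  # near-zero prior, as Archimedes might assign
for _ in range(3):  # three pieces of evidence, each with likelihood ratio 20
    p = bayes_update(p, 0.95, 0.0475)
print(round(p, 3))  # the near-zero prior has climbed to roughly 0.89
```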

Just because you were wrong does not mean you were thinking irrationally. The converse of that is also true: just because you were right does not mean you were thinking rationally.

Also note that it is a fairly well known fact that 20th century physics is broken - i.e. incorrect, or at least not completely correct. We simply have nothing particularly viable to supersede it with yet, so we are stuck until we find the more correct theories of physics. It would be pretty funny to convince Archimedes of their correctness, only to follow it up with all the areas where modern physics breaks down.

Comment author: wedrifid 22 February 2011 01:49:17AM 7 points [-]

However, if you were to wager on whether or not the die will come up as either 5 or 6, the only rational position is to bet against it.

You need to specify even odds. Bayesians will bet on just about anything if the price is right.

Comment author: bigjeff5 22 February 2011 11:36:38PM 3 points [-]

Odds on dice are usually assumed even unless specified otherwise, but it's never wrong to specify it, so thanks.

Comment author: wedrifid 23 February 2011 02:27:33AM 0 points [-]

Odds on dice are usually assumed even unless specified otherwise

On the other hand, when considering rational agency, some come very close to defining 'probability' based on what odds would be accepted for bets on specified events.

Comment author: JGWeissman 22 February 2011 11:58:58PM 5 points [-]

There are very few beliefs in this world for which a 100% probability exists

There are none.

Comment author: bigjeff5 23 February 2011 12:57:49AM 4 points [-]

Thanks, I was a little unsure of stating that there is no such thing as 100% probability. That post is very helpful.

Comment author: raylance 27 August 2011 01:30:03AM 1 point [-]

Ah, the Gödelian "This sentence is false."

Comment author: encounterpiyush 10 March 2013 04:53:48AM *  -1 points [-]

It would be irrational to believe "it will not come up 5 or 6", because that belief amounts to the claim P(5 or 6) = 0, and you know for certain that claim is false: the true probability is 1/3. As you said, "Claims about the probability of a given claim being true, helpful as they may be in many cases, are distinct from the claim itself." Before taking up any belief (if the situation demands taking up a belief, as in a bet, or in living life), a Bayesian would weigh the likelihood of its being true against the likelihood of its being false, and favour the higher likelihood. In this case, the likelihood that "it will not come up 5 or 6" is true is 0, so a Bayesian would not take up that position. Now, you might observe that the belief "1, 2, 3 or 4 will come up" likewise has a likelihood of zero of being true. In the case of a die roll, any categorical statement of this form will be false, so a Bayesian will take up beliefs that talk probabilities and not certainties. (As Bigjeff explains, "At the most basic level, the difference between Bayesian reasoning and traditional rationalism is a Bayesian only thinks in terms of likelihoods.")

Of course, one can always say "I don't know", but saying "I don't know" would have inferior utility in life compared to being a Bayesian. So, for example, assume that your life depends on a series of die rolls. You can take two positions: 1) You say "I believe I don't know what the outcome would be" on every roll. 2) You bet on every roll according to the information you have (in other words, you say "I believe that outcome X has Y chance of turning up"). Both positions would of course be agreeable, but the second position would give you a higher payoff in life. Or so Bayesians believe.
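The payoff gap can be sketched numerically. Assuming (hypothetically) one point per correct prediction, compare a bettor who always backs the most probable event with one who, refusing to weigh the evidence, effectively guesses a face at random (one simple way to model "I don't know"):

```python
import random

def hit_rate(strategy, rolls=100_000, seed=1):
    """Fraction of rolls on which the strategy's prediction comes true.

    'informed' predicts the most probable event, 'not 5 or 6';
    'agnostic' models refusing to judge as guessing a face uniformly.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(rolls):
        roll = rng.randint(1, 6)
        if strategy == "informed":
            hits += roll <= 4
        else:
            hits += roll == rng.randint(1, 6)
    return hits / rolls

print(hit_rate("informed"), hit_rate("agnostic"))  # roughly 0.667 vs 0.167
```

The "agnostic" model here is an assumption for illustration; the point is only that using the probabilities you have beats ignoring them.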