
NancyLebovitz comments on Knowing About Biases Can Hurt People - Less Wrong

Post author: Eliezer_Yudkowsky, 04 April 2007 06:01PM (70 points)

Comment author: NancyLebovitz 15 September 2012 06:12:01PM 4 points

"For a true Bayesian, information would never have negative expected utility."

Is this true in general? It seems to me that if a Bayesian has limited information handling ability, then they need to give some thought (not too much!) to the risks of being swamped with information and of spending too many resources on gathering information.

Comment author: alex_zag_al 15 September 2012 07:41:11PM 1 point

Yeah, certainly. The search might be expensive. Or, some of its resources might be devoted to distinguishing the most relevant among the information it receives - diluting its input with irrelevant truths makes it work harder to find what's really important.

An interpretation of the original statement that I think is true, though, is that in all these cases, receiving the information and getting a little more knowledgeable offsets at least part of whatever price was paid for it. Any negative utility in the combination of search+learning comes from the searching part of it - if you kept the searching but removed the learning at the end, it'd be even worse.
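
Put slightly more formally (my own framing of the decomposition, with utilities measured relative to not searching at all, writing c for the non-negative cost of the search and VOI for the expected value of the information itself):

\[
\mathbb{E}[U(\text{search and learn})] \;=\; \mathrm{VOI} - c,
\qquad
\mathbb{E}[U(\text{search, discard the result})] \;=\; -\,c,
\]

so the combination can only go negative through the cost term, and dropping the learning step while keeping the search lowers the total further, by exactly VOI.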

Comment author: beoShaffer 15 September 2012 08:20:10PM 4 points

"if a Bayesian has limited information handling ability"

I believe that in this situation "true Bayesian" implies unbounded processing power/logical omniscience.

Comment author: NancyLebovitz 16 September 2012 01:15:11AM 0 points

I suggest that "true Bayesian" is ambiguous enough (this seems to use it in the sense of a human using the principles of Bayes) that some other phrase-- perhaps "unlimited Bayesian"-- would be clearer.

Comment author: [deleted] 15 September 2012 08:41:50PM 3 points

The cost of gathering or processing the information may exceed the value of the information, but the information itself never has negative value: at worst, you do nothing different, and the rest of the time you make a more informed choice.
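
A minimal sketch of the standard argument behind that claim (my notation, not the parent's), for an agent choosing an action a to maximize expected utility U and offered an observation X for free:

\[
\mathrm{VOI}
\;=\; \mathbb{E}_{X}\!\left[\max_{a}\,\mathbb{E}[U \mid a, X]\right]
\;-\; \max_{a}\,\mathbb{E}_{X}\big[\mathbb{E}[U \mid a, X]\big]
\;\ge\; 0,
\]

since the expectation of a maximum is at least the maximum of the expectations: for each possible value of X, choosing the action after seeing X does at least as well as committing to any single action in advance. The value is strictly positive only when the observation could actually change the decision; otherwise it is exactly zero, never negative.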

Comment author: TheOtherDave 15 September 2012 10:30:48PM -1 points

I'm not exactly sure what "a true Bayesian" refers to, if anything, but it's possible that being whatever that is precludes having limited information handling ability.

Comment author: RichardKennaway 15 September 2012 10:42:43PM 2 points

"Is this true in general?"

Yes, in this technical sense.

"It seems to me that if a Bayesian has limited information handling ability"

A true Bayesian has unlimited information handling ability.

Comment author: alex_zag_al 16 September 2012 12:32:19AM 1 point

"A true Bayesian has unlimited information handling ability."

I think I see that - because if it didn't have that ability, then not all of its probabilities would be properly updated, so its degrees of belief wouldn't have the relations implied by probability theory, so it wouldn't be a true Bayesian. Right?

Comment author: RichardKennaway 16 September 2012 10:00:21AM 2 points

Yes, one generally ignores the cost of making these computations. One might try to take it into account, but then one is ignoring the cost of doing that computation, etc. Historically, the "Bayesian revolution" needed computers before it could happen.

And, I notice, it has only gone as far as the computers allow. "True Bayesians" also have universal priors, that assign non-zero probability density to every logically possible hypothesis. Real Bayesian statisticians never do this; all those I have read deny that it is possible.

Comment author: Eugine_Nier 16 September 2012 06:53:57PM 1 point

"And, I notice, it has only gone as far as the computers allow. 'True Bayesians' also have universal priors, that assign non-zero probability density to every logically possible hypothesis. Real Bayesian statisticians never do this; all those I have read deny that it is possible."

It is impossible, even in principle. The only way to have a universal prior over all computable universes is to have access to a source of hypercomputation (such a prior isn't itself computable), but that would mean the universe isn't computable, so the truth still isn't in your prior set.

Comment author: RichardKennaway 16 September 2012 06:56:24PM 0 points

Is that written up as a theorem anywhere?

Comment author: Eugine_Nier 18 September 2012 12:50:51AM 0 points

That depends on how one wants to formalize it.