
Tyrrell_McAllister comments on What Bayesianism taught me - Less Wrong

62 Post author: Tyrrell_McAllister 12 August 2013 06:59AM




Comment author: Tyrrell_McAllister 11 August 2013 04:23:10PM 9 points

These biases are often known to work even when you're aware of them and trying to counteract them.

This is the problem. I know, as an epistemic matter of fact, that anecdotes are evidence. I could try to ignore this knowledge, with the goal of counteracting the biases to which you refer. That is, I could try to suppress the Bayesian update or to undo it after it has happened. I could try to push my credence back to where it was "manually". However, as you point out, counteracting biases in this way doesn't work.

Far better, it seems to me, to habituate myself to the fact that updates can be minuscule. Credence is quantitative, not qualitative, and so can change by arbitrarily small amounts. "Update Yourself Incrementally". Granting that someone has evidence for their claims can be an arbitrarily small concession. Updating on the evidence doesn't need to move my credences by even a subjectively discernible amount. Nonetheless, I am obliged to acknowledge that the anecdote would move the credences of an ideal Bayesian agent by some nonzero amount.
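The odds form of Bayes' theorem makes this concrete: an anecdote whose likelihood ratio is barely above 1 shifts the posterior by a correspondingly tiny amount. Here is a minimal sketch in Python; the prior of 1% and the likelihood ratio of 1.001 are made-up numbers for illustration, not values from the discussion:

```python
def update(prior, likelihood_ratio):
    """Bayesian update in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

prior = 0.01
# An anecdote that is only 1.001 times likelier if the hypothesis is true
# moves the credence by less than one part in ten thousand.
posterior = update(prior, 1.001)
```

The update is nonzero, as an ideal Bayesian requires, yet far below anything subjectively discernible.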

Comment author: Lumifer 12 August 2013 05:20:10PM 2 points

...updates can be minuscule ... Updating on the evidence doesn't need to move my credences by even a subjectively discernible amount. Nonetheless, I am obliged to acknowledge that the anecdote would move the credences of an ideal Bayesian agent by some nonzero amount.

So, let's talk about measurement and detection.

Presumably you don't calculate your believed probabilities to the n-th significant digit, so I don't understand the idea of a "minuscule" update. If it has no discernible consequences, then as far as I am concerned it did not happen.

Let's take an example. I believe that my probability of being struck by lightning is very low, to the extent that I don't worry about it and don't take any special precautions during thunderstorms. Here is an anecdote which relates how a guy was struck by lightning while sitting in his office inside a building. You're saying I should update my beliefs, but what does that mean?

I have no numeric estimate of P(me being struck by lightning), so there's no number I can adjust by 0.0000001. I am not going to do anything differently. My estimate of my chances of being electrocuted by Zeus's bolt is still "very very low". So where is that "minuscule update" that you think I should make, and how do I detect it?

P.S. If you want to update on each piece of evidence, surely by now you must fully believe that product X is certain to enlarge your penis?