Tyrrell_McAllister comments on What Bayesianism taught me - Less Wrong

62 Post author: Tyrrell_McAllister 12 August 2013 06:59AM

Comment author: Tyrrell_McAllister 11 August 2013 03:33:30AM *  7 points [-]

Anecdotal evidence is filtered evidence.

Right, the existence of the anecdote is the evidence, not the occurrence of the events that it alleges.

You can find people saying anecdotes on any side of a debate, and I see no reason the people who are right would cite anecdotes more.

It is true that, if a hypothesis has reached the point of being seriously debated, then there are probably anecdotes being offered in support of it. (... assuming that we're talking about the kinds of hypotheses that would ever have anecdotes offered in support of them.) Therefore, learning that anecdotes exist probably won't move much probability around among the hypotheses being seriously debated.

However, hypothesis space is vast. Many hypotheses have never even been brought up for debate. The overwhelming majority should never come to our attention at all.

In particular, hypothesis space contains hypotheses for which no anecdote has ever been offered. If you learned that a particular hypothesis H were true, you would increase your probability that H was among those hypotheses that are supported by anecdotes. (Right? The alternative is that which hypotheses get anecdotes is determined by mechanisms that have absolutely no correlation, or even negative correlation, with the truth.) Therefore, the existence of an anecdote is evidence for the hypothesis that the anecdote alleges is true.
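
To put that compactly (this is just a restatement of the argument, assuming only that 0 < P(H) < 1): if the existence of a supporting anecdote is even slightly more likely when H is true than when it is false, Bayes' theorem forces the posterior above the prior, however slightly:

    P(\text{anecdote exists} \mid H) > P(\text{anecdote exists} \mid \neg H) \;\Longrightarrow\; P(H \mid \text{anecdote exists}) > P(H)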

Comment author: Anatoly_Vorobey 11 August 2013 06:52:14AM 13 points [-]

A typical situation is that there's a contentious issue, and some anecdotes reach your attention that support one of the competing hypotheses.

You have three ways to respond:

  1. You can under-update your belief in the hypothesis, ignoring the anecdotes completely.
  2. You can update by precisely the measure warranted by the existence of these anecdotes and the fact that they reached you.
  3. You can over-update by adding too much credence to the hypothesis.

In almost every situation you're likely to encounter, the real danger is 3. Well-known biases are at work pulling you towards 3. These biases are often known to work even when you're aware of them and trying to counteract them. Moreover, the harm from reaching 3 is typically far greater than the harm from reaching 1. This is because the correct amount of added credence in 2 is very tiny, particularly because you probably already know that the competing hypotheses on this issue are all likely to have anecdotes going for them. In real-life situations, you don't usually hear anecdotes supporting an incredibly unlikely-seeming hypothesis which you'd otherwise have thought incapable of generating any anecdotes at all. So forgoing that tiny amount of credence is not nearly as bad as choosing 3 and updating, typically, by a large amount.
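
(A toy calculation, with invented numbers, of how tiny the correct update in 2 typically is: if anecdotes would almost certainly be circulating whether the hypothesis were true or false, the likelihood ratio is close to 1 and a Bayesian update barely moves.)

    # Toy sketch with made-up numbers: on a contentious issue, anecdotes almost
    # certainly exist on both sides, so hearing one barely moves a rational credence.
    prior = 0.50                  # credence in the hypothesis before the anecdote
    p_anec_if_true = 0.99         # assumed: anecdotes circulate if the hypothesis is true
    p_anec_if_false = 0.97        # assumed: ...and almost as surely if it is false

    posterior = (prior * p_anec_if_true) / (
        prior * p_anec_if_true + (1 - prior) * p_anec_if_false
    )
    print(round(posterior, 4))    # 0.5051 -- an update of about half a percentage point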

The saying "The plural of anecdotes is not data" exists to steer you away from 3. It works to counteract the very strong biases pulling you towards 3. Its danger, you are saying, is that it pulls you towards 1 rather than the correct 2. That may be pedantically correct, but is a very poor reason to criticize the saying. Even with its help, you're almost always very likely to over-update - all it's doing is lessening the blow.

Perhaps this is an example of "things Bayesianism has taught you" that are harming your epistemic rationality?

A similar thing I have noticed is the disdain towards "correlation does not imply causation" among enlightened Bayesians. It is counter-productive.

Comment author: Tyrrell_McAllister 11 August 2013 04:23:10PM *  9 points [-]

These biases are often known to work even when you're aware of them and trying to counteract them.

This is the problem. I know, as an epistemic matter of fact, that anecdotes are evidence. I could try to ignore this knowledge, with the goal of counteracting the biases to which you refer. That is, I could try to suppress the Bayesian update or to undo it after it has happened. I could try to push my credence back to where it was "manually". However, as you point out, counteracting biases in this way doesn't work.

Far better, it seems to me, to habituate myself to the fact that updates can be minuscule. Credence is quantitative, not qualitative, and so can change by arbitrarily small amounts. "Update Yourself Incrementally". Granting that someone has evidence for their claims can be an arbitrarily small concession. Updating on the evidence doesn't need to move my credences by even a subjectively discernible amount. Nonetheless, I am obliged to acknowledge that the anecdote would move the credences of an ideal Bayesian agent by some nonzero amount.
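
(A toy illustration with invented numbers of just how small such an update can be while remaining nonzero: with a likelihood ratio barely above 1, the posterior shifts by an amount far below anything I could introspect on.)

    # Toy sketch, invented numbers: an ideal Bayesian update far too small to
    # notice subjectively, yet strictly nonzero.
    prior = 0.3
    likelihood_ratio = 1.00001          # assumed: the anecdote is 1.00001x likelier if true

    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    posterior = posterior_odds / (1 + posterior_odds)
    print(f"{posterior:.7f}")           # about 0.3000021 -- a shift of roughly 2e-6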

Comment author: Lumifer 12 August 2013 05:20:10PM *  2 points [-]

...updates can be minuscule ... Updating on the evidence doesn't need to move my credences by even a subjectively discernible amount. Nonetheless, I am obliged to acknowledge that the anecdote would move the credences of an ideal Bayesian agent by some nonzero amount.

So, let's talk about measurement and detection.

Presumably you don't calculate your believed probabilities to the n-th significant digit, so I don't understand the idea of a "minuscule" update. If it has no discernible consequences, then as far as I am concerned it did not happen.

Let's take an example. I believe that my probability of being struck by lightning is very low, to the extent that I don't worry about it and don't take any special precautions during thunderstorms. Here is an anecdote which relates how a guy was struck by lightning while sitting in his office inside a building. You're saying I should update my beliefs, but what does that mean?

I have no numeric estimate of P(me being struck by lightning), so there's no number I can adjust by 0.0000001. I am not going to do anything differently. My estimate of my chances of being electrocuted by Zeus' bolt is still "very very low". So where is that "minuscule update" that you think I should make, and how do I detect it?

P.S. If you want to update on each piece of evidence, surely by now you must fully believe that product X is certain to enlarge your penis?

Comment author: 9eB1 12 August 2013 04:28:29AM *  5 points [-]

A typical situation is that there's a contentious issue, and some anecdotes reach your attention that support one of the competing hypotheses.

It is interesting that you think of this as typical, or at least typical enough to exclude non-contentious issues. I avoid discussions about politics and possibly other contentious issues, and when I think of people providing anecdotes I usually think of them in support of neutral issues, like the efficacy of understudied nutritional supplements. If someone tells you, "I ate dinner at Joe's Crab Shack and I had intense gastrointestinal distress," I don't think you'd necessarily be justified in ignoring it on the basis that it's anecdotal. If three more friends all report the same thing to you, you should rightly become very suspicious of the sanitation at Joe's Crab Shack. I think the fact that you are talking about contentious issues specifically is an important and interesting point of clarification.
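
(A rough sketch, with invented numbers, of why a handful of such reports adds up: each report alone is weak evidence, but if the reports are independent their likelihood ratios multiply.)

    # Toy sketch, made-up numbers: independent "I got sick at Joe's" reports.
    prior_odds = 0.05 / 0.95       # prior odds that the kitchen has a sanitation problem
    lr_per_report = 5.0            # assumed: a report is 5x likelier if there is a problem

    for n in range(1, 5):
        posterior_odds = prior_odds * lr_per_report ** n
        p = posterior_odds / (1 + posterior_odds)
        print(n, round(p, 3))      # 1: 0.208, 2: 0.568, 3: 0.868, 4: 0.97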

Comment author: cousin_it 11 August 2013 01:02:26PM *  3 points [-]

Thanks for that comment! Eliezer often says people should be more sensitive to evidence, but an awful lot of real-life evidence is in fact much weaker, noisier, and easier to misinterpret than it seems. And it's not enough to just keep in mind a bunch of Bayesian mantras - you need to be aware of survivor bias, publication bias, Simpson's paradox and many other non-obvious traps; otherwise you silently go wrong and don't even know it. In a world where most published medical results fail to replicate, how much should we trust our own conclusions?

Would it be more honest to recommend that people just never update at all? But then everyone will stick to their favorite theories forever... Maybe an even better recommendation would be to watch out for "motivated cognition" and try to be more skeptical of all theories, including your favorites.

Comment author: Lumifer 12 August 2013 05:10:01PM *  1 point [-]

The alternative is that which hypotheses get anecdotes is determined by mechanisms that have absolutely no correlation, or even negative correlation, with the truth.

Doesn't look implausible to me. Here's an alternative hypothesis: the existence of anecdotes is a function of which beliefs are least supported by strong data because such beliefs need anecdotes for justification.

In general, I think anecdotes are way too filtered and too biased as an information source to be considered serious evidence. In particular, there's a real danger of treating a lot of biased anecdotes as conclusive data, and that danger, it seems to me, outweighs the minuscule usefulness of anecdotes.

Comment author: Tyrrell_McAllister 13 August 2013 10:06:32PM 1 point [-]

In general, I think anecdotes are way too filtered and too biased as an information source to be considered serious evidence.

We may agree. It depends on what work the word "serious" is doing in the quoted sentence.

Comment author: Lumifer 14 August 2013 01:11:29AM 0 points [-]

In this context "serious" = "I'm willing to pay attention to it".

Comment author: Watercressed 11 August 2013 04:53:42AM *  0 points [-]

I would raise a hypothesis to consideration because someone was arguing for it, but I don't think anecdotes are good evidence, in the sense that I would have similar confidence in a hypothesis supported by an anecdote and in a hypothesis that is flatly stated with no justification. The evidence to raise it to consideration comes from the fact that someone took the time to advocate it.

This is more of a heuristic than a rule, because there are anecdotes that are strong evidence ("I ran experiments on this last year and they didn't fit"), but when dealing with murkier issues, they don't count for much.

Comment author: Tyrrell_McAllister 11 August 2013 05:13:34AM 2 points [-]

The evidence to raise it to consideration comes from the fact that someone took the time to advocate it, not the anecdote.

Yes, it may be that the mere fact that a hypothesis is advocated screens off whether that hypothesis is also supported by an anecdote. But I suspect that the existence of anecdotes still moves a little probability mass around, even among just those hypotheses that are being advocated.

I mean, if someone advocated for a hypothesis, and they couldn't even offer an anecdote in support of it, that would be pretty deadly to their credibility. So, unless I am certain that every advocated hypothesis has supporting anecdotes (which I am not), I must concede that anecdotes are evidence, howsoever weak, over and above mere advocacy.

Comment author: Watercressed 11 August 2013 03:48:50PM 2 points [-]

Here's a situation where an anecdote should reduce our confidence in a belief:

  • A person's beliefs are usually well-supported.
  • When he offers supporting evidence, he usually offers the strongest evidence he knows about.

If this person were to offer an anecdote, it should reduce our confidence in his proposition, because it makes it unlikely he knows of stronger supporting evidence.

I don't know how applicable this is to actual people.
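
(A hedged toy model of the scenario above, with invented numbers: if this careful arguer would usually cite something stronger when stronger evidence exists, then offering only an anecdote is itself evidence that nothing stronger is known.)

    # Toy sketch, made-up numbers: an arguer who usually cites the strongest
    # evidence they know. Offering only an anecdote suggests they know nothing stronger.
    prior = 0.70                   # credence in their claim before they offer evidence
    p_anec_if_true = 0.20          # assumed: if the claim is true, stronger evidence
                                   # usually exists and would be cited instead
    p_anec_if_false = 0.60         # assumed: if the claim is false, an anecdote is often
                                   # the best thing available to offer

    posterior = (prior * p_anec_if_true) / (
        prior * p_anec_if_true + (1 - prior) * p_anec_if_false
    )
    print(round(posterior, 3))     # 0.438 -- the anecdote lowered credence in the claim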

Comment author: JoshuaZ 11 August 2013 07:19:28PM *  1 point [-]

I don't think this is necessarily valid, because people also know that anecdotes can be highly persuasive. So for many people, if they have an anecdote it will make sense to offer it, since most people argue not to reach the truth but to persuade.

Comment author: Tyrrell_McAllister 11 August 2013 04:26:43PM 1 point [-]

I agree that it is at least hypothetically possible that the offering of an anecdote should reduce our credence in what the anecdote claims.

Comment author: Tyrrell_McAllister 13 August 2013 10:10:24PM 2 points [-]

... For example, if you told me that you once met a powerful demon who works to stop anyone from ever telling anecdotes about him (regardless of whether the anecdotes are true or false), then I would decrease my credence in the existence of such a demon.