In lists of logical fallacies, you will find included “the genetic fallacy”—the fallacy of attacking a belief based on someone’s causes for believing it.

    This is, at first sight, a very strange idea—if the causes of a belief do not determine its systematic reliability, what does? If Deep Blue advises us of a chess move, we trust it based on our understanding of the code that searches the game tree, being unable to evaluate the actual game tree ourselves. What could license any probability assignment as “rational,” except that it was produced by some systematically reliable process?

    Articles on the genetic fallacy will tell you that genetic reasoning is not always a fallacy—that the origin of evidence can be relevant to its evaluation, as in the case of a trusted expert. But other times, say the articles, it is a fallacy; the chemist Kekulé first saw the ring structure of benzene in a dream, but this doesn’t mean we can never trust this belief.

    So sometimes the genetic fallacy is a fallacy, and sometimes it’s not?

    The genetic fallacy is formally a fallacy, because the original cause of a belief is not the same as its current justificational status, the sum of all the support and antisupport currently known.
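    In odds form (a standard Bayesian identity, assuming the pieces of evidence E_1, ..., E_n are conditionally independent given H and given ¬H):

        \[
        \frac{P(H \mid E_1, \ldots, E_n)}{P(\neg H \mid E_1, \ldots, E_n)}
          = \frac{P(H)}{P(\neg H)} \prod_{i=1}^{n} \frac{P(E_i \mid H)}{P(E_i \mid \neg H)}
        \]

    The original source of the idea contributes at most one factor in the product; the current justificational status is the whole product.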

    Yet we change our minds less often than we think. Genetic accusations have a force among humans that they would not have among ideal Bayesians.

    Clearing your mind is a powerful heuristic when you’re faced with new suspicion that many of your ideas may have come from a flawed source.

    Once an idea gets into our heads, it’s not always easy for evidence to root it out. Consider all the people out there who grew up believing in the Bible; later came to reject (on a deliberate level) the idea that the Bible was written by the hand of God; and who nonetheless think that the Bible is full of indispensable ethical wisdom. They have failed to clear their minds; they could do significantly better by doubting anything the Bible said because the Bible said it.

    At the same time, they would have to bear firmly in mind the principle that reversed stupidity is not intelligence; the goal is to genuinely shake your mind loose and do independent thinking, not to negate the Bible and let that be your algorithm.

    Once an idea gets into your head, you tend to find support for it everywhere you look—and so when the original source is suddenly cast into suspicion, you would be very wise indeed to suspect all the leaves that originally grew on that branch . . .

    If you can! It’s not easy to clear your mind. It takes a convulsive effort to actually reconsider, instead of letting your mind fall into the pattern of rehearsing cached arguments. “It ain’t a true crisis of faith unless things could just as easily go either way,” said Thor Shenkel.

    You should be extremely suspicious if you have many ideas suggested by a source that you now know to be untrustworthy, but by golly, it seems that all the ideas still ended up being right—the Bible being the obvious archetypal example.

    On the other hand . . . there’s such a thing as sufficiently clear-cut evidence, that it no longer significantly matters where the idea originally came from. Accumulating that kind of clear-cut evidence is what Science is all about. It doesn’t matter any more that Kekulé first saw the ring structure of benzene in a dream—it wouldn’t matter if we’d found the hypothesis to test by generating random computer images, or from a spiritualist revealed as a fraud, or even from the Bible. The ring structure of benzene is pinned down by enough experimental evidence to make the source of the suggestion irrelevant.
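    As a minimal numeric sketch of evidence swamping origins (the numbers here are hypothetical, not from the essay): start a dream-inspired hypothesis at 1% credence, then apply a few independent experiments, each twenty times likelier under the hypothesis than under its negation.

        # Bayes in odds form: posterior odds = prior odds * likelihood ratio.
        def update(prior, likelihood_ratio):
            prior_odds = prior / (1 - prior)
            posterior_odds = prior_odds * likelihood_ratio
            return posterior_odds / (1 + posterior_odds)

        p = 0.01  # hypothetical: a suspect source earns only 1% initial credence
        for trial in range(1, 6):
            p = update(p, 20)  # hypothetical: each experiment has a 20:1 likelihood ratio
            print(f"after experiment {trial}: P(hypothesis) = {p:.5f}")

    Five such experiments carry the hypothesis from 1% to above 99.99%; where the suggestion came from no longer moves the answer appreciably.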

    In the absence of such clear-cut evidence, you do need to pay attention to the original sources of ideas—to give experts more credence than layfolk, if their field has earned respect—to suspect ideas you originally got from suspicious sources—to distrust those whose motives are untrustworthy, if they cannot present arguments independent of their own authority.

    The genetic fallacy is a fallacy when there exist justifications beyond the genetic fact asserted, but the genetic accusation is presented as if it settled the issue. Hal Finney suggests that we call correctly appealing to a claim’s origins “the genetic heuristic.”1

    Some good rules of thumb (for humans):

    • Be suspicious of genetic accusations against beliefs that you dislike, especially if the proponent claims justifications beyond the simple authority of a speaker. “Flight is a religious idea, so the Wright Brothers must be liars” is one of the classically given examples.
    • By the same token, don’t think you can get good information about a technical issue just by sagely psychoanalyzing the personalities involved and their flawed motives. If technical arguments exist, they get priority.
    • When new suspicion is cast on one of your fundamental sources, you really should doubt all the branches and leaves that grew from that root. You are not licensed to reject them outright as conclusions, because reversed stupidity is not intelligence, but . . .
    • Be extremely suspicious if you find that you still believe the early suggestions of a source you later rejected.

    1. Source: http://lesswrong.com/lw/s3/the_genetic_fallacy/lls.


    "later came to reject (on a deliberate level) the idea that the Bible was not written by the hand of God

    Don't you mean "was written by..." here?

    In a practical sense, the genetic fallacy isn't necessarily a fallacy for two reasons, as far as I can discern. First, because there are too many things to know, it's impossible to verify everything to the extent that experts in the field have. I couldn't tell you how scientists know benzene has a ring structure, much less replicate the experiment myself. I could learn, but I find there are more interesting things to study, and, knowing something about the scientific method and how scientists came to be certain of the structure, I'm comfortable making an appeal to authority and a genetic argument (not from the idea's original source, but from the origin of the belief of the countless chemists who do know the structure of benzene).

    The other is that belief does have to be explained. The fact that millions of people, some of them very intelligent, believe that the Bible is the word of God is not trivial. In fact, it cries out for explanation, and being in the minority, you have to consider that maybe you're the one who's wrong. Of course, if you consider that humans are subject to a great many biases, and that religion touches on quite a few of them, then this appeal to majority doesn't sound very compelling at all—but that's once you have an explanation for it. Without one, I might always suspect that I'm the crank who thinks he's disproved General Relativity.

    Ideally, neither of these lines of thought would be necessary, but in practice, the causes of belief are quite relevant.

    One thing to be aware of when considering logical fallacies is that there are two ways in which people consider something to be a fallacy. On the strict account, it is a form of argumentation that doesn't rule out all cases in which the conclusion is false. Appeals to authority and considerations of the history of a claim are obviously fallacious in this sense. The loose account is a form of argumentation that is deeply flawed. It is in this sense that appeal to authority and considerations of the history of a claim may not be fallacious, for they sometimes give us some useful reasons to believe or disbelieve in the claim. Certain considerations don't give deductive (logical) validity, but do give Bayesian support.

    Confusing—now the central question of rationality is no longer "why do you believe what you believe?"

    Eliezer, I think the point you've made here generalizes to several things in the standard fallacy lists, which usually take the form:

    X Fallacy: Believing Y because of Z, when Z doesn't ABSOLUTELY GUARANTEE Y.

    ...even though, it turns out, Z should raise the probability you assign to Y.

    For example:

    Appeal to authority: An expert in the field believing something within that field doesn't guarantee its truth, but is strong evidence.

    Argument from ignorance: The fact that you haven't heard any good arguments for X doesn't mean X is necessarily false, but if most of humanity has conducted a motivated search for them and come up lacking, and you've checked all such justifications, that's strong evidence against X.
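    Formally (a standard identity, stated here for concreteness): Z raises the probability you assign to Y exactly when Z is likelier given Y than given not-Y,

        \[
        P(Y \mid Z) > P(Y) \iff P(Z \mid Y) > P(Z \mid \neg Y),
        \]

    so the listed "fallacies" are fallacious only as claims of certainty; as claims of support they are ordinary Bayesian evidence.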

    I generally agree here, but I think it gives too little benefit to genetic reasoning.

    For example, I sometimes listen to Neal Boortz when driving, because the channel was already set when I started the car. One day he suddenly started going on and on about drilling for oil off the coast and in Alaska. This was at the exact time the McCain campaign and Republicans in general started a coordinated effort to push this issue, probably to play election politics with oil prices.

    Anyway, Boortz has lots of reasonable arguments to support his claim that we should be drilling, and there is a pretty strong case to be made for it in general; he doesn't use underhanded arguments about oil prices, and he admits it will be 10 years before the oil starts to flow—in other words, he is not deceptive. However, he does not give any fair analysis of the arguments against drilling (probably because he does not fully understand them).

    What I'm saying is that, if your goal is to set some rules of thumb to help you best find the truth despite mental biases, you should discount any argument that seems to come from some sort of sales pitch, even if it is well documented and researched, with supporting evidence. The rule is not easy to state succinctly, but it is basically: "You have to heavily discount any argument made by a group who will make money if they can successfully persuade people." Notice that the rule makes no mention of the quality of the evidence! That is because no evidence can be trusted if the source is biased, even if that source has no dishonest intentions.

    Hypothetical example: A scientist working for Pharma is testing the safety of a potential drug. The thing most likely to derail the drug is side effect X. The scientist and Pharma work very diligently and prove that side effect X is not associated with this drug. However, because the research was oriented at proving the drug safe, versus determining its safety, almost all the brain-hours went toward thinking about things like "how do you control this experiment to ensure that such-and-such is controlled for" and not toward other safety issues. Perhaps the pills then cause some unforeseen side effect, while not causing any that were considered at issue.

    In that example, everyone acted honestly, but the research cannot be accepted with as much weight as independent testing, because there is unavoidable bias.
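    A toy simulation (my own construction with made-up numbers, not the commenter's) makes the discount concrete: an independent tester reports every study, while a motivated tester reports a favorable study whenever one exists. Conditioned on seeing only passing results, the independent report is far stronger evidence of safety.

        import random

        random.seed(0)
        P_PASS = {"safe": 0.9, "unsafe": 0.4}  # hypothetical pass rates per study

        def run_trial(reporter):
            truth = random.choice(["safe", "unsafe"])  # 50/50 prior
            studies = [random.random() < P_PASS[truth] for _ in range(5)]
            if reporter == "independent":
                report = studies                          # publishes everything
            else:                                         # motivated: shows a pass if any study passed
                report = [True] if any(studies) else [False]
            return truth, report

        def posterior_safe(reporter, n=100_000):
            # Among trials whose report contains only passes, how often is the drug safe?
            hits = total = 0
            for _ in range(n):
                truth, report = run_trial(reporter)
                if all(report):
                    total += 1
                    hits += truth == "safe"
            return hits / total

        print("independent, all studies passed:", round(posterior_safe("independent"), 3))
        print("motivated, reported study passed:", round(posterior_safe("motivated"), 3))

    With these numbers, the independent all-pass report yields roughly 98% confidence in safety, while the motivated report yields barely better than the 50% prior: the selection process, not any dishonesty, destroys most of the information.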

    I'm very strict about this. I only accept claims that come out of science. I have a narrow definition of science based on lineage: you have to be able to trace it back to settled physics. Physics, chemistry, biochemistry, biology, molecular biology, neurobiology, etc., all have strict lines of descent. Much of theoretical psychology, on the other hand (to give an example), does not; it's ab initio theorizing. Anything that is not science (so narrowly defined) I take to be noise. Systematic and flagrant abuse of the "genetic fallacy" is probably the quickest way to truth.

    I have to largely agree with Poke here, although I'd be broader in my acceptance of what constitutes science. Too many times I have found ideas to be false that were rejected by mainstream science but seemed plausible and were forcefully advocated to me (for example, that AIDS is not caused by HIV). This makes me tend to doubt any and all ideas which come from a spirit of skepticism towards mainstream science. A more current example would be global warming skepticism, in all its many variants.

    Instead of calling it the "genetic fallacy", maybe we should re-christen it the "genetic heuristic".

    poke, what do you think of IQ? Isn't that ab initio theorizing, with poor foundations? That's certainly a reason to doubt that IQ is well understood (e.g., that a single g factor is so important), but are you saying that you reject the validity of predictions based on IQ? If I understand your definition of science correctly, it radically diverges from most people's usage, which would include IQ.

    Douglas Knight, I'm not sure what predictions you're referring to. Statistical methods have a good pedigree. I take a correlation to be a correlation and try not to overinterpret it.

    "This makes me tend to doubt any and all ideas which come from a spirit of skepticism towards mainstream science. A more current example would be global warming skepticism, in all its many variants."

    So what are you doing to overcome this bias?

    Also, if you do happen to arrive at the right answer using a suspect methodology, it might be a good idea to inspect this methodology, since it's likely doing something correctly.

    It is for this reason that Robert Aumann, super-smart as he is, should be entirely ignored when he opines about the Palestinian-Israeli conflict.

    http://www.overcomingbias.com/2007/07/theyre-telling-.html

    I actually had a long-time close friend who turned out to be something of a compulsive liar.

    These days I place almost zero weight on anything he stated as fact... But a strange thing happened: some of these things entered my head as cached thoughts and I didn't remember their source at all—just that they were "true". This has caused me some embarrassment in the past.

    These days I often take care to mention what my source is for claims I make, so it's no longer a problem... But the general principle remains that most people don't always know where their "knowledge" came from.


    It's like in programming: objects pointing to different versions of the same parent object, because our subconscious software cloned the parent object instead of keeping a reference to a single copy of it. And now we have some unreviewed code and "belief leaks" (by analogy with memory leaks).
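    A sketch of that analogy in Python (hypothetical structures, just to illustrate the point): a belief held "by reference" tracks the source, while a cached clone keeps its old value even after the source is discredited.

        import copy

        source = {"trusted": True, "claim": "X is true"}

        belief_by_reference = source              # points at the shared object
        belief_by_clone = copy.deepcopy(source)   # an independent cached copy

        source["trusted"] = False                 # the source is discredited

        print(belief_by_reference["trusted"])     # False: the update propagates
        print(belief_by_clone["trusted"])         # True: the "belief leak" persists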