Generally, if you're given evidence for something, the evidence-giver is trying to convince you of it. If you're given only weak evidence, that is itself evidence that no strong evidence exists (if it did, why didn't they give you that instead?), so in some circumstances it can be rational to revise your probability estimate downward.
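A toy numeric sketch of this argument (all the probabilities below are illustrative assumptions, not from the paper): if you model the advocate as always reporting their strongest available evidence, then "they only gave me weak evidence" carries information of its own, and the update can flip direction.

```python
# Toy model of filtered evidence. Assumption: an advocate always reports
# the strongest evidence available to them for hypothesis H.

prior_h = 0.5

# Assumed: strong evidence is usually available when H is true,
# and rarely available when H is false.
p_strong_given_h = 0.8
p_strong_given_not_h = 0.2

# Naive update: treat the weak item in isolation, with an assumed
# likelihood ratio of 2:1 in favour of H.
lr_weak_alone = 2.0
naive_posterior = (prior_h * lr_weak_alone) / (
    prior_h * lr_weak_alone + (1 - prior_h)
)

# Filtered update: observing a weak report means no strong evidence
# was available to the advocate.
p_weak_report_given_h = 1 - p_strong_given_h          # 0.2
p_weak_report_given_not_h = 1 - p_strong_given_not_h  # 0.8
filtered_posterior = (prior_h * p_weak_report_given_h) / (
    prior_h * p_weak_report_given_h
    + (1 - prior_h) * p_weak_report_given_not_h
)

print(f"naive P(H | weak evidence alone): {naive_posterior:.2f}")   # 0.67
print(f"filtered P(H | only weak report): {filtered_posterior:.2f}")  # 0.20
```

Under these made-up numbers, weak evidence taken in isolation raises P(H) from 0.5 to about 0.67, but once you condition on the reporting process, the same observation lowers it to 0.20.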
Sure, this makes perfect sense in a political environment - or in the ancestral environment, where I'm sure this kind of thing mattered a great deal for reproductive success (I could even take a shot at an evolutionary argument for this kind of instinct!). But that instinct is a net positive only in political situations; our current environment involves far more factual uncertainty than political uncertainty. This may make the instinct a net negative.
Is that true? Surely even on a purely factual matter, it is still the case that whoever makes a claim will typically give their best evidence for it, so if the best evidence offered is weak, that still suggests stronger evidence doesn't exist.
> that he who makes a claim, will typically give his best evidence for the claim, so if the best evidence offered is weak,
If a person is making a claim to you, and it matters whether that claim is right or wrong, things are already pretty political! I was thinking of a scientific study providing weak evidence in favour of something, and this heuristic hurting our estimates.
Also in this case, we have evidence that there is only token support in congress for public measures to improve adoption. I'm kind of surprised the control group found this evidence to be a net positive, really. And I wonder whether evidence gets evaluated a little differently when people have to use it rather than just evaluate it.
When I read the title I had expected that this was the point of the post. Perhaps because I've been intending to write a post to that effect for the last three years or so.
So I agree completely.
Eliezer's post "What Evidence Filtered Evidence?" deals with this idea.
Bayesians should update not just on the signs and portents that a person has reported to them, but also on the chain of cause and effect that led the person to report that evidence. So the paper is wrong, and what Khoth said is sensible.
This actually feels like a variant on overjustification. In the absence of evidence, people are content with their belief on whatever intuitive basis led them to adopt it in the first place. Provided weak evidence from an external source, they consciously reach for a better reason to believe and don't find one.
Article: Weak supporting evidence can undermine belief in an outcome
Paper: When good evidence goes bad: The weak evidence effect in judgment and decision-making
Abstract: