Saying that X asserting A provides very weak evidence for A should not be confused with saying Y asserting B provides evidence for not-B. One claim is about magnitude while the other is about sign. In most situations, the latter commits one to violating conservation of evidence, but the former does not.
There are circumstances where X asserting A provides evidence against A (for not-A). Some speakers are less reliable than others, and there is no necessary reason a speaker can't be so unreliable that her claims provide zero evidence for or against what she asserts. Moreover, there is no reason a speaker can't be anti-reliable. Perhaps she is a pathological liar. In this case her statements are inversely correlated with the truth, and an assertion of A should be taken as evidence for not-A. As long as your math is right, there is no reason for this to violate conservation of evidence.
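To see why the math works out, here is a minimal sketch of Bayes' rule applied to an anti-reliable speaker (all the probabilities are made-up numbers for illustration). Her assertion of A lowers the probability of A, yet the posterior over A and not-A still sums to 1, so conservation of evidence is respected:

```python
def posterior_given_assertion(prior_a, p_assert_given_a, p_assert_given_not_a):
    """P(A | speaker asserts A), computed via Bayes' rule."""
    numerator = p_assert_given_a * prior_a
    denominator = numerator + p_assert_given_not_a * (1 - prior_a)
    return numerator / denominator

# A pathological liar (assumed numbers): she asserts A only 10% of the
# time when A is true, but 90% of the time when A is false.
prior = 0.5
post = posterior_given_assertion(prior, 0.1, 0.9)
print(post)  # 0.1 -- her asserting A is evidence *for* not-A
```

The posterior drops from 0.5 to 0.1, but nothing is lost: the probability mass simply moves to not-A, which is exactly what updating on an anti-reliable source should do.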
Eliezer makes this point in his Less Wrong post Reversed Stupidity Is Not Intelligence. I think the titular statement makes a correct point - don't dismiss an idea just because a few proponents are stupid - but it is too strong as written. In some cases, we can derive information about the truth of a proposition by psychoanalyzing the reasons people believe it.
There are certain propositions that people are likely to assert regardless of whether or not they are true. Maybe they're useful for status disputes, or part of a community membership test, or just synchronize well with particular human biases. "X proves the existence of God" commonly gets asserted whether or not X actually proves the existence of God. Anything that supports one race, gender, political party, or ideology over another is also suspect. Let's call these sorts of propositions "popular claims". Some true propositions might be popular claims, but popular claims are popular whether or not they are true.
Some popular claims are surprising. Without knowing anything about modern society, one might not predict that diluting chemicals thousands of times to cure diseases, or claiming the government is hiding alien bodies, would be common failure modes. You don't know these are popular claims until you hear them.
If a very large group of people make a certain assertion, and you always find it to be false, you now have very good evidence that it's a popular claim, a proposition that people will very often assert even if it's false.
Normally, when someone asserts a proposition, you assume they have good evidence for it - in Bayesian terms, the probability that they would assert it is higher if there is evidence than if there is not evidence. But people assert popular claims very often even when there is no evidence for them, so someone asserting a popular claim provides no (or little) evidence for it, leaving you back at whatever its prior is.
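The Bayesian point above can be sketched with toy numbers (all assumed, purely for illustration). For an ordinary claim, people assert it far more often when it's true than when it's false, so the likelihood ratio is large and an assertion moves you well above your prior. For a popular claim, the assertion rates are nearly the same either way, so the likelihood ratio is close to 1 and you stay near the prior:

```python
def posterior(prior, p_assert_if_true, p_assert_if_false):
    """P(claim | someone asserts it), via Bayes' rule."""
    num = p_assert_if_true * prior
    return num / (num + p_assert_if_false * (1 - prior))

prior = 0.01  # low prior for a crazy-sounding proposition (assumed)

# Ordinary claim: rarely asserted without evidence.
ordinary = posterior(prior, 0.9, 0.05)
# Popular claim: asserted at almost the same rate whether true or false.
popular = posterior(prior, 0.9, 0.8)

print(ordinary)  # ~0.154: the assertion is substantial evidence
print(popular)   # ~0.011: the assertion leaves you near the prior
```

The only moving part is P(assert | false): as it climbs toward P(assert | true), the evidential value of hearing the assertion evaporates.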
Time for an example: suppose two respected archaeologists (who happen to be Mormon) publish two papers on the same day. The first archaeologist claims to have found evidence that Native Americans are descended from ancient Israelites. The second archaeologist claims to have found evidence that Zulus are descended from Australian aborigines.
On the face of it, these two claims are about equally crazy-sounding. But I would be much more likely to pay attention to the claim that the Zulus are descended from aborigines. I know that the Mormons have a bias in favor of believing Indians are descended from Israelites, and probably whatever new evidence the archaeologist thinks she's found is just a product of that same bias. But no one believes Zulus are descended from Australians. If someone claims they are, she must have some new and interesting reason to think so.
(To put it another way, we expect a Mormon to privilege the hypothesis of Israelite descent; her religion has already picked it out of hypothesis-space. We don't expect a Mormon to privilege the hypothesis of Australian descent, so it's more likely that she came to it honestly.)
If I were then to learn that there was a large community of Mormons who interpreted their scripture to say that Zulus were descended from Australians, I would consider it much more likely that the second archaeologist was also just parroting a religious bias, and I would no longer be quite as interested in reading her paper.
In this case, reversed stupidity is intelligence - learning that many people believed in an Australian-Zulu connection for religious reasons decreases my probability that the posited Australian-Zulu connection is real. I can never go lower than whatever my prior for an Australian-Zulu connection would be, but I can discount a lot of the evidence that might otherwise take me above my prior.
So in summary, a proposition asserted for stupid reasons can raise your probability that it is the sort of proposition that people assert for stupid reasons, which in turn can lower your probability that the next person to assert it will have smart reasons for doing so. Reversed stupidity can never bring the probability of an idea lower than its prior, but it can help you discount evidence that would otherwise bring it higher.
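This summary can be illustrated numerically (again with assumed numbers). Learning that a claim is popular raises your estimate of P(assert | false), which shrinks the likelihood ratio toward 1; the posterior falls back toward the prior, but as long as assertions don't actively anticorrelate with truth, it never falls below it:

```python
def posterior(prior, p_assert_if_true, p_assert_if_false):
    """P(claim | someone asserts it), via Bayes' rule."""
    num = p_assert_if_true * prior
    return num / (num + p_assert_if_false * (1 - prior))

prior = 0.01    # assumed prior for the claim
p_true = 0.9    # assumed P(assert | true)

# Discovering the claim is "popular" pushes P(assert | false) upward.
for p_false in (0.05, 0.3, 0.6, 0.9):
    print(p_false, posterior(prior, p_true, p_false))
# As P(assert | false) climbs to 0.9, the posterior drops back to
# 0.01 -- the prior -- but reversed stupidity alone never pushes it lower.
```

The floor at the prior corresponds to the likelihood ratio bottoming out at 1: the worst reversed stupidity can do (short of genuine anti-reliability) is make the assertion worthless as evidence.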