My intuition says presenting false facts or bad pieces of reasoning is wrong, but withholding true facts or good pieces of reasoning is less wrong. I assume most of you agree.
This is a puzzle, because on the face of it, the effect is the same.
Suppose the Walrus and the Carpenter are talking of whether pigs have wings.
Scenario 1: The Carpenter is 80% sure that pigs have wings, but the Walrus wants him to believe that they don't. So the Walrus claims that it's a deep principle of evolutionary theory that no animal can have wings, and the Carpenter updates to 60%.
Scenario 2: The Carpenter is 60% sure that pigs have wings, and the Walrus wants him to believe that they don't. So the Walrus neglects to mention that he once saw a picture of a winged pig in a book. Learning this would cause the Carpenter to update to 80%, but since he never learns it, he stays at 60%.
In both scenarios, the Walrus chose for the Carpenter's probability to be 60% when he could have chosen for it to be 80%. So what's the difference?
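In odds terms the symmetry is explicit. A quick check of the arithmetic (a sketch of my own; the scenarios only state the percentages):

```python
def odds(p):
    return p / (1 - p)

# Scenario 1: the bogus evolution argument moves the Carpenter 0.8 -> 0.6.
lr_bogus = odds(0.6) / odds(0.8)    # 1.5 / 4 = 0.375

# Scenario 2: the withheld picture would have moved him 0.6 -> 0.8.
lr_picture = odds(0.8) / odds(0.6)  # 4 / 1.5 ~= 2.67

print(lr_bogus, lr_picture, lr_bogus * lr_picture)  # 0.375, ~2.67, ~1.0
```

The two moves carry exactly reciprocal likelihood ratios: mirror images, as far as the Carpenter's final belief is concerned.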
If there isn't any, then we're forced to write the intuition off as a bias (omission bias, perhaps), which we can then try to overcome.
But in this post I want to try rationalizing the asymmetry. I don't feel that my thinking here is clear, so this is very tentative.
If a man is starving, not giving him a loaf of bread is as deadly as giving him cyanide. But if there are a lot of random objects lying around in the neighborhood, the former deed is less deadly: it's far more likely that one of the random objects is a loaf of bread than that it is an antidote to cyanide.
I believe that, likewise, it is more probable that you'll randomly find a good argument duplicated (conditioning on it makes some future evidence redundant) than that you'll randomly find a bad argument debunked (conditioning on it makes some future counter-evidence relevant). In other words, whether you're uninformed or misinformed, you're equally mistaken; but in an environment where evidence is not independent, it's normally easier to recover from being uninformed than from being misinformed.
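To make the recovery claim concrete, here is a toy Monte Carlo sketch. Everything in it (the pool of K interchangeable arguments, the debunk rate) is an assumption of mine, not something established above; the point is only that waiting for a common duplicate is much faster than waiting for a rare debunking.

```python
import random

# Toy model: the meme pool holds K distinct genuine arguments, each
# equally common, while a debunking of any *specific* piece of
# misinformation is rare. An uninformed agent recovers when a random
# draw happens to be the one argument it is missing; a misinformed
# agent recovers only when a draw is the relevant debunk.

K = 10               # distinct genuine arguments in the pool (assumed)
DEBUNK_RATE = 0.01   # chance a random draw debunks the specific lie (assumed)
TRIALS = 10_000

def draws_until(p):
    """Geometric waiting time for an event with per-draw probability p."""
    n = 1
    while random.random() >= p:
        n += 1
    return n

uninformed = sum(draws_until(1 / K) for _ in range(TRIALS)) / TRIALS
misinformed = sum(draws_until(DEBUNK_RATE) for _ in range(TRIALS)) / TRIALS

print("mean draws to recover when uninformed: ", uninformed)    # ~K = 10
print("mean draws to recover when misinformed:", misinformed)   # ~1/0.01 = 100
```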
The case becomes stronger when you think of it in terms of boundedly rational agents fishing from a common meme pool. If agents can remember or hold in mind fewer pieces of information than they are likely to encounter, pieces of disinformation floating in the pool not only do damage by themselves, but do further damage by displacing pieces of good information.
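A small extension of the same toy model shows the displacement effect; again the memory size, the stream composition, and the eviction rule are assumptions of mine.

```python
import random
from collections import deque

# Bound the agent's memory at M slots. Genuine arguments repeat (there
# are only K of them), while disinformation is mostly novel, so junk
# keeps claiming slots and crowds the genuine arguments out.

K = 10              # distinct genuine arguments (assumed)
M = 7               # memory capacity (assumed)
DISINFO_FRAC = 0.3  # fraction of encountered memes that are junk (assumed)
STEPS = 1_000

def genuine_held():
    memory = deque(maxlen=M)  # bounded memory: oldest meme gets evicted
    for _ in range(STEPS):
        if random.random() < DISINFO_FRAC:
            meme = ("disinfo", random.randrange(10**6))  # mostly novel junk
        else:
            meme = ("genuine", random.randrange(K))
        if meme not in memory:  # a duplicate is redundant, so skip it
            memory.append(meme)
    return sum(kind == "genuine" for kind, _ in memory)

runs = [genuine_held() for _ in range(1_000)]
print("genuine arguments held on average:", sum(runs) / len(runs), "of", M)
```

Because the genuine arguments repeat while the junk is always new, disinformation ends up holding more than its 30% share of the slots: the very duplicates that rescued the uninformed agent above are what gets crowded out.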
These are not the only asymmetries. A banal one is that misinforming takes effort and not informing saves effort. And if you're caught misinforming, that makes you look far worse than if you're caught not informing. (But the question is why this should be so. Part of it is that, usually, there are plausible explanations other than bad faith for why one might not inform -- if not, it's called "lying by omission" -- but no such explanations for why one might misinform.) And no doubt there are yet others.
But I think a major part of it has to be that ignorance heals better than confusion once both are immersed in a larger pool of evidence. Do you agree? Do you think "lies" are worse than "secrets", and if so, why?