My intuition says presenting bad facts or pieces of reasoning is wrong, but withholding good facts or pieces of reasoning is less wrong. I assume most of you agree.
This is a puzzle, because on the face of it, the effect is the same.
Suppose the Walrus and the Carpenter are talking of whether pigs have wings.
Scenario 1: The Carpenter is 80% sure that pigs have wings, but the Walrus wants him to believe that they don't. So the Walrus claims that it's a deep principle of evolutionary theory that no animal can have wings, and the Carpenter updates to 60%.
Scenario 2: The Carpenter is 60% sure that pigs have wings, and the Walrus wants him to believe that they don't. So the Walrus neglects to mention that he once saw a picture of a winged pig in a book. Learning this would cause the Carpenter to update to 80%, but he never hears it, so he stays at 60%.
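As an aside, both updates can be written in odds form. Here is a minimal sketch (the helper functions are my own; the probabilities are the ones stipulated above) that backs out the likelihood ratios the two moves imply:

```python
# Odds-form Bayes, to make the stipulated updates concrete.

def odds(p):
    """Probability -> odds."""
    return p / (1 - p)

def prob(o):
    """Odds -> probability."""
    return o / (1 + o)

# Scenario 1: the Walrus's bogus principle moves the Carpenter 80% -> 60%,
# an implied likelihood ratio of 0.375 against winged pigs.
lr_bogus_claim = odds(0.60) / odds(0.80)        # 1.5 / 4 = 0.375

# Scenario 2: the withheld picture would have moved him 60% -> 80%,
# an implied likelihood ratio of about 2.67 in favor.
lr_withheld_picture = odds(0.80) / odds(0.60)   # 4 / 1.5 ~= 2.67

# Either way the Carpenter ends up at 60% rather than 80%:
print(prob(odds(0.80) * lr_bogus_claim))  # ~0.60 (misinformed: one bad update)
print(prob(odds(0.60)))                   #  0.60 (uninformed: no update at all)
```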
In both scenarios, the Walrus chose for the Carpenter's probability to be 60% when he could have chosen for it to be 80%. So what's the difference?
If there isn't any, then we're forced to put the intuition down to bias (perhaps omission bias), which we can then try to overcome.
But in this post I want to try rationalizing the asymmetry. I don't feel that my thinking here is clear, so this is very tentative.
If a man is starving, not giving him a loaf of bread is as deadly as giving him cyanide. But if there are a lot of random objects lying around in the neighborhood, the former deed is less deadly: it's far more likely that one of the random objects is a loaf of bread than that it is an antidote to cyanide.
I believe that, likewise, it is more probable that you'll randomly find a good argument duplicated (conditioning on it makes some future evidence redundant) than that you'll randomly find a bad argument debunked (conditioning on it makes some future counter-evidence relevant). In other words, whether you're uninformed or misinformed, you're equally mistaken; but in an environment where evidence is not independent, it's normally easier to recover from being uninformed than from being misinformed.
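To test that intuition, here is a toy simulation (my own construction, with made-up parameters): true facts appear in the evidence stream as multiple duplicates, while items that specifically debunk a planted falsehood are rare. An agent who merely missed a fact usually runs into a copy of it later; an agent who absorbed the falsehood usually never runs into the debunking.

```python
import random

# Toy model: K true facts, each duplicated COPIES times in the evidence
# stream, plus a rare item that specifically refutes a planted falsehood.
random.seed(0)
K, COPIES, DEBUNKS = 20, 5, 1
STREAM = [("fact", i) for i in range(K) for _ in range(COPIES)]
STREAM += [("debunk", None)] * DEBUNKS

def recovers(missing_fact, draws):
    """Does an agent who missed one fact later see a duplicate of it?"""
    return ("fact", missing_fact) in random.sample(STREAM, draws)

def corrected(draws):
    """Does an agent who absorbed the falsehood see the debunking?"""
    return ("debunk", None) in random.sample(STREAM, draws)

TRIALS, DRAWS = 10_000, 20
uninformed = sum(recovers(0, DRAWS) for _ in range(TRIALS)) / TRIALS
misinformed = sum(corrected(DRAWS) for _ in range(TRIALS)) / TRIALS
print(f"uninformed agent recovers:  {uninformed:.2f}")   # high: duplicates abound
print(f"misinformed agent recovers: {misinformed:.2f}")  # low: debunkings are rare
```

With these invented parameters the uninformed agent recovers roughly two-thirds of the time and the misinformed agent only about a fifth of the time; the exact numbers don't matter, only that duplication favors the uninformed.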
The case becomes stronger when you think of it in terms of boundedly rational agents fishing from a common meme pool. If agents can remember or hold in mind fewer pieces of information than they are likely to encounter, pieces of disinformation floating in the pool not only do damage by themselves, but do further damage by displacing pieces of good information.
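A sketch of that displacement effect, again with invented parameters: if a bounded agent retains a random subset of what it encounters, every retained piece of disinformation occupies a slot that a piece of good information would otherwise have filled.

```python
import random

# Toy model: an agent can retain only MEMORY items out of a pool in which
# some fraction is disinformation.
random.seed(0)
POOL_SIZE, MEMORY, TRIALS = 1000, 10, 10_000

def good_items_retained(disinfo_fraction):
    pool = (["good"] * int(POOL_SIZE * (1 - disinfo_fraction))
            + ["disinfo"] * int(POOL_SIZE * disinfo_fraction))
    total = 0
    for _ in range(TRIALS):
        retained = random.sample(pool, MEMORY)  # memory fills at random
        total += retained.count("good")
    return total / TRIALS

for frac in (0.0, 0.2, 0.5):
    print(f"{frac:.0%} disinfo -> {good_items_retained(frac):.1f} of "
          f"{MEMORY} remembered items are good")
```

The crowding-out is linear here only because retention is random; an agent that could screen for quality would do better, but screening itself costs the bounded resources the model is about.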
These are not the only asymmetries. A banal one is that misinforming takes effort while not informing saves effort. And if you're caught misinforming, you look far worse than if you're caught not informing. (But the question is why this should be so. Part of it is that there are usually plausible explanations other than bad faith for why one might not inform -- when there aren't, we call it "lying by omission" -- whereas there are no such explanations for why one might misinform.) And no doubt there are yet others.
But I think a major part of it has to be that ignorance heals better than confusion when placed in a bigger pool of evidence. Do you agree? Do you think "lies" are worse than "secrets", and if so, why?
The main difference is the action and effort involved; other differences are the intent that can be inferred, and the consequences. Very few moral systems demand action, except perhaps in very limited circumstances, while every moral system I know forbids some actions. The reason is quite obvious: requiring positive action would consume your entire life, while forbidding harmful action leaves people mostly free to do as they like, and is also far easier to enforce.
There are circumstances where keeping secrets is, in people's minds, roughly equivalent to lying. For example: a witness in court who leaves things out; an expert who advises a course of action without mentioning its possible negative consequences (even if they deem it the best option); friends who notice something obviously wrong but easily correctable (e.g. about your appearance, or that you forgot your purse) and say nothing -- you'd expect them to correct you, and would be angry if you thought they noticed and stayed silent. These cases seem to have in common that 1) there is a clear information asymmetry, and 2) someone has been singled out.
I would also like to point out that proper lying by omission usually involves more effort than plain lying (if your objective is to change someone's beliefs), because it requires a whole lot of true or misinterpretable statements to achieve the same effect as one blatant lie. And it takes even more effort to do it in a way that wouldn't make it obvious what you did if your mark found out the truth.