My intuition says presenting bad facts or pieces of reasoning is wrong, but withholding good facts or pieces of reasoning is less wrong. I assume most of you agree.
This is a puzzle, because on the face of it, the effect is the same.
Suppose the Walrus and the Carpenter are talking of whether pigs have wings.
Scenario 1: The Carpenter is 80% sure that pigs have wings, but the Walrus wants him to believe that they don't. So the Walrus claims that it's a deep principle of evolutionary theory that no animal can have wings, and the Carpenter updates to 60%.
Scenario 2: The Carpenter is 60% sure that pigs have wings, and the Walrus wants him to believe that they don't. So the Walrus neglects to mention that he once saw a picture of a winged pig in a book. Learning this would cause the Carpenter to update to 80%, but he doesn't learn it, so he stays at 60%.
In both scenarios, the Walrus chose for the Carpenter's probability to be 60% when he could have chosen for it to be 80%. So what's the difference?
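To make the symmetry concrete, here is a quick sketch in odds form (the framing and the code are mine, not part of the original scenarios): Bayes' theorem says posterior odds = prior odds × likelihood ratio, so we can compute the likelihood ratio implied by each of the Walrus's moves.

```python
from fractions import Fraction

def odds(p):
    """Convert a probability into odds in favor."""
    return p / (1 - p)

def likelihood_ratio(prior, posterior):
    """The likelihood ratio that moves `prior` to `posterior`,
    via posterior_odds = prior_odds * likelihood_ratio."""
    return odds(posterior) / odds(prior)

# Scenario 1: the bogus evolution claim moves the Carpenter from 80% to 60%.
lr_fabricated = likelihood_ratio(Fraction(8, 10), Fraction(6, 10))

# Scenario 2: the withheld winged-pig picture would have moved him from 60% to 80%.
lr_withheld = likelihood_ratio(Fraction(6, 10), Fraction(8, 10))

print(lr_fabricated)                # 3/8
print(lr_withheld)                  # 8/3
print(lr_fabricated * lr_withheld)  # 1 -- exact reciprocals
```

In odds terms the fabricated argument and the withheld picture are mirror images of the same strength, which is exactly why the intuitive asymmetry needs explaining.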
If there isn't any, then we're forced to claim bias (maybe omission bias), which we can then try to overcome.
But in this post I want to try rationalizing the asymmetry. I don't feel that my thinking here is clear, so this is very tentative.
If a man is starving, not giving him a loaf of bread is as deadly as giving him cyanide. But if there are a lot of random objects lying around in the neighborhood, the former deed is less deadly: it's far more likely that one of the random objects is a loaf of bread than that it is an antidote to cyanide.
I believe that, likewise, it is more probable that you'll randomly find a good argument duplicated (conditioning on it makes some future evidence redundant) than that you'll randomly find a bad argument debunked (conditioning on it makes some future counter-evidence relevant). In other words, whether you're uninformed or misinformed, you're equally mistaken; but in an environment where evidence is not independent, it's normally easier to recover from being uninformed than from being misinformed.
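Here's a minimal toy simulation of that claim (the setup and all the numbers are my own invention, chosen only to make the intuition vivid): assume that future evidence frequently restates the good argument you might have been deprived of, but only rarely directly debunks the specific bad argument you might have been fed.

```python
import random

# Toy assumptions: the environment often re-serves a good argument in other
# words, but only rarely serves a direct debunking of a specific bad argument.
P_DUPLICATE = 0.30   # chance a given future item restates the good argument
P_DEBUNK    = 0.05   # chance a given future item debunks the bad argument
STREAM_LEN  = 10     # how many future pieces of evidence each agent sees
TRIALS      = 100_000

def recovers(p_fix):
    """Does at least one of STREAM_LEN items repair the damage,
    if each item does so independently with probability p_fix?"""
    return any(random.random() < p_fix for _ in range(STREAM_LEN))

uninformed_recover = sum(recovers(P_DUPLICATE) for _ in range(TRIALS)) / TRIALS
misinformed_recover = sum(recovers(P_DEBUNK) for _ in range(TRIALS)) / TRIALS

print(f"uninformed agent recovers:  {uninformed_recover:.2%}")   # roughly 97%
print(f"misinformed agent recovers: {misinformed_recover:.2%}")  # roughly 40%
```

The particular numbers carry no weight; the point is only that if restatements of good arguments are more common in the pool than debunkings of specific bad arguments, the "missing a fact" error self-corrects far more often than the "swallowed a falsehood" error.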
The case becomes stronger when you think of it in terms of boundedly rational agents fishing from a common meme pool. If agents can remember or hold in mind fewer pieces of information than they are likely to encounter, pieces of disinformation floating in the pool not only do damage by themselves, but do further damage by displacing pieces of good information.
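A back-of-the-envelope sketch of the displacement effect (again, the setup and numbers are mine): give an agent a fixed number of memory slots and let it fill them at random from the pool.

```python
import random

MEMORY_SLOTS  = 20      # how many memes a bounded agent can hold
POOL_SIZE     = 1000    # memes floating in the common pool
JUNK_FRACTION = 0.3     # fraction of the pool that is disinformation

def sample_memory(junk_fraction):
    """Fill the agent's memory with memes drawn at random from the pool."""
    pool = (["good"] * int(POOL_SIZE * (1 - junk_fraction)) +
            ["junk"] * int(POOL_SIZE * junk_fraction))
    return random.sample(pool, MEMORY_SLOTS)

clean = sample_memory(0.0)
murky = sample_memory(JUNK_FRACTION)

print("good memes held, clean pool:", clean.count("good"))   # 20
print("good memes held, murky pool:", murky.count("good"))   # about 14 on average
# The junk memes do double damage: they mislead directly, and they occupy
# slots that good memes would otherwise have filled.
```

I let the agent sample uniformly; a sharper model would let catchier memes win the slots, which presumably makes the displacement worse, not better.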
These are not the only asymmetries. A banal one is that misinforming takes effort and not informing saves effort. And if you're caught misinforming, that makes you look far worse than if you're caught not informing. (But the question is why this should be so. Part of it is that, usually, there are plausible explanations other than bad faith for why one might not inform -- if not, it's called "lying by omission" -- but no such explanations for why one might misinform.) And no doubt there are yet others.
But I think a major part of it has to be that ignorance heals better than confusion when placed in a bigger pool of evidence. Do you agree? Do you think "lies" are worse than "secrets", and if so, why?
I don't think that the "effort" distinction is banal at all.
The "lying" scenario provides us with much more information about the "liar", than the "keeping secrets" scenario provides us about the "secret keeper". Let me go into this in more detail.
An individual assumes that others have mental states, but that individual has no direct access to those mental states. An individual can only infer mental states through the physical actions of another.
For now, let's assume that an individual who can more accurately infer others' mental states from their actions will be "happier" or "more successful" than an individual who cannot.
So, given this assumption, every individual has an incentive to constantly determine others' mental states, generalize this into some mental stance, and relate that mental state and mental stance back to the individual.
With these brief preliminaries out of the way, let's examine "lying" vs "secrets".
When a person gives you misinformation, the potential liar takes an active role in trying to affect you negatively. The range of potential mental states and mental stances from this information is relatively small. The person can have a mental stance of "looking out for your best interests" (let's call this mental stance "friendliness") and be mistaken, or the person can have a mental stance of "trying to manipulate you" and be lying. The pathway to determine whether a person is "mistaken" or "lying" is relatively straightforward (compared to secrets), and if we can determine "lying" we can take action to change our relationship with the other.
When a person withholds information that may be helpful, however, we have a much stickier situation. The range of potential mental states is much broader in this situation. The person may be unsure of the accuracy of the information. The person may be unsure of the efficacy of the information to you. The person may be unsure of your willingness to receive this information. In other words, there are many reasons a person may refrain from giving you potentially helpful information and still have a mental stance of "friendliness".
And it would be hard to prove that the withholder of information actually has a mental stance of "eneminess".
Thus, when someone withholds information, our line of inquiry and our course of actions are far less clear than when a person gives us misinformation.
So, in summary, the asymmetry between the two situations is an asymmetry of information. The fact that an individual makes an effort to "lie" to us gives us a great deal more information about that individual's mental stance towards us. The person who "keeps a secret", on the other hand, has not given us information about their mental stance towards us.
Hope this helps provoke discussion.
David
----The person may be unsure of your willingness to receive this information. In other words, there are many reasons a person may refrain from giving you potentially helpful information and still have a mental stance of "friendliness".----
I agree. For instance, suppose you know that people would over-value your evidence: what if the Walrus believes that the Carpenter is over-credulous? He thinks that the Carpenter will take the evidence, proclaim 100% certainty that pigs can have wings, and go blow all his money trying to start a flying pig farm. Wa...