If you once tell a lie, the truth is ever after your enemy.
I have discussed the notion that lies are contagious. If you pick up a pebble from the driveway, and tell a geologist that you found it on a beach—well, do you know what a geologist knows about rocks? I don’t. But I can suspect that a water-worn pebble wouldn’t look like a droplet of frozen lava from a volcanic eruption. Do you know where the pebble in your driveway really came from? Things bear the marks of their places in a lawful universe; in that web, a lie is out of place.1
What sounds like an arbitrary truth to one mind—one that could easily be replaced by a plausible lie—might be nailed down by a dozen linkages to the eyes of greater knowledge. To a creationist, the idea that life was shaped by “intelligent design” instead of “natural selection” might sound like a sports team to cheer for. To a biologist, plausibly arguing that an organism was intelligently designed would require lying about almost every facet of the organism. To plausibly argue that “humans” were intelligently designed, you’d have to lie about the design of the human retina, the architecture of the human brain, the proteins bound together by weak van der Waals forces instead of strong covalent bonds . . .
Or you could just lie about evolutionary theory, which is the path taken by most creationists. Instead of lying about the connected nodes in the network, they lie about the general laws governing the links.
And then to cover that up, they lie about the rules of science—like what it means to call something a “theory,” or what it means for a scientist to say that they are not absolutely certain.
So they pass from lying about specific facts, to lying about general laws, to lying about the rules of reasoning. To lie about whether humans evolved, you must lie about evolution; and then you have to lie about the rules of science that constrain our understanding of evolution.
But how else? Just as a human would be out of place in a community of actually intelligently designed life forms, and you have to lie about the rules of evolution to make it appear otherwise, so too beliefs about creationism are themselves out of place in science—you wouldn’t find them in a well-ordered mind any more than you’d find palm trees growing on a glacier. And so you have to disrupt the barriers that would forbid them.
Which brings us to the case of self-deception.
A single lie you tell yourself may seem plausible enough, when you don’t know any of the rules governing thoughts, or even that there are rules; and the choice seems as arbitrary as choosing a flavor of ice cream, as isolated as a pebble on the shore . . .
. . . but then someone calls you on your belief, using the rules of reasoning that they’ve learned. They say, “Where’s your evidence?”
And you say, “What? Why do I need evidence?”
So they say, “In general, beliefs require evidence.”
This argument, clearly, is a soldier fighting on the other side, which you must defeat. So you say: “I disagree! Not all beliefs require evidence. In particular, beliefs about dragons don’t require evidence. When it comes to dragons, you’re allowed to believe anything you like. So I don’t need evidence to believe there’s a dragon in my garage.”
And the one says, “Eh? You can’t just exclude dragons like that. There’s a reason for the rule that beliefs require evidence. To draw a correct map of the city, you have to walk through the streets and make lines on paper that correspond to what you see. That’s not an arbitrary legal requirement—if you sit in your living room and draw lines on the paper at random, the map’s going to be wrong. With extremely high probability. That’s as true of a map of a dragon as it is of anything.”
So now this, the explanation of why beliefs require evidence, is also an opposing soldier. So you say: “Wrong with extremely high probability? Then there’s still a chance, right? I don’t have to believe if it’s not absolutely certain.”
Or maybe you even begin to suspect, yourself, that “beliefs require evidence.” But this threatens a lie you hold precious; so you reject the dawn inside you, push the Sun back under the horizon.
Or you’ve previously heard the proverb “beliefs require evidence,” and it sounded wise enough, and you endorsed it in public. But it never quite occurred to you, until someone else brought it to your attention, that this proverb could apply to your belief that there’s a dragon in your garage. So you think fast and say, “The dragon is in a separate magisterium.”
Having false beliefs isn’t a good thing, but it doesn’t have to be permanently crippling—if, when you discover your mistake, you get over it. The dangerous thing is to have a false belief that you believe should be protected as a belief—a belief-in-belief, whether or not accompanied by actual belief.
A single Lie That Must Be Protected can block someone’s progress into advanced rationality. No, it’s not harmless fun.
Just as the world itself is more tangled by far than it appears on the surface, so too there are stricter rules of reasoning, constraining belief more strongly, than the untrained would suspect. The world is woven tightly, governed by general laws, and so are rational beliefs.
Think of what it would take to deny evolution or heliocentrism—all the connected truths and governing laws you wouldn’t be allowed to know. Then you can imagine how a single act of self-deception can block off the whole meta level of truth-seeking, once your mind begins to be threatened by seeing the connections. Forbidding all the intermediate and higher levels of the rationalist’s Art. Creating, in its stead, a vast complex of anti-law, rules of anti-thought, general justifications for believing the untrue.
Steven Kaas said, “Promoting less than maximally accurate beliefs is an act of sabotage. Don’t do it to anyone unless you’d also slash their tires.” Giving someone a false belief to protect—convincing them that the belief itself must be defended from any thought that seems to threaten it—well, you shouldn’t do that to someone unless you’d also give them a frontal lobotomy.
Once you tell a lie, the truth is your enemy; and every truth connected to that truth, and every ally of truth in general; all of these you must oppose, to protect the lie. Whether you’re lying to others, or to yourself.
You have to deny that beliefs require evidence, and then you have to deny that maps should reflect territories, and then you have to deny that truth is a good thing . . .
Thus comes into being the Dark Side.
I worry that people aren’t aware of it, or aren’t sufficiently wary—that as we wander through our human world, we can expect to encounter systematically bad epistemology.
The “how to think” memes floating around, the cached thoughts of Deep Wisdom—some of it will be good advice devised by rationalists. But other notions were invented to protect a lie or self-deception: spawned from the Dark Side.
“Everyone has a right to their own opinion.” When you think about it, where was that proverb generated? Is it something that someone would say in the course of protecting a truth, or in the course of protecting themselves from the truth? But people don’t perk up and say, “Aha! I sense the presence of the Dark Side!” As far as I can tell, it’s not widely realized that the Dark Side is out there.
But how else? Whether you’re deceiving others, or just yourself, the Lie That Must Be Protected will propagate recursively through the network of empirical causality, and the network of general empirical rules, and the rules of reasoning themselves, and the understanding behind those rules. If there is good epistemology in the world, and also lies or self-deceptions that people are trying to protect, then there will come into existence bad epistemology to counter the good. We could hardly expect, in this world, to find the Light Side without the Dark Side; there is the Sun, and that which shrinks away and generates a cloaking Shadow.
Mind you, these are not necessarily evil people. The vast majority who go about repeating the Deep Wisdom are more duped than duplicitous, more self-deceived than deceiving. I think.
And it’s surely not my intent to offer you a Fully General Counterargument, so that whenever someone offers you some epistemology you don’t like, you say: “Oh, someone on the Dark Side made that up.” It’s one of the rules of the Light Side that you have to refute the proposition for itself, not by accusing its inventor of bad intentions.
But the Dark Side is out there. Fear is the path that leads to it, and one betrayal can turn you. Not all who wear robes are either Jedi or fakes; there are also the Sith Lords, masters and unwitting apprentices. Be warned; be wary.
As for listing common memes that were spawned by the Dark Side—not random false beliefs, mind you, but bad epistemology, the Generic Defenses of Fail—well, would you care to take a stab at it, dear readers?
1. Actually, a geologist in the comments says that most pebbles in driveways are taken from beaches, so they couldn’t tell the difference between a driveway pebble and a beach pebble, but they could tell the difference between a mountain pebble and a driveway/beach pebble (http://lesswrong.com/lw/uy/dark_side_epistemology/4xbv). Case in point . . .
How about "Comparing Apples and Oranges," or "How Dare You Compare," a misrepresentation of the scope of analogies? For a recent example, see the response to John Lewis's drawing an analogy between certain aspects of the McCain campaign and those of George Wallace. The response is not a consideration of the scope and aptness of the analogy, but a rejection of the idea that any analogy at all can be drawn between two subjects when one is so generally recognized to be Evil. The McCain campaign does not attempt to differentiate the aspects under analogy (rhetoric and its potential for fomenting violence) from those of Wallace, but rather condemns the idea that the analogy can be considered at all. Under the epistemology of Fail, any difference between two subjects of comparison is enough to reject the comparison's validity, regardless of the relevance of that difference to the actual point being compared. See also: Godwin's Law.
Some self-entitled males like to use this one, particularly in defense of the notion that one has an inviolate right to make sexual advances toward other people regardless of circumstance or outward sign. Sooner or later, after I demonstrate how each of their justifications also justifies sexual assault, it leads to "how dare you compare me to a rapist," which is where the fun begins. After I am done epistemologically belittling them, I point out that the obvious fact that sexual assault is known to be bad is a manifestation of general principles of ethical interaction among humans, not a special case handed down from a God who says that everything not expressly forbidden by a law is good.
Somehow I doubt that "regardless of circumstance or outward sign" is their wording and not yours.
(Edit) Also, the counterpart of "not everything that is not expressly forbidden by a law is good" is "not everything that causes the slightest incidental harm is unforgivable baby-eating evil."