There is a classic distinction between epistemic and instrumental rationality, and I like to think of epistemic rationality as a special case of instrumental rationality in which map-territory correspondence is the terminal goal rather than an instrumental one.

From LessWrong's Wiki:

Epistemic rationality [...] can be seen as a form of instrumental rationality in which knowledge and truth are goals in themselves, whereas in other forms of instrumental rationality, knowledge and truth are only potential aids to achieving goals. Someone practicing instrumental rationality might even find falsehood useful.

What a pure and beautiful creature the epistemic rationalist is. Imagine how useful it would be if everyone thought I was one.

Indeed, it is instrumentally useful for instrumental rationalists to portray themselves as epistemic rationalists. And so this is a common pattern in human politics - "[insert political coalition] cares only about itself, while [insert political coalition] is merely trying to spread truth" is one of the great political clichés for a reason. And because believing one's own lies can be instrumentally useful, falsely believing oneself to have a holy devotion to the truth is a not-uncommon delusion.

I try to disabuse myself of this delusion.

To the extent I am rational at all, I am instrumentally rational. I do not want to hold beliefs that will cause me harm, even if they are true. Having said that, to the extent I can modify myself to safely hold these true beliefs, I would like to do so. 

What does it look like to fail at this? 

The Emperor's New Clothes is a nice parable, though the ending needs to be modified: the naively epistemically rational little boy is beaten to death, or perhaps institutionalized, and the Emperor's legendary invisible fabric becomes ritually significant for generations.

We can look at this boy as a hero who bravely attempted to end preference falsification, as a fool who naively believed everyone's lie about terminally caring about the truth, or simply as a tragic figure too young to understand the full weight of his actions.

For the purposes of this post, I want you to see him as a rationalist who has discovered a taboo truth, and feels an unfortunate urge to shout it from the rooftops. This is almost always a pathological urge, a symptom of being in the Valley of Bad Rationality.

To the extent one desires to change social consensus on taboo topics, this should be done strategically and very, very carefully, by those with the social and political clout needed to get away with it. 

If you want to be the hero who ends preference falsification on your taboo topic of choice, your first goal should be acquiring social and material capital. But note that it is easy to delude yourself into thinking you have more clout than you do, and it is often wiser to support the efforts of someone more capable or more specialized than yourself.

And indeed, it is often wiser still to bow out entirely.

It is worth asking yourself why you are focused on this particular taboo truth. Why the Emperor's clothes in particular? Why not the Pope's supposed relationship with god? Why not the Cobbler's guild's propaganda on the safety hazards of unauthorized shoe repair? Is it possible you are just latching on to the first "Big Lie" you have encountered since you read a bunch about the virtue of truth? 

And should you think your taboo truth is of great importance, ask yourself again whether action on your part is necessary, and if so, what sort of action.

Noble lies are usually cheap signals for the individual, even if deleterious for society as a whole. Is technology changing material conditions in such a way as to make this signal no longer cheap for the individual? Would you be advocating, at great personal cost, for the inevitable? If technology is not yet doing so, could it? Is there a less socially legible path to the same end than public advocacy?

If you realize public advocacy of your taboo truth is not worth the trouble, pretend to believe in the noble lie if convenient. And if you can pretend without allowing yourself to believe your own pretense (which few can), even better.

When falsehoods are requisite, hypocrisy is often a virtue.

Comments (16)

I think this is impossible to model very well without acknowledging the difference between private beliefs and public statements.  Noticing hypocrisy in others does not require pointing it out (though it's often a kindness, and sometimes a profitable move, to do so).  Noticing hypocrisy in oneself (difference between internal beliefs and actions) can be addressed quietly rather than in public. Intentional hypocrisy in oneself (saying things that differ from your true beliefs) can be rational as well.

Generally speaking, habitually lying to others takes more effort and has pretty common epistemic failure modes.  For tactical reasons, it's nice to strive for honesty and public truth.  But it's not actually required for rationality.

There's also a question of context and open but non-published truth.  There are plenty of taboo truths (well, not Truth exactly, more like taboo useful models) that I'm willing to explore among small groups where nuance and acknowledgement of uncertainty are common, but very reluctant to get into in any public sphere where I don't know and trust everyone involved.

Indeed, it is instrumentally useful for instrumental rationalists to portray themselves as epistemic rationalists. And so this is a common pattern in human politics - "[insert political coalition] cares only about itself, while [insert political coalition] is merely trying to spread truth" is one of the great political clichés for a reason. And because believing one's own lies can be instrumentally useful, falsely believing oneself to have a holy devotion to the truth is a not-uncommon delusion.

I try to disabuse myself of this delusion.

There's a subtle paradox here. Can you spot it?

He is trying to disabuse himself of the premise [X] that he is committed to the truth over socially useful falsehoods. But that premise [X] is itself socially useful to believe, and he claims it is false; so disbelieving it would show that he does sometimes value the truth over socially useful falsehoods, contradicting the point.
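To make the structure explicit, here is a minimal propositional sketch of the commenter's argument, under the simplifying assumption that beliefs are selected either for truth or for social usefulness (the formal labels are ours, not the commenter's):

\[
\begin{aligned}
&X := \text{``he values truth over socially useful falsehoods.''}\\
&\text{Premise 1: believing } X \text{ is socially useful.}\\
&\text{Premise 2: he asserts } \neg X.\\
&\text{Assume } \neg X\text{: then he selects beliefs for social usefulness, so by Premise 1 he affirms } X.\\
&\text{But by Premise 2 he denies } X\text{, so this belief was not selected for usefulness, i.e., } X.\\
&\text{Hence } \neg X \Rightarrow X\text{, and } X \text{ follows.}
\end{aligned}
\]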

More specifically, there are three possibilities here:

  1. X is broadly true. He's just wrong about X, but his statement that X is false is not socially motivated.
  2. X is usually false, but his statements about X are a special case for some reason.
  3. X is false, but his statement that X is false doesn't contradict this because denying X is actually the socially useful thing, rather than affirming X. Lesswrong might be the kind of place where denying X (saying that you are committed to spreading socially useful falsehoods over the truth) actually gets you social credit, because readers interpret affirming X as the thing that gets you social credit, so denying it is interpreted as a signal that you are committed to saying the taboo truth (not-X) over what is socially useful (X), the exact opposite of what was stated. If true, this would be quite ironic. This interpretation is self-refuting in multiple ways, both logically (for not-X to be a "taboo truth", X has to be false, which already rules out the conclusion of this line of reasoning) and causally (if everyone uses this logic, the premise that affirming X is socially useful becomes false, because denying X becomes the socially useful thing.) But that doesn't mean readers couldn't actually be drawing this conclusion without noticing the problems.


What if you feel the need to proclaim your taboo truth, but you aren't quite sure what it is yet?

Then you will find yourself drawn to those situations in which you can proclaim basically any truth without fear of undue repercussion, and once you are there you will find yourself pontificating.

Is this a good position to be in? Knowing the taboo truth in more detail might force you to proclaim it earlier, or else to give up on it, which is presumably why its details have been occluded from you.

(Incidentally, I at first read the title as "apply Rationalist Taboo to the word 'truth'", which could also be an interesting exercise.)

It is worth asking yourself why you are focused on this particular taboo truth. Why the Emperor's clothes in particular? Why not the Pope's supposed relationship with god? Why not the Cobbler's guild's propaganda on the safety hazards of unauthorized shoe repair? Is it possible you are just latching on to the first "Big Lie" you have encountered since you read a bunch about the virtue of truth? 

I've talked loudly about lots of taboo truths...

Where? I've had a post draft about it forever, but figured it would just be knee-jerk downvoted as non-compliant with the prevailing attitudes of epistemic rationality as a search for truth.

It's often hard to get a good handle on a proposition if you don't feel able to talk about it with people who disagree. I've offered in the past to privately go over any potentially dangerous ideas anyone thinks they've found.

Lies which coincide with the enforcement of taboos, or lies which misrepresent the character of people thereby destroying ideal speech situations, are never noble lies. This is not a hard case to make.

I agree with the sentiment here, but I'm not sure the factual content of this is really significant, since I think it's an argument from the definition of "noble lie".

Thinking about it, it seems that if a person desires to point out a taboo truth without being exposed to the potential social/political repercussions, a safer way to do so would be to privately point out the taboo truth to another person who is unaware of said social/political repercussions, and encourage them to point it out instead.

Are you suggesting just letting someone oblivious take the fall or am I misunderstanding?

You could have far more latitude speaking up and making your case on a sensitive topic someone else raised, even if you privately gave them the idea.  Taboos can only really be enforced so long as common knowledge exists that this-is-a-taboo-we're-enforcing; it doesn't take much to destroy common knowledge about any social assumption.

This is true, but if the goal is to minimize risk, it's hard to do. It depends in part on the size, coordination, power, and zeal of those who proclaim the falsehood. 

The Catholic Church didn't bother prohibiting Copernicus' ideas until 70+ years after his death, when the Reformation was underway and the Church was less obviously dominant and could less easily tolerate internal dissent, and also after Kepler and Brahe had put heliocentrism on a sounder empirical and theoretical footing and associated it with the Protestants. That's why (in addition to annoying and insulting those in power) Galileo was put under house arrest for speaking up, even though he technically published his Dialogue with permission from the Inquisition.

What size coalition do you need to quietly assemble, without being caught or anyone breaking the silence, and with what kind of structure and voice and power, in order to speak up relatively safely against a taboo? I obviously don't expect anything like a closed-form analytic answer, but unless there's some easily-describable-and-communicable heuristic (with plausible deniability, since this is the kind of thing that can come to be looked down on due to association with taboo truths), we're right in the realm of expecting scientists to have, on average, an unreasonable level of political savvy.

I think a key distinction here is that any of this only helps if people care more about the truth of the issue at hand than about whatever realpolitik considerations the issue has tangentially gotten pulled into.  And yeah, absent "unreasonable levels of political savvy", academics mostly rely on academic issues usually being far enough from the icky world of politics to be openly discussed, at least outside of a few seriously diseased disciplines where the rot is well and truly set in.  The powers that be seem to care about the truth of an issue only when it starts directly impinging on their day-to-day; people seem to find it noteworthy when this isn't true of a given leader.

I don't think this will ever be fully predictable.  E.g. in the US I don't think anyone really saw the magnitude of the backlash against election workers, academics, and security folks coming until it became headline news.  And arguably that's what a near-miss looks like.

Yes, "taking the risk" was what I had more in mind, but essentially so.

In addition to the risk that you'll feel bad about yourself for causing someone else to suffer for your truth, there's a significant risk that they'll do a much worse job than you, and make it easier for the truth to be denied.

Good point! If one wants to privately discuss a taboo truth, should one equally emphasize both the "taboo" as well as the "truth" of the matter? On first thought, ethically I would say yes.