Previously: Why a New Rationalization Sequence?

What Are Red Flags?

A red flag is a warning sign that you may have rationalized: something that is practical to observe and more likely to be present when you have rationalized than when you haven't.

Some of these are things likely to cause rationalization. Others are likely to be caused by it. One on this list is even based on a common cause. (I don't have any based on selection on a common effect, but in theory there could be some.)

How to Use Red Flags

Seeing a red flag doesn't necessarily mean that you have rationalized, but it's evidence. Likewise, just because you've rationalized doesn't mean your conclusion is wrong, only that it's not as supported as you thought.

So when one of these flags comes up, don't give up on ever discovering truth; don't stop-halt-catch-fire; and definitely don't invert your conclusion.

Just slow down. Take the hypothesis that you're rationalizing seriously and look for ways to test it. The rest of this sequence will offer tools for this purpose, but just paying attention is half the battle.

A lot of these things can be present to a greater or lesser degree, so you'll want to set thresholds. I'd guess an optimal setting has about one in three triggers turn out to be genuine rationalization: high enough that you keep doing your checks seriously, but low because the payoff matrix is quite asymmetrical.

Basically, use these as trigger-action plans. Trigger: anything on this list. Action: spend five seconds doing your agenty best to worry about rationalization.
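To make the asymmetry concrete, here's a minimal sketch. The five-second check cost and the one-in-three trigger rate come from the paragraphs above; the half-hour cost of a missed rationalization and the hundred triggers are purely illustrative assumptions, not measurements.

```python
# Minimal sketch of the asymmetric payoff. CHECK_COST and TRUE_TRIGGER_RATE
# come from the text above; MISS_COST and N_TRIGGERS are illustrative
# assumptions, not measured values.

CHECK_COST = 5             # seconds: "five seconds doing your agenty best"
MISS_COST = 30 * 60        # assumed: half an hour lost to acting on a rationalized belief
TRUE_TRIGGER_RATE = 1 / 3  # guessed fraction of triggers that are real rationalization
N_TRIGGERS = 100           # assumed number of red-flag events

# Policy 1: check every trigger. You pay the small cost every time, real or not.
always_check_cost = N_TRIGGERS * CHECK_COST

# Policy 2: never check. You pay the large cost for every real rationalization missed.
never_check_cost = N_TRIGGERS * TRUE_TRIGGER_RATE * MISS_COST

print(always_check_cost)  # 500 seconds
print(never_check_cost)   # 60000.0 seconds
```

Even if the miss cost here is wildly overestimated, the gap is large enough that tolerating mostly false alarms still comes out ahead.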

Conflict of Interest

This is a classic reason to distrust someone else's reasoning. If they have something to gain from you believing a conclusion, apart from that conclusion being true, you have reason to be suspicious. But what does it mean for you to gain from believing something, apart from it being true?

Not Such Great Liars

Probably the simplest reason is that you need to deceive someone else. If you're not a practiced liar, the easiest way to do that is to first deceive yourself.

Simple example: you're running late and need to give an estimate of when you'll arrive. If you say "ten minutes late" and arrive twenty minutes late, it looks like you hit another ten minutes' worth of bad luck, whereas saying "twenty minutes" looks like your fault. You're not good at straight-up lying, but if you can convince yourself you'll only be ten minutes late, all is well.

Unendorsed Values

Values aren't simple, and you aren't always in agreement with yourself. Let's illustrate this with examples:

Perhaps you believe that health and long life are more important than fleeting pleasures like ice cream, but there's a part of you that has a short time preference and knows ice cream is delicious. That part would love to convince the rest of you of a theory of nutrition that holds that ice cream is healthy.

Perhaps you believe that you should follow scientific results wherever the evidence leads you, but it seems to be leading someplace that a professor at Duke predicted a few months ago, and there's a part of you that hates Duke. If that part can convince the rest of you that the data is wrong, you won't have to admit that somebody at Duke was right.

Wishful Thinking

A classic cause of rationalization. Expecting good things feels better than expecting bad things, so you'll want to believe it will all come out all right.

Catastrophizing Thinking

The opposite of wishful thinking. I'm not sure what the psychological root is, but it seems common in our community.

Conflict of Ego

The conclusion is: therefore I am a good person. The virtues I am strong at are the most important, and those I am weak at are the least. The work I do is vital to upholding civilization. The actions I took were justified. See Foster & Misra (2013) on Cognitive Dissonance and Affect.

Variant: therefore we are good people. Where "we" can be any group membership the thinker feels strongly about. Note that the individual need not have been involved in the virtue, work or action to feel pressure to rationalize it.

This is particularly insidious when "we" is defined partly by a large set of beliefs, such as the Social Justice Community or the Libertarian Party. Then it is tempting to rationalize that every position "we" have ever taken was correct.

In my experience, the communal variant is more common than the individual one, but that may be an artifact of my social circles.

Reluctance to Test

If you have an opportunity to gain more evidence on the question and feel reluctant to take it, this is a bad sign. This one is illustrated by Harry and Draco discussing Hermione in HPMOR.

Suspicious Timing

Did you stop looking for alternatives as soon as you found this one?

Similarly, did you spend a lot longer looking for evidence on one side than the other?

Failure to Update

This was basically covered in Update Yourself Incrementally and One Argument Against An Army. Failing to update even when the weight of evidence points the other way is a recognizable pattern.

The Feeling of Doing It

For some people, rationalization has a distinct subjective experience that you can train yourself to recognize. Eliezer writes about it in Singlethink and later refers to it as "don't even start to rationalize".

If anyone has experience trying to develop this skill, please leave a comment.

Agreeing with Idiots

True, reversed stupidity is not intelligence. Nevertheless, if you find yourself arriving at the same conclusion as a large group of idiots, this is a suspicious observation that calls for an explanation. Possibilities include:

  • It's a coincidence: they got lucky. This can happen, but the more complex the conclusion, the less likely it is.
  • They're not all that idiotic. People with terrible overall epistemics can still have solid understanding within their comfort zones.
  • It's not really the same conclusion; it just sounds like it when both are summarized poorly.
  • You and they rationalized the same conclusion, driven by the same interest.

Naturally, it is this last possibility that concerns us. The less likely the first three, the more worrying the last one.

Disagreeing with Experts

If someone who is clearly established as an expert in the field (possibly by having notable achievements in it) disagrees with you, this is a bad sign. It's more a warning sign of bad logic in general than of rationalization in particular, but rationalization is a common cause of bad logic, and many of the same checks apply.


Next: Avoiding Rationalization

Comments

Moral licensing: behaviorally connecting two causally unrelated things as a justification for one of them.

Tu quoque: looking for ways others have failed to live up to an epistemic standard to excuse yourself. Related to kakonomics.

Becoming defensive and frustrated and retreating to vague language when asked for more specifics.


Becoming defensive and frustrated and retreating to vague language when asked for more specifics.

Subvariety: downvoting without replying.

Vague language (and lack of detail) period.

If anyone has experience trying to develop [the skill of noticing what it feels like to rationalize], please leave a comment.

I've developed this skill some. To me, it feels like part of my brain is "slipping sideways", tugging me harder than appropriate towards a particular line of thinking or conclusion. I think I'm reasonably good at noticing rationalization, but part of my brain still tries to rationalize even after I notice it. I want to get better at responding appropriately.

Wishful Thinking
A classic cause of rationalization. Expecting good things feels better than expecting bad things, so you'll want to believe it will all come out all right.
Catastrophizing Thinking
The opposite of wishful thinking. I'm not sure what the psychological root is, but it seems common in our community.

Together these may be black-and-white thinking.

If anyone has experience trying to develop this skill, please leave a comment.

Imagine two worlds: one where you come to conclusion A, one where you come to conclusion B. Do you have a strong reaction?

Nevertheless, if you find yourself arriving at the same conclusion as a large group of idiots,

Examples? (Aside from 'the same conclusion as a group because you like the group'.)

True, reversed stupidity is not intelligence. Nevertheless, if you find yourself arriving at the same conclusion as a large group of idiots, this is a suspicious observation that calls for an explanation. Possibilities include:

There's an implicit assumption here that may be worth making explicit: that most of the world has not reached the same conclusion. It's not suspicious unless the large group of idiots disagrees with the majority opinion.