
When It's Not Right to be Rational

Post author: Annoyance 28 March 2009 04:15PM | 4 points

By now I expect most of us have acknowledged the importance of being rational.  But as vital as it is to know what principles generally work, it can be even more important to know the exceptions.

As a process of constant self-evaluation and self-modification, rationality is capable of adopting new techniques and methodologies even if we don't know how they work.  An 'irrational' action can be rational if we recognize that it works.  So in an ultimate sense, there are no exceptions to rationality's usefulness.

In a more proximate sense, though, does it have limits?  Are there ever times when it's better *not* to explicitly understand your reasons for acting, when it's better *not* to actively correlate and integrate all your knowledge?

I can think of one such case:  It's often better not to look down.

People who don't spend a lot of time living precariously at the edge of long drops don't develop methods of coping.  When they're unexpectedly forced to such heights, they often look down.  Looking down activates subcortical instincts that cause them to freeze and panic, overriding their conscious intentions.  This tends to prevent them from accomplishing whatever goals brought them to that location, and in situations where balance is required for safety, the panic instinct can even cause them to fall.

If you don't look down, you may know intellectually that you're above a great height, but at some level your emotions and instincts aren't fully triggered.  You don't *appreciate* the height on a subconscious level, and so while you may know you're in danger and be appropriately nervous, your conscious intentions aren't overridden.  You don't freeze.  You can keep your conscious understanding compartmentalized, not bringing to mind information you possess but don't wish to be aware of.

The general principle seems to be that it is useful to avoid fully integrated awareness of relevant data when acknowledging that data would undermine your ability to regulate your emotions and instincts.  If they run amok, your reason will be unseated.  Careful application of doublethink, and avoidance of emotionally charged facts that aren't strictly necessary for responding appropriately to the situation, is probably the best course of action.

If you expect to be dealing with heights in the future, you can train yourself not to fall into vertigo.  But if you don't have opportunities to train away your reactions, not looking down is the next best thing.

Comments (21)

Comment deleted 29 March 2009 05:59:51AM
Comment author: thomblake 02 April 2009 08:27:52PM 1 point

"Never. You show me a counterexample and I'll show you an irrational decision."

So what you're saying is either (1) that 'right' and 'rational' are analytically equivalent, or (2) that your belief is unfalsifiable.

(1) begs the question, and (2) shows that your belief is held irrationally.

Comment author: Lightwave 28 March 2009 08:43:58PM 6 points

I think you are using "rational" with two different meanings. If looking down will cause you to freeze and panic, then the rational thing is not to look down. If knowledge of the fact that you're taking sugar pills destroys the placebo effect, then the rational thing is not to know you're taking sugar pills (assuming you've exhausted all other options). It's either that, or directly hacking your brain.

A better way to describe this might be to call these phenomena "irrational feelings", "irrational reactions", etc. The difference is, they're all unintentional. So while you're always rational in your intentional actions, you can still be unintentionally affected by some irrational feelings or reactions. And you correct for those unintentional reactions (which supposedly you can't simply remove) by changing your intentional ones (i.e. you intentionally and rationally decide not to look down, because you know you will otherwise be affected by the "irrational reaction" of panicking).

Comment author: Annoyance 29 March 2009 12:31:02AM 0 points

Ah, but you can't choose not to know about the sugar pills. At most, you can choose not to investigate a therapy that seems to be working.

But in terms of developing and extending your powers of rationality, you can't embrace a delusion while at the same time working to be a better rationalist. You have to decide between the spurious benefits of a possible placebo, and being rational.

Since the placebo effect has mostly to do with how you feel about how you feel, it wouldn't be very important in any case.

Comment author: Matt_Simpson 29 March 2009 06:26:30AM 1 point

Let's just be clear: you are very near equivocating on 'rational'. There are two basic definitions, instrumental and epistemic, though it may be natural to add more for some purposes. Essentially what you are pointing out is that sometimes it's instrumentally rational to be epistemically irrational.

I don't see much of a problem with this. As rationalists, we primarily want to be instrumentally rational. Scratch that, it's the only thing we want (intrinsically). Being epistemically rational just happens to be the best way to achieve our ends in a large percentage of cases. It also may have a direct component in our utility function, but that's another issue.

Comment author: Annoyance 30 March 2009 03:56:38PM 0 points

There is another definition, one better than either of those two, not only because it is more useful but because it is generally used and recognized.

With sufficiently limited resources, it can be rational (in that sense) to be irrational.

Comment author: Matt_Simpson 30 March 2009 06:58:36PM 1 point

I think you forgot to mention what that definition is.

Comment author: Cyan 30 March 2009 07:17:00PM 0 points

Seriously, Annoyance, it wouldn't kill you to link to your own post. Sheesh.

Comment author: Annoyance 29 March 2009 05:48:32PM 0 points

"As rationalists, we primarily want to be instrumentally rational. Scratch that, it's the only thing we want (intrinsically)."

No. I'm not sure why you believe that our wants are outside the domain of rationality's influence, but they are not.

Comment author: timtyler 29 March 2009 05:57:47PM 2 points

Wants are outside the domain of instrumental rationality - by definition. In other words, instrumental rationality can be applied to any goal.

Comment author: Matt_Simpson 29 March 2009 10:47:09PM -1 points

The only thing we want is to get the things that we want in the most efficient way possible. In other words, to be instrumentally rational.

Comment author: Annoyance 30 March 2009 03:57:16PM 0 points

If what we want is to reach our wants without using the most efficient way possible, what method should we use?

Comment author: Matt_Simpson 30 March 2009 06:56:10PM 0 points

Efficiency, at least the way I'm using the term, is relative to our values. If we don't want to use the most efficient method possible to achieve something, then something about that method causes it to have a negative term in our utility function which is just large enough to make another alternative look better. So then it really isn't the most efficient alternative we have.

Comment author: Kaj_Sotala 28 March 2009 06:52:08PM 3 points

There's also the placebo effect, which can be useful at times.

Comment author: timtyler 28 March 2009 05:18:18PM 3 points

Self-deception has done us proud. Without it ... we might still be running naked through the forest. ... Self-deception was a splendid adaptation in a world populated by nomadic bands armed with sticks and stones (Smith, 2004) [...] admittedly, bias and self-deception do produce many personal and social benefits. For example, by over-estimating our ability we not only attract social allies, we also raise our self-esteem and happiness (Taylor, 1989), and motivate ourselves to excel (Kitcher, 1990) [...]

Comment author: Rune 28 March 2009 06:45:10PM 2 points

From Scott Aaronson's lecture notes:

"Or take another example: a singles bar. The ones who succeed are the ones best able to convince themselves (at least temporarily) of certain falsehoods: "I'm the hottest guy/girl here." This is a very clear case where irrationality seems to be rational in some sense."

Comment author: gjm 29 March 2009 05:59:32PM 3 points

Speaking of Scott Aaronson, his little fiction on more or less this topic is a delightful (and slightly disturbing) read.

Comment author: anonym 29 March 2009 01:30:14AM 2 points

That's a bad example.

In a singles bar, people don't respond to the beliefs of other people, but to their behavior. Success goes to those who behave appropriately, not to those who hold certain beliefs.

You might say that the best way to behave appropriately is to deceive yourself into thinking you're actually the hottest, but that is not what is going on in this case either. Offer the person a million dollars if they can correctly answer whether, on average, they were rated higher than anybody else by all members of the appropriate sex in the bar. Unless they actually are extremely attractive, and probably the most attractive person in the bar, their answer will be 'no'.

Comment author: SoullessAutomaton 29 March 2009 02:35:50AM 4 points

"You might say that the best way to behave appropriately is to deceive yourself into thinking you're actually the hottest"

I think a more emotionally neutral term for this technique would be something like "Method Acting".

Comment author: anonym 29 March 2009 06:36:19AM 2 points

"Method acting" is a very nice metaphor for what's actually going on: filling the mind with and identifying with a personality/character to the exclusion of normal thought processes, in order to more perfectly portray that other personality/character.

It's not self-deception, though, any more than a child engrossed in pretending to be a dog is engaged in self-deception.

Comment author: [deleted] 28 March 2009 05:08:55PM 0 points

The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the deadly light into the peace and safety of a new dark age.

Comment author: PhilGoetz 28 March 2009 07:27:36PM 3 points

From "The Call of Cthulhu".