
People who "don't rationalize"? [Help Rationality Group figure it out]

12 Post author: Mercurial 02 March 2012 11:38PM

Anna Salamon and I are confused. Both of us notice ourselves rationalizing on pretty much a daily basis and have to apply techniques like the Litany of Tarski pretty regularly. But in several of our test sessions for teaching rationality, a handful of people report never rationalizing and seem to have little clue what Tarski is for. They don't relate to any examples we give, whether fictitious or actual personal examples from our lives. Some of these people show signs of being rather high-level rationalists overall, although some don't.

So, Less Wrong, we're asking for your input on this one. What do you think is going on?

Comments (85)

Comment author: shokwave 03 March 2012 11:53:02PM 16 points [-]

I discovered that one of my friends has something similar - perhaps the same thing - going on in her brain, such that she doesn't rationalise. What we managed to sort out, sort of, was that anything was a justification for her: so when she doesn't eat cookies because it would make her gain weight, and also when she doesn't like Brad Pitt "because he's ugly", and also when she doesn't like a book series because it's chauvinistic, and also when she "doesn't like babies", but her friend's baby "is an exception because it's [friend]'s", these all feel like the same thing to her; she can't or won't tell the difference between what I see as a strong reason or a weak reason or a made-up flimsy reason.

A wild theory appears! In probably the deepest moment of introspection for her in that discussion, she said she thinks she might be like this because it gives her 100% confidence in whatever she's doing. Thinking on that, I'm put in mind of the "70% blue, 30% red balls in the urn" game, where some human guessers approximate a 7:3 ratio of blue/red guesses, whereas the best strategy is to guess blue all the time. There might be two kinds of people in this sense: "modellers", who try to accurately mirror reality as much as possible in order to have good predictive skills, and "one-guessers", who commit to the best pure strategy in order to gain the most reward.

Under this wild theory, the one-guessers would have no reason or need to distinguish between the strength of justifications; they'd simply change their behaviour when a better strategy is offered.
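The arithmetic behind the urn game is easy to check with a short simulation (a sketch, not from the original comment, assuming independent draws from a 70/30 urn):

```python
import random

random.seed(0)
N = 100_000
draws = ['blue' if random.random() < 0.7 else 'red' for _ in range(N)]

# "One-guesser": always guess the majority color.
pure = sum(d == 'blue' for d in draws) / N

# "Modeller" gone wrong, i.e. probability matching:
# guess blue 70% of the time and red 30% of the time.
matched = sum(d == ('blue' if random.random() < 0.7 else 'red')
              for d in draws) / N

print(f"always-blue accuracy:       {pure:.3f}")     # ~0.70
print(f"probability-match accuracy: {matched:.3f}")  # ~0.58
```

Probability matching is right with probability 0.7 × 0.7 + 0.3 × 0.3 = 0.58, while the pure always-blue strategy is right 70% of the time, which is why the one-guesser collects more reward.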

Comment author: [deleted] 13 June 2014 06:58:54AM *  -1 points [-]

"The Limbic system area is the center that is in charge of the immediate reactions in the human brain and is located above the Brainstem. It receives the information from our sense organs even before our thinking brain - Neocortex has processed it, in order to operate emotional instincts that allow an immediate reaction such as self defense. The rational thought and more complex emotional processes are completed only after a few seconds.

After the rational processing, one of the two following will occur:

The person will rationalize his/her immediate emotions and thus justify his/her basic emotional assumption provided by the subconscious in the first milliseconds.
A second event occurs. It is also charged with emotion but has the opposite effect, forcing the person to change his/her basic primary emotional response."
Comment author: pjeby 05 March 2012 05:11:50AM *  12 points [-]

Since rationalizations are usually employed to repair cognitive dissonance, and cognitive dissonance is strongest when image preservation is necessary, one hypothesis would simply be that these people have self-images that don't need much preserving.

Possible test: do these people have an unusually high tolerance for situations and/or self-disclosures that most people would find shameful, humiliating or embarrassing? This might explain a lack of need to rationalize, regardless of the reason for the high tolerance. (For example, we should expect sociopaths to not see any need to rationalize their actions to themselves.)

(Edit to add: not to imply that any of the people in your experiences are sociopaths; just noting that it's another situation where somebody would have a low need for self-image preserving rationalizations.)

Comment author: steven0461 03 March 2012 12:25:17AM *  12 points [-]

It seems introspectively plausible that when you ask people questions like "how do you know when you're rationalizing", they feel like they've been asked a "when did you stop beating your wife" question, and feel initially tempted to react with an "oh yeah, well maybe I don't" regardless of whether it's true.

Comment author: Mercurial 03 March 2012 06:46:07AM 5 points [-]

That's a good hypothesis. Unfortunately this doesn't come from asking people, "How do you know when you're rationalizing?" or any variant thereof. The original problem arose when we could not for the life of us convey to some individuals why the Litany of Tarski might be useful. We gave examples from our own lives and watched these individuals just blink and say, "Huh. Yeah, I guess I just don't relate to that at all."

Comment author: Bruno_Coelho 03 March 2012 01:30:05PM 0 points [-]

The questions are similar, but the first is not a trap.

Comment author: AnnaSalamon 03 March 2012 08:30:24AM 9 points [-]

In response to the folk suggesting that our questions were just unclear, etc.:

I notice rationalization all the time too (in myself and in others); but there totally seem to be people who don't ever notice it in themselves. Lots of them. Including both folks who seem never to have trained in rationality-type-stuff at all, and folks who have. I ignored my first counter-example, and my second, but not my third and fourth; especially after the fourth counter-example kindly allowed us to cross-examine them for some hours, to go try accosting strangers with weird questions and see if they noticed themselves rationalizing while approaching said strangers, etc.

Mercurial, and Eliezer, both suggested an analogy to the "thinking in words" vs "thinking in images" thing; some do one and others do another, and many tend to assume that everyone must experience life that way. We all updated toward thinking that there is some actual thing going on here -- something we were initially not modeling.

But, I'm still confused about:

  1. Whether we're just still missing something obvious anyhow. Maybe our fourth counter-example, who consented to answering gobs of questions and trying experiments for us, was a fluke? (Try asking people yourself, please; don't just say that it must be experimental error because you don't work that way.)
  2. Whether they don't rationalize, or just don't notice themselves rationalizing. (Fourth datapoint seemed to maybe actually never make up reasons for choices; don't have data on the others really).
  3. What exactly the boundaries are on "rationalizing" -- what exactly it is, that a sizable portion of the folks we've talked to never notices themselves doing.
Comment author: Morendil 03 March 2012 12:11:48PM 8 points [-]

there totally seem to be people who don't ever notice it in themselves

Count me in that group ("hardly ever", maybe).

I'm pretty sure that I do rationalize, but I can't recall any explicit occasions of catching myself in the act.

I'm pretty sure that I have abandoned beliefs in the past that I clung to for longer than I should have, but it's hard for me to come up with an example right now.

Perhaps we differ in the explicitness of the meta-cognition we engage in. When confronted with incontrovertible evidence of my errors, I tend to facepalm, think something like "stupid me", update and move on. I don't generally attempt to classify the mistake into a particular fallacy.

Can you share some of the examples you've been using to illustrate rationalization? I'll tell you if I get the same "can't relate to this", or if I can relate but failed to label the equivalent examples in my own past as rationalizations.

Comment author: Kaj_Sotala 24 April 2012 10:13:57AM 6 points [-]

Another example, from The Righteous Mind:

On February 3, 2007, shortly before lunch, I discovered that I was a chronic liar. I was at home, writing a review article on moral psychology, when my wife, Jayne, walked by my desk. In passing, she asked me not to leave dirty dishes on the counter where she prepared our baby’s food. Her request was polite but its tone added a postscript: “As I have asked you a hundred times before.”

My mouth started moving before hers had stopped. Words came out. Those words linked themselves up to say something about the baby having woken up at the same time that our elderly dog barked to ask for a walk and I’m sorry but I just put my breakfast dishes down wherever I could. In my family, caring for a hungry baby and an incontinent dog is a surefire excuse, so I was acquitted. [...]

So there I was at my desk, writing about how people automatically fabricate justifications of their gut feelings, when suddenly I realized that I had just done the same thing with my wife. I disliked being criticized, and I had felt a flash of negativity by the time Jayne had gotten to her third word (“Can you not …”). Even before I knew why she was criticizing me, I knew I disagreed with her (because intuitions come first). The instant I knew the content of the criticism (“… leave dirty dishes on the …”), my inner lawyer went to work searching for an excuse (strategic reasoning second). It’s true that I had eaten breakfast, given Max his first bottle, and let Andy out for his first walk, but these events had all happened at separate times. Only when my wife criticized me did I merge them into a composite image of a harried father with too few hands, and I created this fabrication by the time she had completed her one-sentence criticism (“… counter where I make baby food?”). I then lied so quickly and convincingly that my wife and I both believed me.

Comment author: Kaj_Sotala 24 April 2012 09:01:12AM *  1 point [-]

I don't know whether Anna used this as an illustration, but one way by which I tend to notice myself rationalizing is when I'm debating something with somebody. If they successfully attack my position, I might suddenly realize that I'm starting to defend myself with arguments that even I consider bad or even outright fallacious, and that I've generally gone from trying to discover the truth to trying to defend my original position, no matter what its truth value.

Another example is that I might decide to do or believe something, feel reluctant to explain my reasons to others because they wouldn't hold up to outside scrutiny, and then realize that wait, if my reasons wouldn't hold up to outside scrutiny they shouldn't hold up to inside scrutiny either.

Do you experience either of those?

Comment author: Hermione 07 March 2012 03:00:12PM 6 points [-]

So, I asked some people as you suggested, but I didn't find anything as interesting as you did. Over the last few days I've asked 10 people if they "rationalise", giving them just one example, and all of them have immediately understood and spontaneously come up with valid examples of themselves doing so.

Incidentally, I quite often catch myself rationalising, but I really doubt accosting strangers with odd questions would trigger that in me. I'm not sure what else to suggest. Perhaps asking them when they last felt guilty? From the examples the people I mentioned above came up with, guilt seems to be a very strong trigger of rationalisation. An example: "I forgot to call my Mum on her birthday but I told myself she was really busy with the rest of the family".

Comment author: Viliam_Bur 08 March 2012 02:00:13PM *  2 points [-]

From the examples the people I mentioned above came up with, guilt seems to be a very strong trigger of rationalisation.

Perhaps rationalization is an adaptation that develops when people risk some kind of punishment for their irrationality.

We are irrational, and we already suffer the consequences of our irrationality. But if there is an additional penalty for admitting irrationality, it gives us an incentive to pretend that the irrational decision was in fact rational; to lie to others, and ultimately to lie to ourselves. Admitting irrationality can be very bad signalling.

How exactly does guilt become a part of the equation? Probably by believing that there is no such thing as irrationality, and that people are always perfectly following their utility function. So if you forgot to do something, it means you decided not to do it, because it gives you negative utility. So whenever your irrationality harms people around you, it means you hate them. (If your irrationality sometimes harms you, this can be explained away by saying that you didn't really care about something, only pretended to.) From the outside view, our irrationality is not credible -- it may be just a public act, while we are following our true preferences (defined circularly as "that which we are following", plus some possible secrets).

Comment author: AnnaSalamon 07 March 2012 05:32:16PM *  2 points [-]

Much thanks for collecting this data. What example did you use? And what sorts of examples did you get back?

Comment author: Hermione 08 March 2012 11:50:56AM 3 points [-]

"I'm trying to give up chocolate. Last weekend I saw a delicious cake and I found myself telling myself the only reason I wanted it was to boost my energy levels, hahaha you know the feeling, right?" If they didn't immediately chime in with examples I'd prompt them with "and you know, its not just food, I rationalise all the time" and ask them if they do as well.

More than half of them immediately came up with their own diet-related rationalisations. Of the other 4 I had the "calling my mum" one above, a couple of people who said they often caught themselves coming up with reasons for why they weren't doing their work, and one "the dog wouldn't like to be taken for a walk in this cold weather".

The reason I mentioned guilt is that a few of them (I didn't count) explicitly used the word "guilty" (like, I'm too tired to work, so I don't have to feel guilty that I'm out drinking) and one person talked about trying to make himself feel better.

Comment author: AnnaSalamon 09 March 2012 08:39:56PM 0 points [-]

And, just to check, did you make sure that all the diet-related examples you got were examples of making false excuses to oneself, and not just examples of e.g. previously intending to diet, but then changing one's mind when one saw the chocolate cake?

Comment author: Hermione 12 March 2012 10:18:33PM 0 points [-]

yep, they were all valid examples of rationalisation

Comment author: saturn 03 March 2012 10:09:00AM 6 points [-]

After reading the comments here I think I might be a person who doesn't rationalize, or my tendency to do so is well below the norm. I previously thought the Litany of Tarski was about defeating ugh fields; I do experience those. I'm willing to answer questions about it, if that would help.

Comment author: AnnaSalamon 05 March 2012 08:43:23PM *  2 points [-]

Thanks! Could you tell me about ugh fields you've experienced, and about any instances of selective search, fake justification, etc. that you can call to mind?

Also, what modality do you usually think in -- words, images, ... ?

Also, what do you do when you e.g. desire a cookie, but have previously decided to reduce cookie-consumption?

Comment author: saturn 05 March 2012 11:54:25PM 3 points [-]

Could you tell me about ugh fields you've experienced, and about any instances of selective search, fake justification, etc. that you can call to mind?

If a thought with unpleasant implications comes up, I'm tempted to quickly switch to a completely different, more pleasant topic. Usually this happens in the context of putting off some unpleasant task, but I could imagine it also happening if I believed in God or had some other highly cherished belief. I can't think of any beliefs that I actually feel that strongly about, though.

I do sometimes come up with plausible excuses or fake justifications if I'm doing something that someone might disapprove of, in case they confront me about it. I don't remember ever doing that for my own benefit. I can't remember doing a selective search either, but of course it's possible I do it without being aware of it.

I just thought of another thing that might be relevant - I find moralizing less appealing than seems to be typical.

Also, what modality do you usually think in -- words, images, ... ?

I'm not sure how to describe it. Sort of like wordless analogies or relations between concepts, usually combined with some words and images. But also sometimes words or images by themselves.

Also, what do you do when you e.g. desire a cookie, but have previously decided to reduce cookie-consumption?

Distract myself by focusing on something else. If my thoughts keep coming back to eating cookies, I might try imagining something disgusting like eating a cookie with maggots in it.

Comment author: amcknight 07 March 2012 01:08:24AM 1 point [-]

Use it or lose it? Speculation:
Keeping your previous beliefs consistent by rationalizing and distracting yourself are both ways to avoid changing your mind or to avoid thinking about unpleasant things. Maybe most people start with both strategies and the one they have success with develops more than the other. If you are really successful at distracting yourself, maybe rationalization skills never really develop to their full irrational potential.

Comment author: Will_Newsome 03 March 2012 09:50:42AM *  5 points [-]

[temporarily deleting]

Comment author: Steve_Rayhawk 04 March 2012 09:49:51AM *  3 points [-]

I wish there was a more standard term for this than "kinesthetic thinking", that other people would be able to look up and understand what was meant.

(A related term is "motor cognition", but that doesn't denote a thinking style. Motor cognition is a theoretical paradigm in cognitive psychology, according to which most cognition is a kind of higher-order motor control/planning activity, connected in a continuous hierarchy with conventional concrete motor control and based on the same method of neural implementation. (See also: precuneus (reflective cognition?); compare perceptual control theory.) Another problem with the term "motor cognition" is that it doesn't convey the important nuance of "higher-order motor planning except without necessarily any concurrent processing of any represented concrete motions". (And the other would-be closest option, "kinesthetic learning", actively denotes the opposite.)

Plausibly, people could be trained to introspectively attend to the aspect of cognition which was like motor planning with a combination of TCMS, to inhibit visual and auditory imagery, and cognitive tasks which involved salient constraints and tradeoffs. Maybe the cognitive tasks would also need to have specific positive or negative consequences for apparent execution of recognizable scripts of sequential actions typical of normally learned plans for the task. Some natural tasks, which are not intrinsically verbal or visual, with some of these features would be social reasoning, mathematical proof planning, or software engineering.)

when I am thinking kinesthetically I basically never rationalize as such

I think kinesthetic thinking still has things like rationalization. For example, you might have to commit to regarding a certain planned action a certain way as part of a complex motivational gambit, with the side effect that you commit to pretend that the action will have some other expected value than the one you would normally assign. If this ability to make commitments that affect perceived expected value can be used well, then by default this ability is probably also being used badly.

Could you give more details about the things like rationalization that you were thinking of, and what it feels like deciding not to do them in kinesthetic thinking?

Comment author: Eugine_Nier 03 March 2012 10:15:17PM *  1 point [-]

Unfortunately most people don't have particularly good introspection about their primary thinking style so it might be slightly tricky for you to look for interesting correlations here.

Aren't there tests for the verbal/visual thinking distinction?

Comment author: John_Maxwell_IV 04 March 2012 02:26:34AM *  3 points [-]

You could try videotaping them in an argument and then going over the videotape looking for rationalizations. This could deal with varying definitions of rationalize. For best results, make the argument about something that people frequently rationalize. Maybe present them with some ambiguous data that might or might not support their political beliefs (several neutral observers say it didn't affect them either way, since it was so ambiguous), and see if it makes them more certain that their political beliefs are true (as you'd expect in a cognitively normal human).

I'm assuming you're using "rationalization" as a synonym for "motivated cognition".

Comment author: Vladimir_Nesov 03 March 2012 10:28:46AM *  1 point [-]

Perhaps something related to social ineptness or perceived social status, on the hypothesis that rationalization originates as a social psychological drive? I have a few broken social modes; for example, there is no emotional drive to avoid pointing out errors or embarrassing facts to people, so I need to consciously stop myself if that's called for.

Comment author: steven0461 03 March 2012 12:50:32AM 8 points [-]

Some of these people show signs of being rather high-level rationalists overall, although some don't.

I wouldn't necessarily expect there to be a super-strong connection between not rationalizing and being a "high-level rationalist". There are other ways to go systematically wrong than through goal-directed rationalization. As a possibly overlapping point, your concept of "high-level rationalist" probably sneaks in things like intelligence and knowledge that aren't strictly rationality.

Comment author: John_Maxwell_IV 04 March 2012 02:55:13AM 4 points [-]

Eliezer's "formidability" seems even worse, with its implications of high status.

Comment author: Mercurial 03 March 2012 06:56:27AM 1 point [-]

Good points.

I'm not trying to sneak in connotations, by the way. We're just talking about the fact that these people seem to be quite good at things like goal-factoring, VOI calculations, etc.

Comment author: steven0461 03 March 2012 11:15:51PM 2 points [-]

I didn't mean to say the sneaking was intentional. VOI calculations seem like they would correlate more with intelligence than rationality as such. I can't find any reference to goal-factoring; what do you mean by that?

Comment author: malcolmocean 17 March 2013 10:40:57AM 0 points [-]

I know this is much later, but for future readers I thought I'd chime in that goal-factoring is a process of breaking your goals down into subgoals and so on. At each level, you ask yourself "what am I trying to achieve with this?" and then ask yourself if there might be a better/cheaper/more efficient way to do so.

This is very closely related to the idea of purchasing goods separately.

[I attended January 2013 CFAR workshop and volunteered at the March one.]

Comment author: Dmytry 03 March 2012 09:08:35AM *  7 points [-]

For anosognosia to be a common failure mode, and for split-brain patients to confabulate the way they do, rationalization has to be a common mode of thought. Perhaps there's a module in the brain that prepares speech but has very little impact on beliefs and decisions. I ask you why your hand is not scratching your back, and it says "because my back is not itching"; that happens to be correct, but it would say the same if the arm were paralysed and the back were itching and the module wasn't 'told' (by the part doing the actual thinking) that the arm was paralysed.

When you say you aren't rationalizing, perhaps that module still works by rationalization, it just happens to be quite plausible. Maybe that's how construction of sentences works in the first place when talking about nearly anything.

Comment author: Andy_McKenzie 03 March 2012 10:35:42PM 2 points [-]

Upvoted for mentioning split brain patients. It made me think of a test of the question "does everyone rationalize?" See whether split brain patients vary greatly in the confidence that they assign to their rationalizations of their behavior (e.g., ask someone who has worked with them, or read their writings). All I remember reading of is patients who assign very high confidence to these rationalizations, but that could just be publication bias. If there is large variance, and especially if some patients don't do it much at all (i.e., when their nonverbal hemisphere is cued to do something, their verbal hemisphere says "I don't really know why I did that action"), then that is a sign that some people really don't rationalize much.

Comment author: Dmytry 04 March 2012 07:08:22AM *  1 point [-]

Could be, but they have a lot of time for the other hemisphere to give up explaining, and they have a whole hemisphere there, some of which may not be rationalizing and may prod the rationalizing module with a clue that the right side is now independent.

What I'm thinking of is the possibility that speech works by a rationalization-style process. If you consider speech from a decision theory perspective - speech is just another instance of making moves that affect the future - in principle it is about as relevant to internal decision making as your spinal cord's calculations of the muscle activation potentials. The spinal cord doesn't care about the reason why you move your hand. The speech module doesn't need to care about the reasons you do something; it may have them available, but its task is not to relay those reasons, but to produce whatever sequence of chirps works best in the given circumstances - to fit a sequence of chirps to the circumstances.

Furthermore, it is the case that abstract verbal reasoning is fairly ineffective as a decision making tool. In reality, you are not dealing with facts; you aren't even dealing with probabilities; you are dealing with probability distributions over a multidimensional space. For very difficult problems with limited computational power, an explicit approach can easily be beaten by heuristics in nearly all circumstances (and, the way humans reason explicitly, maybe in all circumstances where pencil and paper are not available to aid the explicit reasoning). It would make sense if explicit abstract reasoning were used primarily for speech construction but not for decision making itself.

Note: we shouldn't mix up rationalization with grossly invalid reasoning. Rationalization doesn't restrict itself to invalid reasoning.

Comment author: David_Gerard 03 March 2012 10:22:44AM *  0 points [-]

That's something like how it feels to me, yes.

(Edit: the little voice that comes up with justifications for things, I don't mean I've a split brain!)

Comment author: Daniel_Burfoot 03 March 2012 07:51:28PM 5 points [-]

I feel that I avoid most rationalizing simply by being very comfortable, perhaps too comfortable, with the possibility that my beliefs may be wrong and my decisions suboptimal. Is the design I came up with for my work project any good? Maybe not. Have I made the right career choices? Probably not. Am I living in the right city? Maybe, maybe not. Is my current life plan going to lead me to happiness? Doubtful.

Comment author: Grognor 02 March 2012 11:55:28PM *  5 points [-]

In a word: compartmentalization.

Since that's not helpful, I will say that it doesn't even seem to be possible for there to be people who don't rationalize. (Or enough that you're at all likely to find them.)

Some of these people show signs of being rather

"Some", "signs", "rather". These words all show signs of being rather belief in belief. I notice you don't say, "Some of these people are high-level rationalists," just that they show warning signs of being so. What does this really mean? Are you referring to the "visible aura of competence" Eliezer talked about in his 'the level above mine' sequence on people who are aspiring rationalists? If so, I'd wager this carries very little information, since you're sampling from aspiring rationalists!

Also, could you explain what you mean by "seem to have little clue what Tarski is for"?

Comment author: Mercurial 03 March 2012 06:54:46AM 7 points [-]

I will say that it doesn't even seem to be possible for there to be people who don't rationalize. (Or enough that you're at all likely to find them.)

You'd think not. Yet even Eliezer seems to think that one of our case studies really, truly might not ever rationalize and possibly never has before. This seems to be a case of a beautiful, sane theory beaten to death by a small gang of brutal facts.

"Some", "signs", "rather". These words all show signs of being rather belief in belief. I notice you don't say, "Some of these people are high-level rationalists," just that they show warning signs of being so. What does this really mean?

It means that I don't know how to measure how strong someone's rationality skills are other than talking to others whom I intuitively want to say are good rationalists and comparing notes. So I'm hedging my assertions. But to whatever degree several people at the Singularity Institute are able to figure out who is or is not a reasonably good rationalist, some of our sample "non-rationalizers" appear to us to be good rationalists, and some appear not to be so.

Also, could you explain what you mean by "seem to have little clue what Tarski is for"?

Sure. We tell them the kinds of situations in which Tarski is useful, including some personal examples of our own applications of it, and they just blink at us and completely fail to relate. For instance, I might say, "So once I was walking past a pizza place and smelled pizza. Cheese turns out to be really bad for me, but at the time I was hungry. So I watched my mind construct arguments like, 'I haven't gotten much calcium for the last while.'" Nothing of this sort - fake justification, selective search, nothing - seems to connect to something they can relate to. So they just don't see where they'd ever use Tarski.

And yes, we've had at least one person be openly skeptical that anyone could possibly find Tarski useful because he didn't think anyone rationalized the way we were describing. And another of our case studies seemed to know rationalization only as a joke. ("The cake has fewer calories and doesn't count if I eat it while standing, right?")

Comment author: gwern 03 March 2012 08:28:51AM 15 points [-]

Have you actually tested them for rationalizing? My own belief is that you're more likely to run into someone who rationalizes so much they are blind to their own rationalizing (and so can't recall any) than someone who is inhumanly honest.

(Tests in this case would include checking for hindsight bias, which is classic rationalizing, and having them do that test on YourMorals (whose name I forget) where you're given two summaries of studies for and against gun control and asked to criticize them - usually showing imbalance towards your favored side. But you're a LWer, I'm sure you can think of other tests.)

Comment author: Grognor 03 March 2012 08:35:37AM *  2 points [-]

This is VERY interesting. I'm as baffled as you are, sorry to say.

It seems like you've described rationalizations that prevent true (or 'maximally accurate') beliefs. Have you tried asking these case studies their rationales for decision-making? One theme of my rationalization factory is spitting out true but misleading reasons for doing things, rarely allowing me to reason out doing what I know - somehow - that I should. Said factory operates by preventing me from thinking certain thoughts. Perhaps this goes on in these people?

I've performed one heck of an update thanks to your comment and realizing that I was generalizing from only a few examples.

Comment author: Dentin 04 March 2012 01:45:36AM 1 point [-]

I'm pretty sure I'm one of these unusual people. When I first read the litanies, I understood why they might be useful to some people (I have a lot of experience with religious fanatics), but I truly did not understand why they would be so important to Eliezer or other rationalists. I always figured they were meant to be a simple teaching tool, to help get across critical concepts and then to be discarded.

Gradually I came to realize that a large percentage of the community use the various litanies on a regular basis. This still confuses me in some cases - for example, it would never even occur to me that evidence/data could simply be ignored or that any rationalization could ever trump it.

I suspect this inability to simply ignore inconvenient data is the reason for my low rate of rationalization. I do actually catch myself beginning to rationalize from time to time, but there's always the undercurrent of "wishful thinking isn't real". No matter how hard I rationalize, I cannot make the evidence go away, so the rationalization process gives up quickly.

I have been like this for most of my life, and have memories of the "wishful thinking isn't real" effect going all the way back to my early memories of childish daydreaming and complex storytelling.

Comment author: Eugine_Nier 04 March 2012 03:03:46AM 3 points [-]

I suspect this inability to simply ignore inconvenient data is the reason for my low rate of rationalization.

This seems wrong; rationalizing is what you do to inconvenient data instead of ignoring it.

Comment author: torekp 04 March 2012 08:57:37PM 2 points [-]

Speaking for myself, I think that rationalizing does typically (always?) involve ignoring something. Not ignoring the first piece of inconvenient data, necessarily, but the horrible inelegance of my ad-hoc auxiliary hypotheses, or such.

Comment author: NancyLebovitz 03 March 2012 11:24:22AM 0 points [-]

Another direction for measuring rationality might be how well people maintain their usual level under stress-- this is something which would be harder to find out in conversation.

Comment author: TimS 03 March 2012 12:58:24AM *  2 points [-]

Also, could you explain what you mean by "seem to have little clue what Tarski is for"?

Yeah. Even if one thinks that one never rationalizes, looking at political pundits is pretty strong evidence that some people rationalize a lot.

Comment author: JanetK 03 March 2012 10:53:57AM 7 points [-]

I have a different way to look at this question. (1) Introspection is bunk. (2) If someone asks us, or we ask ourselves, why we did something, the answer is a guess, because we have no conscious access to the actual causes of our thoughts and actions. (3) We vary in how good we are at guessing and in how honestly we judge ourselves, and so some people appear to be clearly rationalizing and others appear less so. (4) Most people are not aware that introspection is not direct knowledge but guesswork, and so they do not recognize their guesses as guesses, but may notice their self-deceptions as deceptions. (5) We do not need to know the reasons for our actions unless we judge them as very bad and to be avoided, or very good and to be encouraged. (6) The appropriate thing in this case is not to ask ourselves why, but to ask ourselves how to change the likelihood of a repeat, up or down. Although we have only guesses about past actions, we can arrange to have some control over future ones. (7) The more we know about ourselves, others, our situations, science, and so on, the better we can answer the how questions.

Comment author: Vladimir_Nesov 03 March 2012 12:25:22AM *  7 points [-]

I don't think I rationalize to any significant extent. Even the examples I came up with for Anna's thread concern inefficient allocation of attention and using zero-information arguments, not something specifically directed to defense of a position. I admit being wrong or confused on simple things, sometimes incorrectly (so that I have to go back to embrace a momentarily-rejected position). It's possible I'm completely incapable of noticing rationalization and would need a new basic skill to fix that, but doesn't seem very likely.

(Alternatively, perhaps "rationalization" needs to be unpacked a bit, so that problems like those in the examples I referred to above can find a place in that notion. As it is, they seem more like flaws in understanding, unbiased with respect to a favored conclusion, unless that conclusion is to be selected in hindsight.)

Comment author: shminux 03 March 2012 04:36:48AM 19 points [-]

Any volunteers to go through Vladimir_Nesov's comments on LW and point out his rationalizations to him?

Comment author: Mercurial 03 March 2012 06:59:30AM 10 points [-]

That could actually be quite helpful. No offense to Vladimir; we're just sincerely curious about this phenomenon, and if he's really a case of someone who doesn't relate to Tarski or rationalization, then it'd be helpful to have good evidence one way or the other about whether he rationalizes.

Comment author: Mercurial 03 March 2012 06:58:02AM 4 points [-]

That's helpful. Thank you.

And yes, I agree, the term "rationalization" is a bit loaded. We already checked by tabooing the word in exploring with at least one case, so it's not just that these people freeze at the word "rationalization." But it's quite possible that there are multiple things going on here that only seem similar at first glance.

Comment author: Grognor 16 March 2012 03:06:56AM *  1 point [-]

What about this? Do you not count this because you were sleepy at the time, because it was a minor incident, or what?

(Also, I did not go through your comments to find that. Just thought I'd point that out because of shminux's comment.)

Comment author: Vladimir_Nesov 16 March 2012 09:46:10AM 2 points [-]

I don't remember the experience, but it sounds like a collection of absent-minded system 1 responses that build on each other, there doesn't appear to be a preferred direction to them. This is also the characterization from the comment itself:

My mind confused this single thing for the light turning off, and then produced a whole sequence of complex thoughts around this single confusion, all the way relying on this fact being true.

As I understand, "rationalization" refers to something like optimization of thoughts in the direction of a preferred conclusion, not to any kind of thinking under a misconception. If I believe something wrong, of course I'll be building on the wrong thing and making further wrong conclusions, until I notice that it's wrong.

Comment author: wedrifid 16 March 2012 03:23:32AM 0 points [-]

I don't think I rationalize to any significant extent.

I recall you (doing what can most plausibly be described as) rationalizing at times. But perhaps you are right about the 'unpacking' thing. I might be thinking of things entirely different to those that Anna mentioned.

Comment author: Vladimir_Nesov 16 March 2012 09:32:54AM 3 points [-]

I'd be grateful for specific examples.

Comment author: Eugine_Nier 03 March 2012 10:43:04PM 3 points [-]

After reading the comments I noticed that I had at least two distinct mental processes that I'd been labeling "rationalization".

Process 1: Say I'm late for a meeting, I have noticed that in thinking about saying "Sorry, I'm late" I immediately want to add an explanation for why this isn't my fault.

Process 2: Someone presents an argument for a conclusion I disagree with, and I immediately start looking for flaws in it/reasons to dismiss it. As I observed here, this is not necessarily even a fallacy.

Comment author: Shephard 03 March 2012 04:50:19AM *  3 points [-]

I tend to agree that anyone who denies the tendency to rationalize is either in denial or has a different definition for the word "rationalize". In fact I would argue that rationalization is the default for human beings, and that anything else requires either focused effort or serious mental re-programming (which is still probably only partially effective).

One possible way to try to elicit an understanding of any given individual's capacity for rationalization is to ask them about the last time they did something they knew was a bad idea (perhaps a compromise they felt uncomfortable making, or an indulgence they knew they were going to regret), and then to ask them what excuses went through their brains to justify it. If someone still denies ever having had such an experience then they are beyond redemption.

Comment author: Mercurial 03 March 2012 07:04:27AM 10 points [-]

I tend to agree that anyone who denies the tendency to rationalize is either in denial or has a different definition for the word "rationalize". In fact I would argue that rationalization is the default for human beings, and that anything else requires either focused effort or serious mental re-programming (which is still probably only partially effective).

I absolutely relate. I totally would have said that a week ago. Evidence has smashed my belief's face quite solidly in the nose, though.

One possible way to try to elicit an understanding of any given individual's capacity for rationalization is to ask them about the last time they did something they knew was a bad idea (perhaps a compromise they felt uncomfortable making, or an indulgence they knew they were going to regret), and then to ask them what excuses went through their brains to justify it. If someone still denies ever having had such an experience then they are beyond redemption.

That's a good idea, and we did it several times. They sincerely do deny having such experience, but not in a knee-jerk way. It's more like a, "Huh. Hmm. Um... Well, I honestly can't think of something quite like that, but maybe X is similar?" And "X" in this case is something like, "I knew eating a cookie wasn't good for me, but I felt like it and so I did it anyway." It's like the need for justification is just missing, at least in their self-reports.

Comment author: Kaj_Sotala 24 April 2012 10:08:40AM 6 points [-]

This reminds me of a bit in The Righteous Mind, where Haidt discusses some of his experiments about moral reasoning. When he asked his university students questions like "is it right or wrong for a man to buy a (dead) chicken from a store and then have sex with it before eating it", the students had no problem providing a long list of various justifications pro or con, and generally ending up with an answer like “It’s perverted, but if it’s done in private, it’s his right”. In contrast, when Haidt went to a local McDonalds to ask working-class people the same questions, he tended to get odd looks when he asked them to explain why they thought that the chicken scenario was wrong.

Haidt puts this down to the working-class people having an additional set of moral intuitions, ones where e.g. acts violating someone's purity are considered just as self-evidently bad as acts causing somebody needless pain, and therefore denouncing them as wrong needs no explanation. But I wonder if there's also a component of providing explicit reasons for your actions or moral judgements being to some extent a cultural thing. If there are people who are never asked to provide justifications for their actions, then providing justifications never becomes a part of even their internal reasoning. If we accept the theory that verbal reasoning evolved for persuasion and not for problem-solving, then this would make perfect sense - reasoning is a tool for argumentation, and if you never need to argue for something, then there's also no need to practice arguments related to that in your head.

Actually, Haidt does seem to suggest something like this a bit later, when he discusses cultures with a holistic morality, and says that they often seem to just follow a set of what seems to be (to us) ad-hoc rules, not derivable from any single axiom:

Several of the peculiarities of WEIRD culture can be captured in this simple generalization: The WEIRDer you are, the more you see a world full of separate objects, rather than relationships. It has long been reported that Westerners have a more independent and autonomous concept of the self than do East Asians.3 For example, when asked to write twenty statements beginning with the words “I am …,” Americans are likely to list their own internal psychological characteristics (happy, outgoing, interested in jazz), whereas East Asians are more likely to list their roles and relationships (a son, a husband, an employee of Fujitsu). [...]

Related to this difference in perception is a difference in thinking style. Most people think holistically (seeing the whole context and the relationships among parts), but WEIRD people think more analytically (detaching the focal object from its context, assigning it to a category, and then assuming that what’s true about the category is true about the object).5 Putting this all together, it makes sense that WEIRD philosophers since Kant and Mill have mostly generated moral systems that are individualistic, rule-based, and universalist. That’s the morality you need to govern a society of autonomous individuals.

But when holistic thinkers in a non-WEIRD culture write about morality, we get something more like the Analects of Confucius, a collection of aphorisms and anecdotes that can’t be reduced to a single rule.6 Confucius talks about a variety of relationship-specific duties and virtues (such as filial piety and the proper treatment of one’s subordinates).

If WEIRD and non-WEIRD people think differently and see the world differently, then it stands to reason that they’d have different moral concerns. If you see a world full of individuals, then you’ll want the morality of Kohlberg and Turiel—a morality that protects those individuals and their individual rights. You’ll emphasize concerns about harm and fairness.

But if you live in a non-WEIRD society in which people are more likely to see relationships, contexts, groups, and institutions, then you won’t be so focused on protecting individuals. You’ll have a more sociocentric morality, which means (as Shweder described it back in chapter 1) that you place the needs of groups and institutions first, often ahead of the needs of individuals. If you do that, then a morality based on concerns about harm and fairness won’t be sufficient. You’ll have additional concerns, and you’ll need additional virtues to bind people together.

One might hypothesize that moral systems like utilitarianism or Kantian deontology, derived from a small set of logical axioms, are appealing specifically to those people who've learned that they need to defend their actions and beliefs (and who therefore also rationalize) - since it's easier to craft elaborate and coherent defenses of them. People with less of a need for justifying themselves might be fine with Analects of Confucius -style moralities.

Comment author: Shephard 04 March 2012 02:42:02AM 3 points [-]

Evidence has smashed my belief's face quite solidly in the nose, though.

Evidence other than the repeated denials of the subjects in question and a non-systematic observation of them acting as largely rational people in most respects? (That's not meant to be rhetorical/mocking - I'm genuinely curious to know where the benefit of the doubt is coming from here)

"I knew eating a cookie wasn't good for me, but I felt like it and so I did it anyway."

The problem here is that there is a kind of perfectly rational decision making that involves being aware of a detrimental consequence but coming to the conclusion that it's an acceptable cost. In fact that's what "rationalizing" pretends to be. With anything other than overt examples (heavy drug-addiction, beaten spouses staying in a marriage) the only person who can really make the call is the individual (or perhaps, as mentioned above, a close friend).

If these people do consider themselves rational, then maybe they would respond to existing psychological and neurological research that emphasizes how prone the mind is to rationalizing (I don't know of any specific studies off the top of my head but both Michael Shermer's "The Believing Brain" and Douglas Kenrick's "Sex, Murder, and the Meaning of Life" touch on this subject). At some point, an intelligent, skeptical person has to admit that the likelihood that they are the exception to the rule is slim.

Comment author: Kaj_Sotala 24 April 2012 09:31:19AM 0 points [-]

If these people do consider themselves rational, then maybe they would respond to existing psychological and neurological research that emphasizes how prone the mind is to rationalizing (I don't know of any specific studies off the top of my head but both Michael Shermer's "The Believing Brain" and Douglas Kenrick's "Sex, Murder, and the Meaning of Life" touch on this subject). At some point, an intelligent, skeptical person has to admit that the likelihood that they are the exception to the rule is slim.

Psychological research tends to be about the average or the typical case. If you e.g. ask the question "does this impulse elicit rationalization in people while another impulse doesn't", psychologists generally try to answer that by asking a question like "does this statistical test say that the rationalization scores in the 'rationalization elicitation condition' seem to come from a distribution with a higher mean than the rationalization scores in the control condition". Which means that you may (and AFAIK, generally do) have people in the rationalization elicitation condition who actually score lower on the rationalization test than some of the people in the control condition, but it's still considered valid to say that the experimental condition causes rationalization - since that's what seems to happen for most people. That's assuming that weird outliers aren't excluded from the analysis before it even gets started. Also, most samples are WEIRD and not very representative of the general population.
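The group-mean-versus-individual point can be illustrated with a toy simulation (all numbers here are made up for illustration, not drawn from any real study):

```python
import random
import statistics

random.seed(0)

# Hypothetical rationalization scores: the "elicitation" condition has a
# higher mean (55 vs. 50), but the two distributions overlap heavily.
control = [random.gauss(50, 10) for _ in range(1000)]
elicited = [random.gauss(55, 10) for _ in range(1000)]

control_mean = statistics.mean(control)
mean_diff = statistics.mean(elicited) - control_mean

# Fraction of "elicited" participants who nonetheless score below the
# control-group mean - individuals who buck the group-level effect.
below = sum(s < control_mean for s in elicited) / len(elicited)

print(f"mean difference: {mean_diff:.1f}")
print(f"elicited participants below control mean: {below:.0%}")
```

Even though the group-level conclusion ("the condition raises rationalization") is statistically sound, a sizeable minority of individuals in the elicitation condition still score below the control average - which is exactly the room left for genuine low-rationalization outliers.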

Comment author: erratio 04 March 2012 01:23:33AM *  2 points [-]

"I knew eating a cookie wasn't good for me, but I felt like it and so I did it anyway."

I'm like this for my trivial decisions but not for major ones. I virtually never rationalise eating choices; the choice is purely a conflict between deciding whether I'm going to do what I want vs what I ought.

I do notice myself rationalising when making more long-term decisions and in arguments - if I'm unsure of a decision I'll sometimes make a list of pros and cons and catch myself trying to rig the outcome (which is an answer in itself, obviously). Or if I get into an argument I sometimes catch myself going into "arguments as soldiers" mode, which feels quite similar to rationalising.

Anyway, my point for both is that for me at least, rationalisation only seems to pop up when the stakes are higher. If you gave me your earlier example about wanting to eat pizza and making excuses about calcium, I'd probably look at you as though you had 3 heads too.

Comment author: Viliam_Bur 03 March 2012 10:32:02AM 0 points [-]

And "X" in this case is something like, "I knew eating a cookie wasn't good for me, but I felt like it and so I did it anyway." It's like the need for justification is just missing, at least in their self-reports.

Thanks for this example -- now I can imagine what "never rationalizing" could be like.

I did not realize there is a third option besides "rationalizing" and "always acting rationally", and I couldn't believe in people acting always rationally (at least not without proper training; but then they would remember what it was like before training). But the possibility of "acting irrationally but not inventing excuses for it" seems much more plausible.

Comment author: David_Gerard 03 March 2012 10:24:13AM 1 point [-]

In fact I would argue that rationalization is the default for human beings, and that anything else requires either focused effort or serious mental re-programming (which is still probably only partially effective).

Sounds about right. This would be why science is hard for humans. We wouldn't bother if it didn't work.

Comment author: shminux 03 March 2012 04:39:25AM 3 points [-]

But in several of our test sessions for teaching rationality, a handful of people report never rationalizing and seem to have little clue what Tarski is for.

Don't you have exercises designed to catch people rationalizing? If not, you ought to; if you do, did you catch them rationalizing?

Comment author: Mercurial 03 March 2012 07:06:53AM 4 points [-]

Don't you have exercises designed to catch people rationalizing? If not, you ought to, if yes, did you catch them rationalizing?

Getting people to rationalize during a session is actually quite a challenge. What we have are exercises meant to illustrate situations that people might find themselves in where rationalization is likely. And after a dozen or so examples, this particular subgroup - about 25% of our tested population so far! - just flat-out does not relate to any of the examples.

However, one of them seemed to get "caught" by one example after a friend of theirs explicitly pointed out the analogy to their life. We haven't yet followed up on that case to explore more solidly whether it's really denial or if it was actually our misunderstanding and this person really doesn't rationalize.

Comment author: shminux 03 March 2012 07:57:59AM 2 points [-]

Getting people to rationalize during a session is actually quite a challenge.

Presumably you can do it for other cognitive biases, so what's so special about this one?

Comment author: EchoingHorror 03 March 2012 03:39:42AM 3 points [-]

The cues people have for noticing their rationalizations are things they notice before they're done thinking. They have not rationalized; they had a thought that could lead to rationalization, or a feeling they associate with rationalizing, and then they stopped. But there was a long enough gap between when they started arguing for a conclusion and when they stopped to examine it that they noticed the rationalization. Having a reflex to think about a question that fires fast enough, compared to the reflex to rationalize, can cause someone either to not notice their arguments for some answer - and then say they never rationalize - or to just not rationalize at all.

I don't relate to anyone's examples of their own rationalizations or have use for the Litany of Tarski except for explaining myself to people who don't think deliberately. I would say I never rationalize if that is the alternative to giving an example of a time when I did because I haven't noticed such an example. But I also know that I am not in conscious control of most of my thought process, and that enumerating potential evidence for a hypothesis looks suspiciously like rationalization, so I would say I do rationalize if I can explain that instead of giving examples. Rationalization can occur subconsciously and not be recognized as a rationalization if it is not allowed to corrupt the whole line of thinking.

Comment author: David_Gerard 03 March 2012 10:26:28AM 3 points [-]

Having a reflex to think about a question fast enough compared to the reflex to rationalize can cause someone to not notice their arguments for some answer, then say they never rationalize, or just not rationalize.

I have had periods in my life where I would have been convinced I was thinking absolutely clearly, but in retrospect it was blitheringly obvious that I was rationalising like hell.

People's subjective reports turn out to be unreliable, news at 11.

Comment author: Mercurial 03 March 2012 07:08:04AM 0 points [-]

This is helpful. Thank you!

Comment author: thomblake 29 May 2012 02:50:08PM *  2 points [-]

I've long since internalized "all stated reasons are post-hoc rationalizations", so I've been gradually losing my ability to pick out "rationalizations" in particular.

That is, when a human answers a query as to their reasons for something, they usually inspect their self-model to guess what course of actions could have led to that outcome (as though predicting a future action). Some guesses are better than others, and we call the bad guesses "rationalizations".

ETA: I wrote this comment before noticing that the cases seem to be talking about rationalizations that take place before performing an action (like rationalizing impending purchase of pizza by referring to calcium), which is surprising to me. I'm not sure how to spot those, or whether I do them.

Comment author: ShardPhoenix 03 March 2012 12:38:59PM 2 points [-]

What is rationalization? To me, it feels like a lower-level, more primitive part of the brain recruiting the verbal centres in an attempt to persuade the higher-level part of the brain to do something short-sighted. Perhaps these people are unusually non-conflicted - for example, their lower levels may have a lower-than-usual time preference, or their higher levels may be too weak to get in the way in the first place.

(I keep wanting to say "id" and "super-ego" here despite knowing that Freud isn't scientific. Are there better terms?).

Comment author: lukeprog 03 March 2012 01:30:54AM *  2 points [-]

One data point: I notice myself rationalizing, or starting to rationalize, many times a week.

I might task inexpensive virtual assistants (from third-world countries) with finding YouTube clips of people rationalizing on TV (the easiest candidates are probably Fox News people, politicians, etc.)

Comment author: Vladimir_Nesov 03 March 2012 02:08:54AM 3 points [-]

One data point: I notice myself rationalizing, or starting to rationalize, many times a week.

Give an example of what kind of event you are referring to?

Comment author: NancyLebovitz 03 March 2012 11:26:45AM 2 points [-]

What are efficient ways of training assistants to recognize rationalization and/or recognizing that they can already do so?

Comment author: John_Maxwell_IV 04 March 2012 02:07:33AM *  1 point [-]

Maybe you could give some examples of the sort of rationalizations you're referring to in your post, so we would better know how to answer your question? I think I might fall into this category, but I might not. I frequently think it would be a good idea for me to do something, but I don't do it and tell myself I lack the necessary psychological strength. Is this rationalizing? Also, I sometimes experience ugh fields around learning things that might be uncomfortable (in this sense a student might be afraid to see what score they got on a test).

I don't claim to have always been this way; I was famous for lawyer-style argument in elementary school. I changed just through exposure to Less Wrong (which I found somewhat early, at age 15).

Edit: I now no longer claim to not rationalize, only to do it fairly little, and I'm fairly uncertain about the entire issue.

Comment author: fburnaby 03 March 2012 11:47:02PM 1 point [-]

I haven't remembered a dream in years. There are three that I have had in my life which I can recount even a bit of (all of which were nightmares, interestingly). I'm pretty sure that I have them all the time because I sometimes wake up with strange images in my head. But these images disappear very quickly and I can't tell someone what I was dreaming about even minutes after waking.

I notice that I sometimes catch myself rationalizing in simple ways, like offering some justification for a shortcoming that I have. But I notice also that I can only think of one example in my life where I've done this... Yet I have an impression that I do it all the time.

I wonder if there's... something in common there? How many people don't tend to remember their dreams?

Comment author: Viliam_Bur 04 March 2012 01:09:52PM *  6 points [-]

I'm pretty sure that I have them all the time because I sometimes wake up with strange images in my head. But these images disappear very quickly and I can't tell someone what I was dreaming about even minutes after waking.

Experiment: Bring a pen and paper to your bed, and when you wake up, the first thing you do (seriously the first; a minute of delay can make a huge difference) write what you remember. If you don't remember the beginning, just quickly start writing from the part you remember. If any idea comes to your head during writing, just make a small note (two-three words) and continue writing. Do this every day, at least 5 days in sequence.

Why do I suggest this? My case may be different, but after I wake up and think about something else, I usually forget what my dream was, even forget that I had a dream at all. I would swear that I rarely dream, but when I did this experiment, I had a dream every night (and if I woke up many times during the night, there was a different dream each time). Without the writing I wouldn't even notice. Even the written record seems suspicious -- I read about a dream, and I remember "yeah, I had a dream like this maybe a month ago", then I look at the date and see it was yesterday! So my experience is that my memory is absolutely unreliable in this area. Also, this may be a coincidence, but when I remember a dream, it is usually a bad dream, because it makes me think about it when I wake up.

EDIT: Now I realized a similar experiment with rationalization could be useful. :D

Comment author: TheOtherDave 04 March 2012 09:31:38PM 1 point [-]

In my experience, a recorder works better than pen and paper... it takes long enough for me to get focused enough to write legibly that I lose stuff.

Comment author: fburnaby 04 March 2012 03:01:14PM 0 points [-]

Yes, I agree that this seems like a good thing to try for both dreaming and rationalization! I've recently gotten myself a notebook for at home, just for doodling ideas about the things I'm reading. It might be a good idea just to try and expand that to dreaming, rationalization and other things too, just to see what comes out. To provide myself more reliable access to an "outside view" of myself.

Comment author: [deleted] 03 March 2012 08:01:14PM 1 point [-]

I don't notice myself rationalising much at all.

My hypothesis is that I am rationalising and I have not picked up the skill to detect it. Which is confusing, because I regularly interrogate myself and look at my beliefs and such to find irrationalities, but I haven't found any.

Am I doing it wrong? Or am I unusually rational? Placing higher probability on doing it wrong feels like fake humility, but I think it's accurate.

Comment author: Vaniver 03 March 2012 06:27:16PM 1 point [-]

I'm having a hard time remembering rationalizing bad decisions, but I'm having an easy time remembering rationalizing bad outcomes. That may be a useful dichotomy to explore.

I think this general phenomenon may have something to do with verbal thinking as suggested below, but I'm not sure that applies to my case. I think I came to terms with my id getting to make a lot of my decisions - and so the primary stated justification is something like "I desire to relax now" rather than "I deserve to relax now," and the superego is just outvoted rather than placated. (The first strikes me as a rationale, and the second strikes me as a rationalization.)

It may be valuable to ask them about those concepts (or an updated version of them; I like saying id and superego because most people know what they mean but I also get the impression they're fairly outdated). When they say "I knew the cookie was bad for me but I felt like eating it," explore the difference between "knew" and "felt." Maybe rationalizers have a verbal id and others don't?

Comment author: A4FB53AC 03 March 2012 02:05:50PM 1 point [-]

I feel like I can relate to that. It's not like I never rationalize, but I always know when I do it. Sometimes it may be pretty faint, but I'll still be aware of it. Whether I allow myself to proceed with justifying a false belief depends on the context. Sometimes it just feels uncomfortable enough to admit to being wrong, sometimes it is efficient to mislead people, and so on.

Comment author: Azuth 26 May 2012 12:54:59PM 0 points [-]

I'm kinda confused. When people say things like "I'm trying to give up chocolate. Last weekend I saw a delicious cake and I found myself telling myself the only reason I wanted it was to boost my energy levels, hahaha you know the feeling, right?" they don't really believe that, right? I mean, they know the entire time that they're breaking away from their 'ideal self' or 'should-be self', and just say things like that as a kind of mock-explanation to fulfill social expectations.

Comment author: Dustin 21 April 2012 01:06:22AM *  0 points [-]

ETA: Whoa, typing this as a stream of thought didn't help me grasp how long the comment was becoming! ETA2: To be clear, I recognize the difference between not doing something and not being aware you're doing something.

I missed this thread when it was originally posted, but anyway...

I'm going to try something that has helped me in the past with these sorts of things. I'm going to write my thoughts as they occur. I've found this helps others peek into my mental state a bit.

Of all the examples of rationalization in this thread, I have no recollection of doing any of them, or anything of the type.

I often do things that I shouldn't if I want to reach my goals, but I don't have any sort of urge to rationalize them. For example, I like soda. I go through cycles where I really try to stop drinking so much, but I have no recollection of trying to rationalize the drinking of soda.

When I do things that I probably shouldn't, I just accept the fact that one part of my brain wants something that I don't want, or will wish that I didn't want. When I say I "accept the fact", that doesn't mean it doesn't have any sort of emotional impact. I'll kick myself for it. I spend a lot of time with this constant low-level background self-kicking emotion.

When I spend too much time on the internet, I can't say I'm even tempted to make an argument to myself that it's a good idea.

I can think of a bunch of examples where I fail to meet whatever standard I wish I could live up to, but I don't recall rationalizing any of them.

For what it's worth, I would classify myself as a very rational person...certainly the most rational person amongst those I've met in real life and known well enough to attempt to classify their rationality level. I'm not positive how much that means, since I live in a rural area and off-hand I can't recall knowing a single person who would even be interested in most of what is talked about on LW.

I'm actually really glad to see this posted. I've often thought about how I don't quite understand how the Litany of Tarski would help me. I'm constantly waffling back and forth between thinking that I must not rationalize, or I'm really bad at noticing it.

Now, another thing comes to mind. I assume we're talking about internal rationalization. I will occasionally present a rationalization for bad behavior to others who call me on it. Like if I'm in a phase where I've given up trying to cut out soda drinking, and my wife raises her eyebrow at me buying a 32 oz Dr. Pepper at the gas station, I may spit out some rationalization. I'm completely aware at the time that I'm just offering it in a (futile) attempt to placate her.

Just now ... reading over the Litany again ... I can see the point of "Let me not become attached to beliefs I may not want.". I can feel that attachment to various ideas and beliefs. I just can't say with any level of confidence that I've ever tried to fool myself into sticking with one of those ideas longer than I should.

Ooo. I just had another thought. Does timescale come into play? I think I've offered explanations, to myself or to others, that, as I thought or conversed about them more, I decided I didn't believe. It's possible those explanations originally came out of whatever part of my mind rationalizations come out of. I don't know. The reason I started this paragraph off asking about timescale: when I think about it, I'm fairly confident that I hold these "explanations" for very short amounts of time relative to the evidence available, but I can't say whether I hold them longer than I should.

I think rationalizations offered to others are more tricky. Is the Litany supposed to cover efficient argumentation? Like I mentioned in the example about my wife and soda, there are social and emotional reasons that someone would proffer a rationalization for longer than they "should". Sometimes, in an argument, I may stick to a rationalization without even realizing it. That realization always comes to me as soon as the social/emotional aspects of the argument have passed.

I'm not sure how much of that is rationalization, and how much of it is just reasoning myself to a conclusion.

Anyway, hopefully that offered some insight into the mind of someone who doesn't recall rationalizing.

Comment author: Thomas 03 March 2012 09:27:14AM *  0 points [-]

"I have goofed" is hardly a rationalization, is it?

Or "I did this, because all the elephants are flying." is not making an excuse, IF you really believe that they are indeed flying - either. No matter that at least some elephants are not flying. You just have a wrong belief.

A rationalization (in the sense of "making excuses") is when you justify something with a reason you know to be wrong.

Would you call THIS comment "a rationalization"?

Comment author: David_Gerard 03 March 2012 10:22:10AM *  0 points [-]

As I noted in the previous thread, I can tell (sometimes) that I'm rationalising, even if my conclusion does turn out to be correct - it's a matter of arriving at a belief by a bad process. (In my case I get my polemicist on even though there's a little voice in my head noticing that my epistemology isn't quite justified. This is harder to notice because my output looks much the same as those times when I consider that I really do have my epistemological ducks in a row - I have to notice it while I'm doing it.)