Anna Salamon and I are confused. Both of us notice ourselves rationalizing on pretty much a daily basis and have to apply techniques like the Litany of Tarski pretty regularly. But in several of our test sessions for teaching rationality, a handful of people report never rationalizing and seem to have little clue what Tarski is for. They don't relate to any examples we give, whether fictitious or actual personal examples from our lives. Some of these people show signs of being rather high-level rationalists overall, although some don't.

So, Less Wrong, we're asking for your input on this one. What do you think is going on?


I discovered that one of my friends has something similar - perhaps the same thing - going on in her brain, such that she doesn't rationalise. What we managed to sort out, sort of, was that anything was a justification for her: so when she doesn't eat cookies because it would make her gain weight, and also when she doesn't like Brad Pitt "because he's ugly", and also when she doesn't like a book series because it's chauvinistic, and also when she "doesn't like babies", but her friend's baby "is an exception because it's [friend]'s", these all feel like the same thing to her; she can't or won't tell the difference between what I see as a strong reason, a weak reason, or a made-up flimsy reason.

A wild theory appears! In probably the deepest moment of introspection for her in that discussion, she said she thinks she might be like this because it gives her 100% confidence in whatever she's doing. Thinking on that, I'm reminded of the "70% blue, 30% red balls in the urn" game, where some human guessers approximate a 7:3 ratio of blue/red guesses, whereas the best strategy is to guess blue all the time. There might be two kinds of people in this...
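A quick check of the urn arithmetic, as a minimal sketch (Python; the 70/30 split is the one from the game described above):

```python
# Expected accuracy in the "70% blue, 30% red balls in the urn" guessing game.
p_blue = 0.7

# Probability matching: guess blue 70% of the time and red 30% of the time.
# A guess is correct when it matches the color of the drawn ball.
matching = p_blue * p_blue + (1 - p_blue) * (1 - p_blue)  # 0.49 + 0.09 = 0.58

# Maximizing: always guess the majority color (blue).
maximizing = p_blue  # 0.70

print(f"probability matching: {matching:.2f}")    # 0.58
print(f"always guessing blue: {maximizing:.2f}")  # 0.70
```

Matching the 7:3 ratio is right only about 58% of the time; always guessing blue is right 70% of the time.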

[anonymous]
"The Limbic system area is the center that is in charge of the immediate reactions in the human brain and is located above the Brainstem. It receives the information from our sense organs even before our thinking brain - Neocortex has processed it, in order to operate emotional instincts that allow an immediate reaction such as self defense. The rational thought and more complex emotional processes are completed only after a few seconds. After the rational processing, one of the two following will occur: The person will rationalize his/her immediate emotions and thus justify his/her basic emotional assumption provided by the subconscious in the first milliseconds. A second event occurs. It is also charged with emotion but has the opposite effect, forcing the person to change his/her basic primary emotional response."
pjeby

Since rationalizations are usually employed to repair cognitive dissonance, and cognitive dissonance is strongest when image preservation is necessary, one hypothesis would simply be that these people have self-images that don't need much preserving.

Possible test: do these people have an unusually high tolerance for situations and/or self-disclosures that most people would find shameful, humiliating or embarrassing? This might explain a lack of need to rationalize, regardless of the reason for the high tolerance. (For example, we should expect sociopaths to not see any need to rationalize their actions to themselves.)

[Edit to add: not to imply that any of the people in your experiences are sociopaths; just noting that it's another situation where somebody would have a low need for self-image-preserving rationalizations.]

[comment deleted]

In response to the folk suggesting that our questions were just unclear, etc.:

I notice rationalization all the time too (in myself and in others); but there totally seem to be people who don't ever notice it in themselves. Lots of them. Including both folks who seem never to have trained in rationality-type-stuff at all, and folks who have. I ignored my first counter-example, and my second, but not my third and fourth; especially after the fourth counter-example kindly allowed us to cross-examine them for some hours, to go try accosting strangers with weird questions and see if they noticed themselves rationalizing while approaching said strangers, etc.

Mercurial and Eliezer both suggested an analogy to the "thinking in words" vs "thinking in images" thing; some do one and others do the other, and many tend to assume that everyone experiences life the way they do. We all updated toward thinking that there is some actual thing going on here -- something we were initially not modeling.

But, I'm still confused about:

  1. Whether we're just still missing something obvious anyhow. Maybe our fourth counter-example, who consented to answering gobs of questions and trying ex
...
Hermione
So, I asked some people as you suggested, but I didn't find anything as interesting as you did. Over the last few days I've asked 10 people if they "rationalise", giving them just one example, and all of them have immediately understood and spontaneously come up with valid examples of themselves doing so. Incidentally, I quite often catch myself rationalising, but I really doubt accosting strangers with odd questions would trigger that in me. I'm not sure what else to suggest. Perhaps asking them when they last felt guilty? From the examples the people I mentioned above came up with, guilt seems to be a very strong trigger of rationalisation. An example: "I forgot to call my Mum on her birthday, but I told myself she was really busy with the rest of the family".
Viliam_Bur
Perhaps rationalization is an adaptation that develops when people risk some kind of punishment for their irrationality. We are irrational, and we already suffer the consequences of our irrationality. But if there is an additional penalty for admitting irrationality, it gives us an incentive to pretend that the irrational decision was in fact rational; to lie to others, and ultimately to lie to ourselves. Admitting irrationality can be very bad signalling.

How exactly does guilt become a part of the equation? Probably by believing that there is no such thing as irrationality, and that people are always perfectly following their utility function. So if you forgot to do something, it means you decided not to do it, because it gives you negative utility. So whenever your irrationality harms people around you, it means you hate them. (If your irrationality sometimes harms you, this can be explained away by saying that you didn't really care about something, only pretended to.)

From the outside view, our irrationality is not credible -- it may be just a public act, while we are following our true preferences (defined circularly as "that which we are following", plus some possible secrets).
semanticsfirst
You seem to be conflating irrationality with "self-deception" here, and defining rationality as following a "utility function". How is some idealized "utility function" any different from just "preferences"?
AnnaSalamon
Many thanks for collecting this data. What example did you use? And what sorts of examples did you get back?
Hermione
"I'm trying to give up chocolate. Last weekend I saw a delicious cake and I found myself telling myself the only reason I wanted it was to boost my energy levels, hahaha you know the feeling, right?" If they didn't immediately chime in with examples I'd prompt them with "and you know, it's not just food, I rationalise all the time" and ask them if they do as well. Over half of them immediately came up with their own diet-related rationalisations. Of the other 4 I had the "calling my mum" one above, a couple of people who said they often caught themselves coming up with reasons for why they weren't doing their work, and one "the dog wouldn't like to be taken for a walk in this cold weather". The reason I mentioned guilt is that a few of them (I didn't count) explicitly used the word "guilty" (like, I'm too tired to work, so I don't have to feel guilty that I'm out drinking) and one person talked about trying to make himself feel better.
AnnaSalamon
And, just to check, did you make sure that all the diet-related examples you got were examples of making false excuses to oneself, and not just examples of e.g. previously intending to diet, but then changing one's mind when one saw the chocolate cake?
Hermione
Yep, they were all valid examples of rationalisation.
Morendil
Count me in that group ("hardly ever", maybe). I'm pretty sure that I do rationalize, but I can't recall any explicit occasions of catching myself in the act. I'm pretty sure that I have abandoned beliefs in the past that I clung to for longer than I should have, but it's hard for me to come up with an example right now. Perhaps we differ in the explicitness of the meta-cognition we engage in. When confronted with incontrovertible evidence of my errors, I tend to facepalm, think something like "stupid me", update and move on. I don't generally attempt to classify the mistake into a particular fallacy. Can you share some of the examples you've been using to illustrate rationalization? I'll tell you if I get the same "can't relate to this", or if I can relate but failed to label the equivalent examples in my own past as rationalizations.
Kaj_Sotala
Another example, from The Righteous Mind:
Kaj_Sotala
I don't know whether Anna used this as an illustration, but one way by which I tend to notice myself rationalizing is when I'm debating something with somebody. If they successfully attack my position, I might suddenly realize that I'm starting to defend myself with arguments that even I consider bad or even outright fallacious, and that I've generally gone from trying to discover the truth to trying to defend my original position, no matter what its truth value. Another example is that I might decide to do or believe something, feel reluctant to explain my reasons to others because they wouldn't hold up to outside scrutiny, and then realize that wait, if my reasons wouldn't hold up to outside scrutiny they shouldn't hold up to inside scrutiny either. Do you experience either of those?
Will_Newsome
[temporarily deleting]
Steve_Rayhawk
I wish there were a more standard term for this than "kinesthetic thinking", one that other people would be able to look up and understand what was meant. (A related term is "motor cognition", but that doesn't denote a thinking style. Motor cognition is a theoretical paradigm in cognitive psychology, according to which most cognition is a kind of higher-order motor control/planning activity, connected in a continuous hierarchy with conventional concrete motor control and based on the same method of neural implementation. (See also: precuneus (reflective cognition?); compare perceptual control theory.) Another problem with the term "motor cognition" is that it doesn't convey the important nuance of "higher-order motor planning except without necessarily any concurrent processing of any represented concrete motions". (And the other would-be closest option, "kinesthetic learning", actively denotes the opposite.) Plausibly, people could be trained to introspectively attend to the aspect of cognition which is like motor planning, using a combination of TCMS (to inhibit visual and auditory imagery) and cognitive tasks which involve salient constraints and tradeoffs. Maybe the cognitive tasks would also need to have specific positive or negative consequences for apparent execution of recognizable scripts of sequential actions typical of normally learned plans for the task. Some natural tasks with some of these features, which are not intrinsically verbal or visual, would be social reasoning, mathematical proof planning, or software engineering.)

I think kinesthetic thinking still has things like rationalization. For example, you might have to commit to regarding a certain planned action a certain way as part of a complex motivational gambit, with the side effect that you commit to pretend that the action will have some other expected value than the one you would normally assign. If this ability to make commitments that affect perceived expected value can be used well, then b
Eugine_Nier
Aren't there tests for the verbal/visual thinking distinction?
saturn
After reading the comments here I think I might be a person who doesn't rationalize, or my tendency to do so is well below the norm. I previously thought the Litany of Tarski was about defeating ugh fields; I do experience those. I'm willing to answer questions about it, if that would help.
AnnaSalamon
Thanks! Could you tell me about ugh fields you've experienced, and about any instances of selective search, fake justification, etc. that you can call to mind? Also, what modality do you usually think in -- words, images, ... ? Also, what do you do when you e.g. desire a cookie, but have previously decided to reduce cookie-consumption?
saturn
If a thought with unpleasant implications comes up, I'm tempted to quickly switch to a completely different, more pleasant topic. Usually this happens in the context of putting off some unpleasant task, but I could imagine it also happening if I believed in God or had some other highly cherished belief. I can't think of any beliefs that I actually feel that strongly about, though. I do sometimes come up with plausible excuses or fake justifications if I'm doing something that someone might disapprove of, in case they confront me about it. I don't remember ever doing that for my own benefit. I can't remember doing a selective search either, but of course it's possible I do it without being aware of it. I just thought of another thing that might be relevant - I find moralizing less appealing than seems to be typical.

As for the modality I think in: I'm not sure how to describe it. Sort of like wordless analogies or relations between concepts, usually combined with some words and images. But also sometimes words or images by themselves.

As for the cookie: distract myself by focusing on something else. If my thoughts keep coming back to eating cookies, I might try imagining something disgusting like eating a cookie with maggots in it.
amcknight
Use it or lose it? Speculation: Keeping your previous beliefs consistent by rationalizing and distracting yourself are both ways to avoid changing your mind or to avoid thinking about unpleasant things. Maybe most people start with both strategies and the one they have success with develops more than the other. If you are really successful at distracting yourself, maybe rationalization skills never really develop to their full irrational potential.
John_Maxwell
You could try videotaping them in an argument and then go over the videotape looking for rationalizations. This could deal with varying definitions of rationalize. For best results, make the argument about something that people frequently rationalize. Maybe present them with some ambiguous data that might or might not support their political beliefs (several neutral observers say it didn't affect them either way, since it was so ambiguous), and see if it makes them more certain that their political beliefs are true (as you'd expect in a cognitively normal human). I'm assuming you're using "rationalization" as a synonym for "motivated cognition".
Vladimir_Nesov
Perhaps something related to social ineptness or perceived social status, on the hypothesis that rationalization originates as a social psychological drive? I have a few broken social modes; for example, there is no emotional drive to avoid pointing out errors or embarrassing facts to people, so I need to consciously stop myself if that's called for.

It seems introspectively plausible that when you ask people questions like "how do you know when you're rationalizing", they feel like they've been asked a "when did you stop beating your wife" question, and feel initially tempted to react with an "oh yeah, well maybe I don't" regardless of whether it's true.

Mercurial
That's a good hypothesis. Unfortunately this doesn't come from asking people, "How do you know when you're rationalizing?" or any variant thereof. The original problem arose when we could not for the life of us convey to some individuals why the Litany of Tarski might be useful. We gave examples from our own lives and watched these individuals just blink and say, "Huh. Yeah, I guess I just don't relate to that at all."
Bruno_Coelho
The questions are similar, but the first is not a trap.
JanetK

I have a different way to look at this question. (1) Introspection is bunk. (2) If someone asks us, or we ask ourselves, why we did something, the answer is a guess, because we have no conscious access to the actual causes of our thoughts and actions. (3) We vary in how good we are at guessing and in how honestly we judge ourselves, and so some people appear to be clearly rationalizing and others appear less so. (4) Most people are not actually aware that introspection is not direct knowledge but guesswork, and so they do not recognize their guesses as guesses, but may notice their self-deceptions as deceptions. (5) We do not need to know the reasons for our actions unless we judge them as very bad and to be avoided, or very good and to be encouraged. (6) The appropriate thing in this case is not to ask ourselves why, but to ask ourselves how to change the likelihood of a repeat, up or down. Although we have only guesses about past actions, we can arrange to have some control over future ones. (7) The more we know about ourselves, others, our situations, science and so on, the better we can answer the how questions.

Dmytry

For anosognosia to be a common failure mode, and for the split-brain patient's peculiar left-side behaviour to occur, rationalization has to be a common mode of thought. Perhaps there's a module in the brain that prepares speech but has very little impact on beliefs and decisions. I ask you why your hand is not scratching your back, and it says "because my back is not itching", and that happens to be correct; but it would say the same if the arm were paralysed and the back were itching and it wasn't 'told' (by the part doing the actual thinking) that the arm was paralysed.

When you say you aren't rationalizing, perhaps that module still works by rationalization; it just happens to be quite plausible. Maybe that's how construction of sentences works in the first place when talking about nearly anything.

Andy_McKenzie
Upvoted for mentioning split-brain patients. It made me think of a test of the question "does everyone rationalize?": see whether split-brain patients vary greatly in the confidence that they assign to their rationalizations of their behavior (e.g., ask someone who has worked with them, or read their writings). All I remember reading of is patients who assign very high confidence to these rationalizations, but that could just be publication bias. If there is large variance, and especially if some patients don't do it much at all (i.e., when their nonverbal hemisphere is cued to do something, their verbal hemisphere says "I don't really know why I did that"), then that is a sign that some people really don't rationalize much.
Dmytry
Could be, but they have a lot of time for the other hemisphere to give up explaining, and they have a whole hemisphere there, some of which may not be rationalizing and may prod the rationalizing module with a clue that the right side is now independent.

What I'm thinking of is the possibility that speech works by a rationalization-style process. If you consider speech from a decision-theory perspective - speech is just another instance of making moves that affect the future - in principle it is about as relevant to internal decision making as your spinal cord's calculations of the muscle activation potentials. The spinal cord doesn't care what the reason is that you move your hand. The speech module doesn't need to care what the reasons are that you do something; it may have them available, but its task is not to relay those reasons, but to produce some sequence of chirps that works best in the given circumstances - to fit the sequence of chirps to the circumstances.

Furthermore, abstract verbal reasoning is fairly ineffective as a decision-making tool. In reality, you are not dealing with facts; you aren't even dealing with probabilities; you are dealing with probability distributions over a multidimensional space. For very difficult problems with limited computational power, the explicit approach can easily be beaten by heuristics in nearly all circumstances (and, the way humans reason explicitly, maybe in all circumstances where pencil and paper are not available to aid the explicit reasoning). It does make sense if explicit abstract reasoning is used primarily for speech construction but not for decision making itself.

Note: we shouldn't mix up rationalization with grossly invalid reasoning. Rationalization doesn't restrict itself to invalid reasoning.
David_Gerard
That's something like how it feels to me, yes. (Edit: the little voice that comes up with justifications for things, I don't mean I've a split brain!)

Some of these people show signs of being rather high-level rationalists overall, although some don't.

I wouldn't necessarily expect there to be a super-strong connection between not rationalizing and being a "high-level rationalist". There are other ways to go systematically wrong than through goal-directed rationalization. As a possibly overlapping point, your concept of "high-level rationalist" probably sneaks in things like intelligence and knowledge that aren't strictly rationality.

John_Maxwell
Eliezer's "formidability" seems even worse, with its implications of high status.
Mercurial
Good points. I'm not trying to sneak in connotations, by the way. We're just talking about the fact that these people seem to be quite good at things like goal-factoring, VOI calculations, etc.
steven0461
I didn't mean to say the sneaking was intentional. VOI calculations seem like they would correlate more with intelligence than rationality as such. I can't find any reference to goal-factoring; what do you mean by that?
MalcolmOcean
I know this is much later, but for future readers I thought I'd chime in that goal-factoring is a process of breaking your goals down into subgoals and so on. At each level, you ask yourself "what am I trying to achieve with this?" and then ask yourself if there might be a better/cheaper/more efficient way to do so. This is very closely related to the idea of purchasing goods separately. [I attended the January 2013 CFAR workshop and volunteered at the March one.]
[anonymous]
What's goal-factoring? I tried Googling it but didn't find anything. VOI calculations seem like they would correlate more with intelligence and math knowledge than with rationality, so there again I wouldn't expect a strong connection.

I don't think I rationalize to any significant extent. Even the examples I came up with for Anna's thread concern inefficient allocation of attention and using zero-information arguments, not something specifically directed to defense of a position. I admit being wrong or confused on simple things, sometimes incorrectly (so that I have to go back to embrace a momentarily-rejected position). It's possible I'm completely incapable of noticing rationalization and would need a new basic skill to fix that, but doesn't seem very likely.

(Alternatively, perhaps "rationalization" needs to be unpacked a bit, so that problems like those in the examples I referred to above can find a place in that notion. As it is, they seem more like flaws in understanding, unbiased with respect to a favored conclusion, unless that conclusion is selected in hindsight.)

Shmi

Any volunteers to go through Vladimir_Nesov's comments on LW and point out his rationalizations to him?

That could actually be quite helpful. No offense to Vladimir; we're just sincerely curious about this phenomenon, and if he's really a case of someone who doesn't relate to Tarski or rationalization, then it'd be helpful to have good evidence one way or the other about whether he rationalizes.

Mercurial
That's helpful. Thank you. And yes, I agree, the term "rationalization" is a bit loaded. We already checked by tabooing the word in exploring with at least one case, so it's not just that these people freeze at the word "rationalization." But it's quite possible that there are multiple things going on here that only seem similar at first glance.
Grognor
What about this? Do you not count this because you were sleepy at the time, because it was a minor incident, or what? (Also, I did not go through your comments to find that. Just thought I'd point that out because of shminux's comment.)
Vladimir_Nesov
I don't remember the experience, but it sounds like a collection of absent-minded system 1 responses that build on each other; there doesn't appear to be a preferred direction to them. This is also the characterization in the comment itself. As I understand it, "rationalization" refers to something like optimization of thoughts in the direction of a preferred conclusion, not to any kind of thinking under a misconception. If I believe something wrong, of course I'll be building on the wrong thing and making further wrong conclusions, until I notice that it's wrong.
wedrifid
I recall you (doing what can most plausibly be described as) rationalizing at times. But perhaps you are right about the 'unpacking' thing. I might be thinking of things entirely different to those that Anna mentioned.
Vladimir_Nesov
I'd be grateful for specific examples.

I feel that I avoid most rationalizing simply by being very comfortable, perhaps too comfortable, with the possibility that my beliefs may be wrong and my decisions suboptimal. Is the design I came up with for my work project any good? Maybe not. Have I made the right career choices? Probably not. Am I living in the right city? Maybe, maybe not. Is my current life plan going to lead me to happiness? Doubtful.

One data point: I notice myself rationalizing, or starting to rationalize, many times a week.

I might task inexpensive virtual assistants (from third-world countries) with finding YouTube clips of people rationalizing on TV (the easiest candidates are probably Fox News people, politicians, etc.)

Vladimir_Nesov
Can you give an example of what kind of event you are referring to?
NancyLebovitz
What are efficient ways of training assistants to recognize rationalization and/or recognizing that they can already do so?

In a word: compartmentalization.

Since that's not helpful, I will say that it doesn't even seem to be possible for there to be people who don't rationalize. (Or enough that you're at all likely to find them.)

Some of these people show signs of being rather

"Some", "signs", "rather". These words all show signs of being rather belief in belief. I notice you don't say, "Some of these people are high-level rationalists," just that they show warning signs of being so. What does this really mean? Are you referring to the ... (read more)

I will say that it doesn't even seem to be possible for there to be people who don't rationalize. (Or enough that you're at all likely to find them.)

You'd think not. Yet even Eliezer seems to think that one of our case studies really, truly might not ever rationalize and possibly never has before. This seems to be a case of a beautiful, sane theory beaten to death by a small gang of brutal facts.

"Some", "signs", "rather". These words all show signs of being rather belief in belief. I notice you don't say, "Some of these people are high-level rationalists," just that they show warning signs of being so. What does this really mean?

It means that I don't know how to measure how strong someone's rationality skills are other than talking to others whom I intuitively want to say are good rationalists and comparing notes. So I'm hedging my assertions. But to whatever degree several people at the Singularity Institute are able to figure out who is or is not a reasonably good rationalist, some of our sample "non-rationalizers" appear to us to be good rationalists, and some appear not to be so.

Also, could you explain what you mean by

...
gwern

Have you actually tested them for rationalizing? My own belief is that you're more likely to run into someone who rationalizes so much they are blind to their own rationalizing (and so can't recall any) than someone who is inhumanly honest.

(Tests in this case would include checking for hindsight bias, which is classic rationalizing, and having them do that test on YourMorals (whose name I forget) where you're given two summaries of studies for and against gun control and asked to criticize them - usually showing imbalance towards your favored side. But you're a LWer; I'm sure you can think of other tests.)

Grognor
This is VERY interesting. I'm as baffled as you are, sorry to say. It seems like you've described rationalizations that prevent true (or 'maximally accurate') beliefs. Have you tried asking these case studies their rationales for decision-making? One theme of my rationalization factory is spitting out true but misleading reasons for doing things, rarely allowing me to reason out doing what I know - somehow - that I should. Said factory operates by preventing me from thinking certain thoughts. Perhaps this goes on in these people? I've performed one heck of an update thanks to your comment and realizing that I was generalizing from only a few examples.
Dentin
I'm pretty sure I'm one of these unusual people. When I first read the litanies, I understood why they might be useful to some people (I have a lot of experience with religious fanatics), but I truly did not understand why they would be so important to Eliezer or other rationalists. I always figured they were meant to be a simple teaching tool, to help get across critical concepts and then to be discarded.

Gradually I came to realize that a large percentage of the community use the various litanies on a regular basis. This still confuses me in some cases - for example, it would never even occur to me that evidence/data could simply be ignored or that any rationalization could ever trump it.

I suspect this inability to simply ignore inconvenient data is the reason for my low rate of rationalization. I do actually catch myself beginning to rationalize from time to time, but there's always the undercurrent of "wishful thinking isn't real". No matter how hard I rationalize, I cannot make the evidence go away, so the rationalization process gives up quickly. I have been like this for most of my life, and have memories of the "wishful thinking isn't real" effect going all the way back to my early memories of childish daydreaming and complex storytelling.
Eugine_Nier
This seems wrong: rationalizing is what you do to inconvenient data instead of ignoring it.
torekp
Speaking for myself, I think that rationalizing does typically (always?) involve ignoring something. Not ignoring the first piece of inconvenient data, necessarily, but the horrible inelegance of my ad-hoc auxiliary hypotheses, or such.
NancyLebovitz
Another direction for measuring rationality might be how well people maintain their usual level under stress -- this is something which would be harder to find out in conversation.
TimS
Yeah. Even if one thinks that one never rationalizes, looking at political pundits is pretty strong evidence that some people rationalize a lot.

The cues people have for noticing their rationalizations are things they notice before they're done thinking. They have not rationalized; they had a thought that could lead to rationalization or a feeling they associate with rationalizing. And then they stopped. But there was a large enough time between when they started arguing for a conclusion and when they decided to think about it that they noticed their rationalization. Having a reflex to think about a question fast enough compared to the reflex to rationalize can cause someone to not notice their arg...

David_Gerard
I have had periods in my life where I would have been convinced I was thinking absolutely clearly, but in retrospect it was blitheringly obvious that I was rationalising like hell. People's subjective reports turn out to be unreliable, news at 11.
Mercurial
This is helpful. Thank you!

I've long since internalized "all stated reasons are post-hoc rationalizations", so I've been gradually losing my ability to pick out "rationalizations" in particular.

That is, when a human answers a query as to their reasons for something, they usually inspect their self-model to guess what course of action could have led to that outcome (as though predicting a future action). Some guesses are better than others, and we call the bad guesses "rationalizations".

ETA: I wrote this comment before noticing that the cases seem to b...

After reading the comments I noticed that I had at least two distinct mental processes that I'd been labeling "rationalization".

Process 1: Say I'm late for a meeting. I have noticed that, in thinking about saying "Sorry, I'm late", I immediately want to add an explanation for why this isn't my fault.

Process 2: Someone presents an argument for a conclusion I disagree with, and I immediately start looking for flaws in it/reasons to dismiss it. As I observed here, this is not necessarily even a fallacy.

I tend to agree that anyone who denies the tendency to rationalize is either in denial or has a different definition for the word "rationalize". In fact I would argue that rationalization is the default for human beings, and that anything else requires either focused effort or serious mental re-programming (which is still probably only partially effective).

One possible way to try to elicit an understanding for any given individual's capacity for rationalization is to ask them about the last time they did something they knew was a bad idea (perha...

I tend to agree that anyone who denies the tendency to rationalize is either in denial or has a different definition for the word "rationalize". In fact I would argue that rationalization is the default for human beings, and that anything else requires either focused effort or serious mental re-programming (which is still probably only partially effective).

I absolutely relate. I totally would have said that a week ago. Evidence has smashed my belief's face quite solidly in the nose, though.

One possible way to try to elicit an understanding for any given individual's capacity for rationalization is to ask them about the last time they did something they knew was a bad idea (perhaps a compromise they felt uncomfortable making, or an indulgence they knew they were going to regret), and then to ask them what excuses went through their brains to justify it. If someone still denies ever having had such an experience then they are beyond redemption.

That's a good idea, and we did it several times. They sincerely do deny having such experience, but not in a knee-jerk way. It's more like a, "Huh. Hmm. Um... Well, I honestly can't think of something quite like that, but maybe X is similar?" And "X" in this case is something like, "I knew eating a cookie wasn't good for me, but I felt like it and so I did it anyway." It's like the need for justification is just missing, at least in their self-reports.

This reminds me of a bit in The Righteous Mind, where Haidt discusses some of his experiments about moral reasoning. When he asked his university students questions like "is it right or wrong for a man to buy a (dead) chicken from a store and then have sex with it before eating it", the students had no problem providing a long list of various justifications pro or con, and generally ending up with an answer like “It’s perverted, but if it’s done in private, it’s his right”. In contrast, when Haidt went to a local McDonalds to ask working-class people the same questions, he tended to get odd looks when he asked them to explain why they thought that the chicken scenario was wrong.

Haidt puts this down to the working-class people having an additional set of moral intuitions, ones where e.g. acts violating someone's purity are considered just as self-evidently bad as acts causing somebody needless pain, and therefore denouncing them as wrong needs no explanation. But I wonder if there's also a component of providing explicit reasons for your actions or moral judgements being to some extent a cultural thing. If there are people who are never asked to provide justifications for ...

erratio
I'm like this for my trivial decisions but not for major ones. I virtually never rationalise eating choices; the choice is purely a conflict between deciding whether I'm going to do what I want vs what I ought. I do notice myself rationalising when making more long-term decisions and in arguments - if I'm unsure of a decision I'll sometimes make a list of pros and cons and catch myself trying to rig the outcome (which is an answer in itself, obviously). Or if I get into an argument I sometimes catch myself going into "arguments as soldiers" mode, which feels quite similar to rationalising. Anyway, my point for both is that for me at least, rationalisation only seems to pop up when the stakes are higher. If you gave me your earlier example about wanting to eat pizza and making excuses about calcium, I'd probably look at you as though you had 3 heads too.
Shephard
Evidence other than the repeated denials of the subjects in question and a non-systematic observation of them acting as largely rational people in most respects? (That's not meant to be rhetorical/mocking - I'm genuinely curious to know where the benefit of the doubt is coming from here.)

The problem here is that there is a kind of perfectly rational decision making that involves being aware of a detrimental consequence but coming to the conclusion that it's an acceptable cost. In fact that's what "rationalizing" pretends to be. With anything other than overt examples (heavy drug-addiction, beaten spouses staying in a marriage) the only person who can really make the call is the individual (or perhaps, as mentioned above, a close friend).

If these people do consider themselves rational, then maybe they would respond to existing psychological and neurological research that emphasizes how prone the mind is to rationalizing (I don't know of any specific studies off the top of my head but both Michael Shermer's "The Believing Brain" and Douglas Kenrick's "Sex, Murder, and the Meaning of Life" touch on this subject). At some point, an intelligent, skeptical person has to admit that the likelihood that they are the exception to the rule is slim.
Kaj_Sotala
Psychological research tends to be about the average or the typical case. If you e.g. ask the question "does this impulse elicit rationalization in people while another impulse doesn't", psychologists generally try to answer that by asking a question like "does this statistical test say that the rationalization scores in the 'rationalization elicitation condition' seem to come from a distribution with a higher mean than the rationalization scores in the control condition". Which means that you may (and AFAIK, generally do) have people in the rationalization elicitation condition who actually score lower on the rationalization test than some of the people in the control condition, but it's still considered valid to say that the experimental condition causes rationalization - since that's what seems to happen for most people. That's assuming that weird outliers aren't excluded from the analysis before it even gets started. Also, most samples are WEIRD and not very representative of the general population.
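A minimal sketch of that statistical point (Python, with made-up numbers; the score scale and group sizes are assumptions for illustration only):

```python
# Two hypothetical "rationalization score" samples: an elicitation condition with
# a higher mean, and a control condition. The group difference can be clearly
# significant even though many individuals in the two groups overlap.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(loc=5.0, scale=2.0, size=100)   # made-up control scores
elicited = rng.normal(loc=6.0, scale=2.0, size=100)  # made-up elicitation scores

t_stat, p_value = stats.ttest_ind(elicited, control)
print(f"means: control={control.mean():.2f}, elicited={elicited.mean():.2f}, "
      f"p={p_value:.4f}")

# Count elicitation-condition individuals who score below the control-group mean:
print("below control mean:", int((elicited < control.mean()).sum()))
```

So a claim like "the experimental condition causes rationalization" is about the distributions, and says little about any particular individual, including possible genuine non-rationalizers in the tails.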
Viliam_Bur
Thanks for this example -- now I can imagine what "never rationalizing" could be like. I did not realize there is a third option besides "rationalizing" and "always acting rationally", and I couldn't believe in people acting always rationally (at least not without proper training; but then they would remember what it was like before training). But the possibility of "acting irrationally but not inventing excuses for it" seems much more plausible.
David_Gerard
Sounds about right. This would be why science is hard for humans. We wouldn't bother if it didn't work.
Shmi

But in several of our test sessions for teaching rationality, a handful of people report never rationalizing and seem to have little clue what Tarski is for.

Don't you have exercises designed to catch people rationalizing? If not, you ought to, if yes, did you catch them rationalizing?

Mercurial
Getting people to rationalize during a session is actually quite a challenge. What we have are exercises meant to illustrate situations that people might find themselves in where rationalization is likely. And after a dozen or so examples, this particular subgroup - about 25% of our tested population so far! - just flat-out does not relate to any of the examples. However, one of them seemed to get "caught" by one example after a friend of theirs explicitly pointed out the analogy to their life. We haven't yet followed up on that case to explore more solidly whether it's really denial or if it was actually our misunderstanding and this person really doesn't rationalize.
Shmi
Presumably you can do it for other cognitive biases, so what's so special about this one?

Maybe you could give some examples of the sort of rationalizations you're referring to in your post, so we would better know how to answer your question? I think I might fall into this category, but I might not. I frequently think it would be a good idea for me to do something, but I don't do it and tell myself I lack the necessary psychological strength. Is this rationalizing? Also, I sometimes experience ugh fields around learning things that might be uncomfortable (in this sense a student might be afraid to see what score they got on a test).

I don't cla...

[anonymous]

I don't notice myself rationalising much at all.

My hypothesis is that I am rationalising and I have not picked up the skill to detect it. Which is confusing, because I regularly interrogate myself and look at my beliefs and such to find irrationalities, but I haven't found any.

Am I doing it wrong? Or am I unusually rational? Placing higher probability on doing it wrong feels like fake humility, but I think it's accurate.

I'm having a hard time remembering rationalizing bad decisions, but I'm having an easy time remembering rationalizing bad outcomes. That may be a useful dichotomy to explore.

I think this general phenomenon may have something to do with verbal thinking as suggested below, but I'm not sure that applies to my case. I think I came to terms with my id getting to make a lot of my decisions - and so the primary stated justification is something like "I desire to relax now" rather than "I deserve to relax now," and the superego is just outvoted ...

I feel like I can relate to that. It's not like I never rationalize, but I always know when I do it. Sometimes it may be pretty faint, but I'll still be aware of it. Whether I allow myself to proceed with justifying a false belief depends on the context. Sometimes admitting to being wrong just feels too uncomfortable, sometimes it is efficient to mislead people, and so on.

What is rationalization? To me, it feels like a lower-level, more primitive part of the brain recruiting the verbal centres in an attempt to persuade the higher-level part of the brain to do something short-sighted. Perhaps these people are unusually non-conflicted - for example, their lower levels may have a lower-than-usual time preference, or their higher levels may be too weak to get in the way in the first place.

(I keep wanting to say "id" and "super-ego" here despite knowing that Freud isn't scientific. Are there better terms?)

I'm kinda confused: when people say things like "I'm trying to give up chocolate. Last weekend I saw a delicious cake and I found myself telling myself the only reason I wanted it was to boost my energy levels, hahaha you know the feeling, right?" they don't really believe that, right? I mean, they know the entire time that they're breaking away from their 'ideal self' or 'should-be self' and just say things like that as a kind of mock-explanation to fulfill social expectations.

ETA: Whoa, typing this as a stream of thought didn't help me grasp how long the comment was becoming! ETA2: To be clear, I recognize the difference between not doing something and not being aware you're doing something.

I missed this thread when it was originally posted, but anyway...

I'm going to try something that has helped me in the past with these sorts of things. I'm going to write my thoughts as they occur. I've found this helps others peek into my mental state a bit.

Of all the examples of rationalization in this thread, I have no recollection of d...

I haven't remembered a dream in years. There are three that I have had in my life which I can recount even a bit of (all of which were nightmares, interestingly). I'm pretty sure that I have them all the time because I sometimes wake up with strange images in my head. But these images disappear very quickly and I can't tell someone what I was dreaming about even minutes after waking.

I notice that I sometimes catch myself rationalizing in simple ways, like offering some justification for a shortcoming that I have. But I notice also that I can only think of ...

Viliam_Bur
Experiment: Bring a pen and paper to your bed, and when you wake up, the first thing you do (seriously the first; a minute of delay can make a huge difference) write what you remember. If you don't remember the beginning, just quickly start writing from the part you remember. If any idea comes to your head during writing, just make a small note (two-three words) and continue writing. Do this every day, at least 5 days in sequence.

Why do I suggest this? My case may be different, but after I wake up and think about something else, I usually forget what my dream was, even forget that I had a dream at all. I would swear that I rarely dream, but when I did this experiment, I had a dream every night (and if I woke up many times during the night, there was a different dream each time). Without the writing I wouldn't even notice. Even the written record seems suspicious -- I read about a dream, and I remember "yeah, I had a dream like this maybe a month ago", then I look at the date and see it was yesterday! So my experience is that my memory is absolutely unreliable in this area.

Also, this may be a coincidence, but when I remember a dream, it is usually a bad dream, because it makes me think about it when I wake up.

EDIT: Now I realized a similar experiment with rationalization could be useful. :D
TheOtherDave
In my experience, a recorder works better than pen and paper... it takes long enough for me to get focused enough to write legibly that I lose stuff.
fburnaby
Yes, I agree that this seems like a good thing to try for both dreaming and rationalization! I've recently gotten myself a notebook for at home, just for doodling ideas about the things I'm reading. It might be a good idea just to try and expand that to dreaming, rationalization and other things too, just to see what comes out. To provide myself more reliable access to an "outside view" of myself.

"I have goofed" is hardly a rationalization, is it?

Or "I did this, because all the elephants are flying." is not making an excuse, IF you really believe that they are indeed flying - either. No matter that at least some elephants are not flying. You just have a wrong belief.

A rationalization (in the sense of "making excuses") is when you justify yourself with a knowingly wrong reason.

Would you call THIS comment "a rationalization"?

David_Gerard
As I noted in the previous thread, I can tell (sometimes) that I'm rationalising, even if my conclusion does turn out to be correct - it's a matter of arriving at a belief by a bad process. (In my case I get my polemicist on even though there's a little voice in my head noticing that my epistemology isn't quite justified. This is harder to notice because my output looks much the same those times I consider that I really do have my epistemological ducks in a row - I have to notice it while I'm doing it.)
semanticsfirst
Yeah. This pretty much describes my issue with common usages of the term "rationalize", as well as most of the comments on this board. It seems people here are calling "rationalizing" poor reasoning, and simultaneously calling it any justification provided for an action, belief, attitude, or conclusion which fulfills an emotional role, such as making them not feel guilty or helping them avoid acknowledging that they violated a value. Thing is... an emotional benefit from a justification doesn't mean the justification is false or even insincere. If "rationalizing" is a "bad process", then fine. Rationalizing is use of logical fallacies. Rationalizing is use of unsound premises. But that's a far cry from the way "rationalizing" tends to be used. "I will eat a piece of cake to raise my energy" is not a bad process either in terms of "pure reason" (classic logic) or in terms of unsound premises. It's just insincere.