I discovered that one of my friends has something similar - perhaps the same thing - going on in her brain, such that she doesn't rationalise. What we managed to sort out, sort of, was that anything counts as a justification for her. When she doesn't eat cookies because it would make her gain weight, when she doesn't like Brad Pitt "because he's ugly", when she doesn't like a book series because it's chauvinistic, and when she "doesn't like babies" but her friend's baby "is an exception because it's [friend]'s" - these all feel like the same thing to her. She can't or won't tell the difference between what I see as a strong reason, a weak reason, or a made-up flimsy reason.
A wild theory appears! In probably the deepest moment of introspection for her in that discussion, she said she thinks she might be like this because it gives her 100% confidence in whatever she's doing. Thinking on that, I'm reminded of the "70% blue, 30% red balls in the urn" game, where some human guessers approximate a 7:3 ratio of blue/red guesses even though the best strategy is to guess blue every time: ratio-matching is right only about 58% of the time (0.7 × 0.7 + 0.3 × 0.3), while always guessing blue is right 70% of the time. There might be two kinds of people in this...
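Here's a minimal simulation sketch of that comparison, if anyone wants to check the arithmetic (the urn probability and trial count are just assumptions for illustration):

```python
import random

P_BLUE = 0.7        # assumed urn composition: 70% blue, 30% red
N_TRIALS = 100_000  # assumed number of draws

def draw():
    # One ball drawn from the urn.
    return "blue" if random.random() < P_BLUE else "red"

def probability_matcher():
    # Guesses blue 70% of the time, mirroring the urn's proportions.
    return "blue" if random.random() < P_BLUE else "red"

def maximizer():
    # Always guesses the majority color.
    return "blue"

matcher_hits = sum(probability_matcher() == draw() for _ in range(N_TRIALS))
maximizer_hits = sum(maximizer() == draw() for _ in range(N_TRIALS))

print(f"probability matching: {matcher_hits / N_TRIALS:.3f}")   # ~0.58 = 0.7*0.7 + 0.3*0.3
print(f"always guess blue:    {maximizer_hits / N_TRIALS:.3f}")  # ~0.70
```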
Since rationalizations are usually employed to repair cognitive dissonance, and cognitive dissonance is strongest when image preservation is necessary, one hypothesis would simply be that these people have self-images that don't need much preserving.
Possible test: do these people have an unusually high tolerance for situations and/or self-disclosures that most people would find shameful, humiliating or embarrassing? This might explain a lack of need to rationalize, regardless of the reason for the high tolerance. (For example, we should expect sociopaths not to see any need to rationalize their actions to themselves.)
[Edit to add: not to imply that any of the people in your experiences are sociopaths; just noting that it's another situation where somebody would have a low need for self-image-preserving rationalizations.]
In response to the folks suggesting that our questions were just unclear, etc.:
I notice rationalization all the time too (in myself and in others); but there totally seem to be people who don't ever notice it in themselves. Lots of them. Including both folks who seem never to have trained in rationality-type stuff at all, and folks who have. I ignored my first counter-example, and my second, but not my third and fourth; especially after the fourth counter-example kindly allowed us to cross-examine them for some hours, to go try accosting strangers with weird questions and see if they noticed themselves rationalizing while approaching said strangers, etc.
Mercurial and Eliezer both suggested an analogy to the "thinking in words" vs. "thinking in images" thing: some people do one and others do the other, and many tend to assume that everyone must experience life the way they do. We all updated toward thinking that there is some actual thing going on here - something we were initially not modeling.
But, I'm still confused about:
It seems introspectively plausible that when you ask people questions like "how do you know when you're rationalizing", they feel like they've been asked a "when did you stop beating your wife" question, and feel initially tempted to react with an "oh yeah, well maybe I don't" regardless of whether it's true.
I have a different way to look at this question.

1. Introspection is bunk.
2. If someone asks us, or we ask ourselves, why we did something, the answer is a guess, because we have no conscious access to the actual causes of our thoughts and actions.
3. We vary in how good we are at guessing and in how honestly we judge ourselves, so some people appear to be clearly rationalizing and others appear less so.
4. Most people are not actually aware that introspection is not direct knowledge but guesswork, so they do not recognize their guesses as guesses, but they may notice their self-deceptions as deceptions.
5. We do not need to know the reasons for our actions unless we judge them as very bad and to be avoided, or very good and to be encouraged.
6. The appropriate thing in this case is not to ask ourselves why, but to ask ourselves how to change the likelihood of a repeat, up or down. Although we have only guesses about past actions, we can arrange to have some control over future ones.
7. The more we know about ourselves, others, our situations, science and so on, the better we can answer the how questions.
For anosognosia to be a common failure mode, and for the split-brain patient's peculiar left-side behaviour to occur, rationalization has to be a common mode of thought. Perhaps there's a module in the brain that prepares speech but has very little impact on beliefs and decisions. I ask you why your hand is not scratching your back, and it says "because my back is not itching" - which happens to be correct, but it would say the same if the arm were paralysed and the back were itching and it wasn't 'told' (by the part doing the actual thinking) that the arm was paralysed.
When you say you aren't rationalizing, perhaps that module still works by rationalization; its output just happens to be quite plausible. Maybe that's how construction of sentences works in the first place, when talking about nearly anything.
Some of these people show signs of being rather high-level rationalists overall, although some don't.
I wouldn't necessarily expect there to be a super-strong connection between not rationalizing and being a "high-level rationalist". There are other ways to go systematically wrong than through goal-directed rationalization. As a possibly overlapping point, your concept of "high-level rationalist" probably sneaks in things like intelligence and knowledge that aren't strictly rationality.
I don't think I rationalize to any significant extent. Even the examples I came up with for Anna's thread concern inefficient allocation of attention and using zero-information arguments, not something specifically directed at defending a position. I admit being wrong or confused on simple things, sometimes incorrectly (so that I have to go back and embrace a momentarily-rejected position). It's possible I'm completely incapable of noticing rationalization and would need a new basic skill to fix that, but that doesn't seem very likely.
(Alternatively, perhaps "rationalization" needs to be unpacked a bit, so that problems like those in the examples I referred to above can find a place in that notion. As it is, they seem more like flaws in understanding that are unbiased with respect to a favored conclusion, unless that conclusion is selected in hindsight.)
Any volunteers to go through Vladimir_Nesov's comments on LW and point out his rationalizations to him?
That could actually be quite helpful. No offense to Vladimir; we're just sincerely curious about this phenomenon, and if he's really a case of someone who doesn't relate to Tarski or rationalization, then it'd be helpful to have good evidence one way or the other about whether he rationalizes.
I feel that I avoid most rationalizing simply by being very comfortable, perhaps too comfortable, with the possibility that my beliefs may be wrong and my decisions suboptimal. Is the design I came up with for my work project any good? Maybe not. Have I made the right career choices? Probably not. Am I living in the right city? Maybe, maybe not. Is my current life plan going to lead me to happiness? Doubtful.
One data point: I notice myself rationalizing, or starting to rationalize, many times a week.
I might task inexpensive virtual assistants (from third-world countries) with finding YouTube clips of people rationalizing on TV (the easiest candidates are probably Fox News people, politicians, etc.)
In a word: compartmentalization.
Since that's not helpful, I will say that it doesn't even seem to be possible for there to be people who don't rationalize. (Or enough that you're at all likely to find them.)
Some of these people show signs of being rather
"Some", "signs", "rather". These words all show signs of being rather belief in belief. I notice you don't say, "Some of these people are high-level rationalists," just that they show warning signs of being so. What does this really mean? Are you referring to the ...
I will say that it doesn't even seem to be possible for there to be people who don't rationalize. (Or enough that you're at all likely to find them.)
You'd think not. Yet even Eliezer seems to think that one of our case studies really, truly might not ever rationalize and possibly never has before. This seems to be a case of a beautiful, sane theory beaten to death by a small gang of brutal facts.
"Some", "signs", "rather". These words all show signs of being rather belief in belief. I notice you don't say, "Some of these people are high-level rationalists," just that they show warning signs of being so. What does this really mean?
It means that I don't know how to measure how strong someone's rationality skills are other than talking to others whom I intuitively want to say are good rationalists and comparing notes. So I'm hedging my assertions. But to whatever degree several people at the Singularity Institute are able to figure out who is or is not a reasonably good rationalist, some of our sample "non-rationalizers" appear to us to be good rationalists, and some appear not to be so.
...Also, could you explain what you mean by
Have you actually tested them for rationalizing? My own belief is that you're more likely to run into someone who rationalizes so much they are blind to their own rationalizing (and so can't recall any) than someone who is inhumanly honest.
(Tests in this case would include checking for hindsight bias, which is classic rationalizing, and having them do that test on YourMorals - whose name I forget - where you're given two summaries of studies for and against gun control and asked to criticize them, usually revealing an imbalance towards your favored side. But you're a LWer; I'm sure you can think of other tests.)
The cues people have for noticing their rationalizations are things they notice before they're done thinking. They have not rationalized; they had a thought that could lead to rationalization, or a feeling they associate with rationalizing. And then they stopped. But there was a large enough gap between when they started arguing for a conclusion and when they decided to think about it that they noticed their rationalization. Having a reflex to think about a question that is fast enough compared to the reflex to rationalize can cause someone to not notice their arg...
I've long since internalized "all stated reasons are post-hoc rationalizations", so I've been gradually losing my ability to pick out "rationalizations" in particular.
That is, when a human answers a query as to their reasons for something, they usually inspect their self-model to guess what course of action could have led to that outcome (as though predicting a future action). Some guesses are better than others, and we call the bad guesses "rationalizations".
ETA: I wrote this comment before noticing that the cases seem to b...
After reading the comments I noticed that I had at least two distinct mental processes that I'd been labeling "rationalization".
Process 1: Say I'm late for a meeting. I have noticed that, in thinking about saying "Sorry, I'm late", I immediately want to add an explanation for why this isn't my fault.
Process 2: Someone presents an argument for a conclusion I disagree with, and I immediately start looking for flaws in it/reasons to dismiss it. As I observed here, this is not necessarily even a fallacy.
I tend to agree that anyone who denies the tendency to rationalize is either in denial or has a different definition for the word "rationalize". In fact I would argue that rationalization is the default for human beings, and that anything else requires either focused effort or serious mental re-programming (which is still probably only partially effective).
I absolutely relate. I totally would have said that a week ago. Evidence has smashed my belief's face quite solidly in the nose, though.
One possible way to try to elicit an understanding of any given individual's capacity for rationalization is to ask them about the last time they did something they knew was a bad idea (perhaps a compromise they felt uncomfortable making, or an indulgence they knew they were going to regret), and then to ask them what excuses went through their brains to justify it. If someone still denies ever having had such an experience then they are beyond redemption.
That's a good idea, and we did it several times. They sincerely do deny having such experience, but not in a knee-jerk way. It's more like a, "Huh. Hmm. Um... Well, I honestly can't think of something quite like that, but maybe X is similar?" And "X" in this case is something like, "I knew eating a cookie wasn't good for me, but I felt like it and so I did it anyway." It's like the need for justification is just missing, at least in their self-reports.
This reminds me of a bit in The Righteous Mind, where Haidt discusses some of his experiments about moral reasoning. When he asked his university students questions like "is it right or wrong for a man to buy a (dead) chicken from a store and then have sex with it before eating it", the students had no problem providing a long list of various justifications pro or con, and generally ending up with an answer like “It’s perverted, but if it’s done in private, it’s his right”. In contrast, when Haidt went to a local McDonalds to ask working-class people the same questions, he tended to get odd looks when he asked them to explain why they thought that the chicken scenario was wrong.
Haidt puts this down to the working-class people having an additional set of moral intuitions, ones where e.g. acts violating someone's purity are considered just as self-evidently bad as acts causing somebody needless pain, and therefore denouncing them as wrong needs no explanation. But I wonder if there's also a component of providing explicit reasons for your actions or moral judgements being to some extent a cultural thing. If there are people who are never asked to provide justifications for ...
But in several of our test sessions for teaching rationality, a handful of people report never rationalizing and seem to have little clue what Tarski is for.
Don't you have exercises designed to catch people rationalizing? If not, you ought to, if yes, did you catch them rationalizing?
Maybe you could give some examples of the sort of rationalizations you're referring to in your post, so we would better know how to answer your question? I think I might fall into this category, but I might not. I frequently think it would be a good idea for me to do something, but I don't do it and tell myself I lack the necessary psychological strength. Is this rationalizing? Also, I sometimes experience ugh fields around learning things that might be uncomfortable (in the sense that a student might be afraid to see what score they got on a test).
I don't cla...
I don't notice myself rationalising much at all.
My hypothesis is that I am rationalising and I have not picked up the skill to detect it. Which is confusing, because I regularly interrogate myself and look at my beliefs and such to find irrationalities, but I haven't found any.
Am I doing it wrong? Or am I unusually rational? Placing higher probability on doing it wrong feels like fake humility, but I think it's accurate.
I'm having a hard time remembering rationalizing bad decisions, but I'm having an easy time remembering rationalizing bad outcomes. That may be a useful dichotomy to explore.
I think this general phenomenon may have something to do with verbal thinking as suggested below, but I'm not sure that applies to my case. I think I came to terms with my id getting to make a lot of my decisions - and so the primary stated justification is something like "I desire to relax now" rather than "I deserve to relax now," and the superego is just outvoted ...
I feel like I can relate to that. It's not like I never rationalize, but I always know when I do it. Sometimes it may be pretty faint, but I'll still be aware of it. Whether I allow myself to proceed with justifying a false belief depends on the context. Sometimes it just feels too uncomfortable to admit to being wrong, sometimes it is efficient to mislead people, and so on.
What is rationalization? To me, it feels like a lower-level, more primitive part of the brain recruiting the verbal centres in an attempt to persuade the higher-level part of the brain to do something short-sighted. Perhaps these people are unusually non-conflicted - for example, their lower levels may have a lower-than-usual time preference, or their higher levels may be too weak to get in the way in the first place.
(I keep wanting to say "id" and "super-ego" here despite knowing that Freud isn't scientific. Are there better terms?).
I'm kinda confused. When people say things like "I'm trying to give up chocolate. Last weekend I saw a delicious cake and I found myself telling myself the only reason I wanted it was to boost my energy levels, hahaha you know the feeling, right?" they don't really believe that, right? I mean, they know the entire time they're breaking away from their 'ideal self' or 'should-be self' and just say things like that as a kind of mock-explanation to fulfill social expectations.
ETA: Whoa, typing this as a stream of thought didn't help me grasp how long the comment was becoming! ETA2: To be clear, I recognize the difference between not doing something and not being aware you're doing something.
I missed this thread when it was originally posted, but anyway...
I'm going to try something that has helped me in the past with these sorts of things. I'm going to write my thoughts as they occur. I've found this helps others peek into my mental state a bit.
Of all the examples of rationalization in this thread, I have no recollection of d...
I haven't remembered a dream in years. There are three that I have had in my life which I can recount even a bit of (all of which were nightmares, interestingly). I'm pretty sure that I have them all the time because I sometimes wake up with strange images in my head. But these images disappear very quickly and I can't tell someone what I was dreaming about even minutes after waking.
I notice that I sometimes catch myself rationalizing in simple ways, like offering some justification for a shortcoming that I have. But I notice also that I can only think of ...
"I have goofed" is hardly a rationalization, is it?
Or "I did this, because all the elephants are flying." is not making an excuse, IF you really believe that they are indeed flying - either. No matter that at least some elephants are not flying. You just have a wrong belief.
A rationalization is (in the sense of "making excuses"), when you are rationalizing with a knowingly wrong reason.
Would you call THIS comment "a rationalization"?
Anna Salamon and I are confused. Both of us notice ourselves rationalizing on pretty much a daily basis and have to apply techniques like the Litany of Tarski pretty regularly. But in several of our test sessions for teaching rationality, a handful of people report never rationalizing and seem to have little clue what Tarski is for. They don't relate to any examples we give, whether fictitious or actual personal examples from our lives. Some of these people show signs of being rather high-level rationalists overall, although some don't.