
We Change Our Minds Less Often Than We Think

Post author: Eliezer_Yudkowsky | 03 October 2007 06:14PM | 37 points

"Over the past few years, we have discreetly approached colleagues faced with a choice between job offers, and asked them to estimate the probability that they will choose one job over another.  The average confidence in the predicted choice was a modest 66%, but only 1 of the 24 respondents chose the option to which he or she initially assigned a lower probability, yielding an overall accuracy rate of 96%."
       —Dale Griffin and Amos Tversky, "The Weighing of Evidence and the Determinants of Confidence," Cognitive Psychology 24 (1992): 411–435.

When I first read the words above—on August 1st, 2003, at around 3 o'clock in the afternoon—it changed the way I thought.  I realized that once I could guess what my answer would be—once I could assign a higher probability to deciding one way than the other—then I had, in all probability, already decided.  We change our minds less often than we think.  And most of the time we become able to guess what our answer will be within half a second of hearing the question.

How swiftly that unnoticed moment passes, when we can't yet guess what our answer will be; the tiny window of opportunity for intelligence to act.  In questions of choice, as in questions of fact.

The principle of the bottom line is that only the actual causes of your beliefs determine your effectiveness as a rationalist.  Once your belief is fixed, no amount of argument will alter the truth-value; once your decision is fixed, no amount of argument will alter the consequences.

You might think that you could arrive at a belief, or a decision, by non-rational means, and then try to justify it, and if you found you couldn't justify it, reject it.

But we change our minds less often—much less often—than we think.

I'm sure that you can think of at least one occasion in your life when you've changed your mind.  We all can.  How about all the occasions in your life when you didn't change your mind?  Are they as available, in your heuristic estimate of your competence?

Between hindsight bias, fake causality, positive bias, anchoring/priming, et cetera et cetera, and above all the dreaded confirmation bias, once an idea gets into your head, it's probably going to stay there.

 

Part of the Seeing With Fresh Eyes subsequence of How To Actually Change Your Mind

Next post: "Hold Off On Proposing Solutions"

Previous post: "How to Seem (and Be) Deep"

Comments (115)

Comment author: Doug_S. 03 October 2007 07:57:32PM 11 points

I hate changing my mind based on my parents' advice because I want to demonstrate that I'm capable of making good decisions on my own, especially since we seem to disagree on some fundamental values. Specifically, they love their jobs and put a moral value on productivity, while my goal in life is to "work" as little as possible and have as much "fun" as possible.

Comment author: notsonewuser 04 September 2013 02:59:19PM 1 point

I hate changing my mind based on my parents' advice because I want to demonstrate that I'm capable of making good decisions on my own...

Eliezer never said to change your mind based on wrong advice! However, if you feel as if you should be following your parents' advice, perhaps you should question exactly how capable you really are (at the moment).

Comment author: Felix2 03 October 2007 10:54:59PM 1 point

Does this mean that if we cannot remember ever changing our minds, our minds are very good at removing clutter?

Or, consider a question that you've not made up your mind on: Does this mean that you're most likely to never make up your mind?

And, anyway, in light of those earlier posts concerning how well people estimate numeric probabilities, should it be any wonder that 66% = 96%?

Comment author: DanielLC 05 September 2010 06:53:15PM 4 points

Don't they normally make them more certain? Like, if they're 96% sure, there's a 66% chance that they're right, rather than the other way around?

Comment author: Adirian 03 October 2007 11:06:11PM 0 points

Not to argue, but to point out, that this is not necessarily a bad thing. It depends entirely on the basis of one's conclusion. Gut instincts are quite often correct about things we have no conscious evidence for - because our unconscious does have pretty good evidence filters. Which is one of the reasons I suggested rationalization is not necessarily a bad thing, as it can be used to construct a possible rational basis for conceptualizations developed without conscious thought, thus permitting us to judge the merit of those ideas.

Comment author: Constant2 03 October 2007 11:58:46PM 2 points

Here is one way to change your mind. Think through something carefully, relying on strong connections. You may at some point walk right into a conclusion that contradicts a previous opinion. At this point something will give. The strength of this method is that it is strengthened by the very attachment to your ideas that it undermines. The more stubborn you are, the harder you push against your own stubbornness.

Comment author: Senthil 04 October 2007 04:09:45AM 0 points

I agree with Adirian that not changing our minds is not necessarily a bad thing.

The problem, I guess, like with most things, is that we can't be sure which way to go. Gut feelings are often quite correct. But how do we know when we are having a bias which is not good for us, and when it's a gut feeling? Gut feelings inherently can't be questioned; biases need to be kept in check.

If we run through the standard biases and logical fallacies like a checklist and what we think doesn't fall under any of them, we can go with our gut instinct. Otherwise, give whatever we have in mind a second thought. What we do may not be foolproof, but it at least takes us in a direction which makes changing our minds, when required, a less painful process.

Comment author: michael_vassar3 04 October 2007 05:18:29AM 8 points

It probably doesn't help to live in a society where changing one's positions in response to evidence is considered "waffling", and is considered to show a lack of conviction.

Divorce is a lot more common than 4%, so people do admit mistakes when given enough evidence.

Comment author: Viliam_Bur 29 October 2011 11:38:34AM 2 points

Changing your mind or "updating" is not necessarily a sign of rationality. You could also update for wrong reasons.

For example, a divorce can happen when a person has unrealistic expectations of marriage. Updating their beliefs about their partner would be just a side effect of refusing to update their beliefs about marriage.

Also, in some cases, the divorce could have been planned from the beginning (for example, for financial gain), so it actually did not involve a change of mind.

Comment author: Peterdjones 29 October 2011 01:56:46PM 4 points

I think the embargo on mind-changing is a special case for politicians: after all, if they say one thing on the hustings and then do another in office, that makes a mockery of democracy. However, if it is applied to non-politicians, that would be fallacious.

Comment author: sparkles 17 February 2013 07:12:07PM 2 points

If they say one thing and intend to do another, sure - but if they actually update? That may be bad PR, but I don't think it's undemocratic.

Comment author: Peterdjones 27 February 2013 11:57:04PM -2 points

If you can't rely on politicians to do something like what they said they were going to, what's the point in voting? Ideally, a politician who has a change of heart should stand for re-election.

Comment author: wedrifid 28 February 2013 07:33:39PM 2 points

If you can't rely on politicians to do something like what they said they were going to, what's the point in voting?

You could have a prediction about what they respectively will do and have a preference over those outcomes.

Comment author: Peterdjones 03 March 2013 02:14:56PM 0 points

So if they ruin the economy, and I successfully predict that, I smile and collect my winnings?

Comment author: Kindly 03 March 2013 06:26:49PM 1 point

Presumably if you can predict that Candidate A will ruin the economy, then you vote for Candidate B instead.

Unless you can think of a way of winning by having advance knowledge that the economy will be ruined, one which will net you greater gain than having an un-ruined economy would. Then you may selfishly vote for Candidate A.

I'm ignoring here the question of how much your opinion influences the outcome of the election, of course. Also if you end up predicting that all the candidates will ruin the economy equally, you don't have much of a decision to make.

Comment author: Peterdjones 03 March 2013 06:36:12PM -1 points

Presumably if you can predict that Candidate A will ruin the economy, then you vote for Candidate B instead.

I can only predict what will happen on the basis that a) their policies will have a certain effect and b) they will actually implement their policies. Which gets back to the original point: if they are not going to do what they say, what is the point of voting?

Comment author: Kindly 03 March 2013 08:30:47PM 0 points

I think I agree. I also think wedrifid wanted to talk about predictions of what the candidates will actually do, even if they are not guaranteed not to change their minds.

This doesn't seem impossible, just harder. You'd have to make a guess as to how likely the candidates are to implement a different policy from the one they promised, as well as the effect the possible policies will have.

The candidates do have an incentive to signal that they are unlikely to "waffle". If you are relatively certain to implement your policies, then at least those who agree with you will predict that you'll have a good effect. If you look like you might change your mind, even your supporters might decide to take a different option, because who knows what you will do?

In theory, you might gain a bigger advantage by somehow signaling that you will change your mind for good reasons. Then if new information comes up in the future, you're a better choice than anyone who promises not to change their mind at all. But this is trickier and less convincing.

Comment author: wedrifid 04 March 2013 04:33:38AM 0 points

I can only predict what will happen on the basis that a) their policies will have a certain effect and b) they will actually implement their policies.

That seems to be a significant limitation.

Which gets back to the original point: if they are not going to do what they say, what is the point of voting?

Fortunately, not everybody has said limitation.

Comment author: wedrifid 04 March 2013 03:27:59AM 2 points

So if they ruin the economy, and I successfully predict that, I smile and collect my winnings?

Both candidates being likely to successfully manage to ruin the economy is a problem quite distinct from politicians lying.

Comment author: Izeinwinter 03 March 2013 07:10:40PM 0 points

You misrepresent democracy very badly in the above post. Politicians are not agents of the voters; they are representatives of them, appointed by, and accountable to, the demos, but not a mirror of it. They are not supposed to enact the policies voters thought appropriate 2 years ago at the polls, or whatever polls well today. They are supposed to do what the voters would want done if they had time to research the issue and give it some thought, incorporating all data about the present situation. If policy were supposed to reflect the averaged will of the people, politicians would be entirely redundant and we could just do lawmaking by popular initiative.

Comment author: Peterdjones 03 March 2013 07:23:11PM 1 point

Of course it is unworkable for politicians to stick rigidly to their manifestos. It is also unworkable for them to discard their manifestos on day one.

Comment author: Tony 04 October 2007 02:53:19PM 5 points

I wonder if the act of answering the question actually causes the decision to firm up. Kind of the OvercomingBias Uncertainty Principle.

Comment author: Robin_Hanson2 04 October 2007 03:03:24PM 0 points

It is nice to have a clear example of where people are consistently underconfident. Are there others? Michael, good point about divorce.

Comment author: Richard_Hollerith 04 October 2007 03:11:59PM 0 points

I second Robin's question.

Comment author: Richard_Hollerith 04 October 2007 04:03:58PM 0 points

I'd also like to learn whether the experimental finding holds for a wide variety of decisions. (Eliezer mentioned only picking a job offer.)

Comment author: Senthil 04 October 2007 05:10:18PM 0 points

Aren't people consistently underconfident when it comes to their money? Everybody does something, invests in something, but isn't really sure about it even after they've done it. It's at its most extreme when it comes to the stock market.

Another instance is when people approach members of the opposite sex who they think are attractive. They consistently underestimate themselves.

Otherwise it depends on what they're used to; for instance, people in technology are underconfident when it comes to negotiation and so forth.

Comment author: Rick_Smith 04 October 2007 05:11:39PM 3 points

In the case of divorce, the reasons cannot always be taken as evidence that the marriage was a mistake to begin with.

Things happen and people change.

Comment author: The_Decision_Strategist 04 October 2007 05:14:42PM 0 points

This is an interesting idea and doesn't surprise me given thin-slicing behavior and the like. But the research itself seems a little thin. Where is the actual testing versus a control group? What about other decisions that don't involve jobs?

Also, I think we probably know what we will choose 99% of the time because we make the decision instantaneously. The real question is whether we do this even on decisions where we don't consciously know what we are going to choose. Are we as accurate in those decisions?

Comment author: Eliezer_Yudkowsky 05 October 2007 04:06:50PM 5 points

It is nice to have a clear example of where people are consistently underconfident. Are there others?

People tend to take into account the magnitude of evidence (how extreme is the value?) while ignoring its reliability, and they also tend to be bad at combining multiple pieces of evidence. So another good way to generate underconfidence is to give people lots of small pieces of reliable evidence. (I believe it's in the same paper, "The Weighing of Evidence and the Determinants of Confidence".)
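
A rough sketch of why this produces underconfidence, assuming independent pieces of evidence (the numbers below are invented for illustration, not taken from the paper): in log-odds form, Bayes' rule adds up the contribution of each piece, so many individually weak likelihood ratios can justify near-certainty, while someone who anchors on the strength of a single typical piece will be far too modest about the combination.

    import math

    # Minimal sketch: combining many weak, independent pieces of evidence
    # via Bayes' rule in log-odds form. Each piece only favors hypothesis H
    # by modest 3:2 odds, but the log-odds contributions add.
    prior_odds = 1.0          # 50/50 prior between H and not-H
    likelihood_ratio = 1.5    # each piece favors H by 3:2
    n_pieces = 10

    log_odds = math.log(prior_odds) + n_pieces * math.log(likelihood_ratio)
    posterior = 1 / (1 + math.exp(-log_odds))

    print(f"One piece alone:   P(H) = {1.5 / 2.5:.2f}")   # 0.60
    print(f"All ten combined:  P(H) = {posterior:.2f}")   # ~0.98

On these toy numbers, judging by the feel of a single piece suggests about 60% confidence, while the combined evidence justifies about 98%: the same direction as the 66%-versus-96% gap in the quoted study.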

Comment author: bloix 07 October 2007 02:28:21AM 7 points

I recall having an argument over dinner with a friendly acquaintance about an unimportant but interesting problem. I thought about it for a few days and decided he was right. I've hated him ever since.

Comment author: DanielLC 05 September 2010 06:55:05PM 5 points

And now we're curious. What was the problem?

Comment author: Marius_Gedminas 08 June 2008 01:16:16AM 2 points

Are you they as available, in your heuristic estimate of your competence?

I'm unable to parse this sentence.

Comment author: bigjeff5 15 February 2011 06:04:21AM 2 points

Drop the "you" and see the linked "Availability Heuristic".

Comment author: suecochran 10 April 2011 10:41:08PM 3 points

I used to have a button that said "If you haven't changed your mind lately, how do you know you've still got one?" I really liked that sentiment.

It's very easy to get comfortable with our opinions and beliefs, and uncomfortable about any challenge to them. As I've posted elsewhere, we often identify our "selves" with our "beliefs", as if they "were" us. Once we can separate our idea of "self" as different from "that which our self currently believes", it becomes easier to entertain other thoughts, and challenges from others, to our beliefs and opinions. If we are comfortable and secure in our own selves, then we can discuss dispassionately the ideas that contradict what we have previously held to be true. It is the only way that we can learn, that we can take in new and different ideas without that being a blow to our ego. Identifying our selves with our thoughts, opinions, beliefs, blocks us, threatens us, so that we get stuck with our old ways of doing things and framing things, and we don't grow and change with ease.

Comment author: Martok 08 April 2012 10:42:08PM 4 points

A lot of people probably already know this, it's a familiar "deep wisdom", but anyway: you can use this not-changing of your mind to help you with seemingly complicated decisions that you ponder over for days. Simply assign the possible answers to the sides of a coin and flip it (or roll a die, if you need more than two). It doesn't matter what the result is; depending on whether it matches your already-made decision, you will either immediately reject the coin's "answer" or not. That tells you what your first decision was, unclouded by any attempts to justify the other option(s).

Now, if you've trained your intuition (aka have the right set of Cached Thoughts), that answer will be the correct or better one. Or, as has happened to me more than once, you realize that both alternatives are actually wrong and your mind already came up with a better solution.

Comment author: DaFranker 01 August 2012 04:08:59PM 0 points

Without knowing the terms or technical explanation for it, this is what I have always been doing automatically for as long as I can remember making decisions consciously (generously applying a confidence margin and overconfidence moderation proportional to the applicable biases). However, upon reading the sequences here, I realize that several problems I have identified in my thought strategies actually stem from my reliance on training my intuition and subconscious for what I now know to be simply better Cached Thoughts.

It turns out that no matter how well you organize and train your Caches and other automatic thinking, belief-forming and decision-making processes, some structural human biases are virtually impossible to eliminate by strictly relying on this method. What's more, by having relied on this for so long, I find myself having even more difficulty training my mind to think better.

Comment author: PerennialChild 15 June 2012 04:46:20PM 2 points

That's true. Matters are not helped by the value society places on commitment and consistency. When we do, in fact, change our minds, we are more often than not labeled as "wishy-washy," or some similarly derogatory term.

Comment author: ictoan 05 November 2012 06:54:13PM -1 points

This article reminds me of the movie "Inception"... once an idea is planted it is hard to get it out.

Comment author: DiamondSoul 09 July 2014 04:04:09PM 0 points

As Eliezer says, on short time scales (days, weeks, months) we change our minds less often than we expect to. However, it's worth noting that, on larger time scales (years, decades) the opposite seems to be true. Also, our emotional state changes more frequently than we expect it to, even on short time scales. I can't seem to recall my exact source on this second point at the moment (I think it was some video we watched in my high school psychology class), though, anecdotally, I've observed it to be true in my own life. Like, when I'm feeling good, I may think thoughts like "I'm a generally happy person", or "my current lifestyle is working very well, and I should not change it", which are falsifiable claims/predictions that are based on the highly questionable assumption that my current emotional state will persist into both the near and distant future. Similarly, I may think the negations of such thoughts when I'm feeling bad. As a result, I have to remind myself to be extra skeptical/critical of falsifiable claims/predictions that agree too strongly with my current emotional state.