I hate changing my mind based on my parents' advice because I want to demonstrate that I'm capable of making good decisions on my own, especially since we seem to disagree on some fundamental values. Specifically, they love their jobs and put a moral value on productivity, while my goal in life is to "work" as little as possible and have as much "fun" as possible.
Does this mean that if we cannot remember ever changing our minds, our minds are very good at removing clutter?
Or, consider a question that you've not made up your mind on: Does this mean that you're most likely to never make up your mind?
And, anyway, in light of those earlier posts concerning how well people estimate numeric probabilities, should it be any wonder that 66% = 96%?
Not to argue, but to point out that this is not necessarily a bad thing. It depends entirely on the basis of one's conclusion. Gut instincts are quite often correct about things we have no conscious evidence for, because our unconscious does have pretty good evidence filters. Which is one of the reasons I suggested rationalization is not necessarily a bad thing: it can be used to construct a possible rational basis for conceptualizations developed without conscious thought, thus permitting us to judge the merit of those ideas.
Here is one way to change your mind. Think through something carefully, relying on strong connections. You may at some point walk right into a conclusion that contradicts a previous opinion. At this point something will give. The strength of this method is that it is strengthened by the very attachment to your ideas that it undermines. The more stubborn you are, the harder you push against your own stubbornness.
I agree with Adirian that not changing our minds is not necessarily a bad thing.
The problem, I guess, as with most things, is that we can't be sure which way to go. Gut feelings are often quite correct. But how do we know when we are having a bias that is not good for us and when it's a gut feeling? Gut feelings aren't inherently questionable; biases need to be kept in check.
If we run through the standard biases and logical fallacies like a checklist, and what we think doesn't fall into any of them, we can go with our gut instinct. Else, give whatever we have in...
It probably doesn't help to live in a society where changing one's positions in response to evidence is considered "waffling", and is considered to show a lack of conviction.
Divorce is a lot more common than 4%, so people do admit mistakes when given enough evidence.
I wonder if the act of answering the question actually causes the decision to firm up. Kind of like an OvercomingBias Uncertainty Principle.
It is nice to have a clear example of where people are consistently underconfident. Are there others? Michael, good point about divorce.
I'd also like to learn whether the experimental finding holds for a wide variety of decisions. (Eliezer mentioned only picking a job offer.)
Aren't people consistently underconfident when it comes to their money? Everybody does something, invests in something, but isn't really sure about it even after they've done it. It's at its most extreme when it comes to the stock market.
Another instance is when people approach members of the opposite sex whom they find attractive. They consistently underestimate themselves.
Otherwise it depends on what they're used to; people in technology, for instance, are underconfident when it comes to negotiation and so forth.
In the case of divorce, the reasons cannot always be taken as evidence that the marriage was a mistake to begin with.
Things happen and people change.
This is an interesting idea and doesn't surprise me given thin-slicing behavior and the like. But the research itself seems a little thin. Where is the actual testing versus a control group? What about other decisions that don't involve jobs?
Also, I think we probably know what we will choose 99% of the time because we make the decision instantaneously. The real question is whether we do this even on decisions where we don't consciously know what we are going to choose. Are we as accurate in those decisions?
It is nice to have a clear example of where people are consistently underconfident. Are there others?
People tend to take into account the magnitude of evidence (how extreme is the value?) while ignoring its reliability, and they also tend to be bad at combining multiple pieces of evidence. So another good way to generate underconfidence is to give people lots of small pieces of reliable evidence. (I believe it's in the same paper, "The Weighing of Evidence and the Determinants of Confidence".)
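To make that concrete with a toy calculation (my numbers, not Griffin and Tversky's): under a naive Bayes model, independent pieces of evidence combine by multiplying likelihood ratios, so ten individually weak but reliable items, each favoring a hypothesis H at only 2:1, take a 1:1 prior to

\[
\frac{P(H \mid e_1,\ldots,e_{10})}{P(\lnot H \mid e_1,\ldots,e_{10})}
= \frac{P(H)}{P(\lnot H)} \prod_{i=1}^{10} \frac{P(e_i \mid H)}{P(e_i \mid \lnot H)}
= 1 \times 2^{10} = 1024,
\]

i.e. \(P(H \mid e_1,\ldots,e_{10}) = 1024/1025 \approx 0.999\). That is far more confidence than any single 2:1 item feels like it warrants, which is exactly the situation in which people stay underconfident.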
I recall having an argument over dinner with a friendly acquaintance about an unimportant but interesting problem. I thought about it for a few days and decided he was right. I've hated him ever since.
Are they as available, in your heuristic estimate of your competence?
I'm unable to parse this sentence.
I used to have a button that said "If you haven't changed your mind lately, how do you know you've still got one?" I really liked that sentiment.
It's very easy to get comfortable with our opinions and beliefs, and uncomfortable about any challenge to them. As I've posted elsewhere, we often identify our "selves" with our "beliefs", as if they "were" us. Once we can separate our idea of "self" as different from "that which our self currently believes", it becomes easier to entertain other thoughts...
A lot of people probably already know that, it's a familiar "deep wisdom", but anyway: you can use this not-changing of your mind to help you with seemingly complicated decisions that you ponder over for days. Simply assign the possible answers and flip a coin (or roll a die, if you need more than two). It doesn't matter what the result is; depending on whether it matches your already-made decision, you will either immediately reject the coin's "answer" or not. That tells you what your first decision was, unclouded by any attempts to j...
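A minimal sketch of this trick in Python; the function name, option strings, and prompts are my own illustration, not anything from the comment:

```python
import random

def coin_flip_reveal(option_a: str, option_b: str) -> str:
    """Flip a virtual coin between two options.

    The point is not to obey the coin but to notice your immediate
    reaction to its verdict, which reveals the decision you had
    (most likely) already made.
    """
    verdict = random.choice([option_a, option_b])
    print(f"The coin says: {verdict}")
    print("Relieved? Then that was already your decision.")
    print("Tempted to flip again? Then you had already chosen the other option.")
    return verdict

# Hypothetical usage:
coin_flip_reveal("take the new job", "stay where I am")
```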
That's true. Matters are not helped by the value society places on commitment and consistency. When we do, in fact, change our minds, we are more often than not labeled as "wishy-washy," or some similarly derogatory term.
This article reminds me of the movie "Inception"... once an idea is planted it is hard to get it out.
As Eliezer says, on short time scales (days, weeks, months) we change our minds less often than we expect to. However, it's worth noting that, on larger time scales (years, decades) the opposite seems to be true. Also, our emotional state changes more frequently than we expect it to, even on short time scales. I can't seem to recall my exact source on this second point at the moment (I think it was some video we watched in my high school psychology class), though, anecdotally, I've observed it to be true in my own life. Like, when I'm feeling good, I m...
I would say that the study by Griffin and Tversky is incomplete. The way I see it, we have an inner "scale" of the validity of evidence and decide based on that. As was pointed out in one of the previous posts, we should bet on an event 100% of the time if the event is more likely than the alternatives. Something similar is happening here, where if we are more than 50% sure that job A is better than job B, we should pick job A. Given that the participants were 66% sure, this would mean that there is a low a priori probability for them to change their minds...
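Spelling out that argument in notation (my formalization, not the study's): let \(p\) be a participant's credence that job A is better. The decision rule is a threshold, not a lottery:

\[
\text{choose A} \iff p > \tfrac{1}{2},
\qquad\text{so}\qquad
P(\text{finally choose A}) = P\!\left(p_{\text{final}} > \tfrac{1}{2}\right),
\]

which can sit near 1 even when the current credence is only 0.66, since only substantial contrary evidence can drag \(p\) back under the threshold. On this reading, the 66% measures confidence that A is better, while the 96% measures how often the threshold rule's output survives to the final choice, so the gap between them need not be underconfidence at all.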
BLUF: The cited paper doesn't support the claim that we change our minds less often than we think, and overall it and a paper it cites point the other way. A better claim is that we change our minds less often than we should.
The cited paper is freely downloadable: The weighing of evidence and the determinants of confidence. Here is the sentence immediately following the quote:
...It is noteworthy that there are situations in which people exhibit overconfidence even in predicting their own behavior (Vallone, Griffin, Lin, & Ross, 1990). The key variable,
When I first read the words above—on August 1st, 2003, at around 3 o'clock in the afternoon—it changed the way I thought. I realized that once I could guess what my answer would be—once I could assign a higher probability to deciding one way than the other—then I had, in all probability, already decided. We change our minds less often than we think. And most of the time we become able to guess what our answer will be within half a second of hearing the question.
How swiftly that unnoticed moment passes, when we can't yet guess what our answer will be; the tiny window of opportunity for intelligence to act. In questions of choice, as in questions of fact.
The principle of the bottom line is that only the actual causes of your beliefs determine your effectiveness as a rationalist. Once your belief is fixed, no amount of argument will alter the truth-value; once your decision is fixed, no amount of argument will alter the consequences.
You might think that you could arrive at a belief, or a decision, by non-rational means, and then try to justify it, and if you found you couldn't justify it, reject it.
But we change our minds less often—much less often—than we think.
I'm sure that you can think of at least one occasion in your life when you've changed your mind. We all can. How about all the occasions in your life when you didn't change your mind? Are they as available, in your heuristic estimate of your competence?
Between hindsight bias, fake causality, positive bias, anchoring/priming, et cetera et cetera, and above all the dreaded confirmation bias, once an idea gets into your head, it's probably going to stay there.