I hate changing my mind based on my parents' advice because I want to demonstrate that I'm capable of making good decisions on my own, especially since we seem to disagree on some fundamental values. Specifically, they love their jobs and put a moral value on productivity, while my goal in life is to "work" as little as possible and have as much "fun" as possible.
Does this mean that if we cannot remember ever changing our minds, our minds are very good at removing clutter?
Or, consider a question that you've not made up your mind on: Does this mean that you're most likely to never make up your mind?
And, anyway, in light of those earlier posts concerning how well people estimate numeric probabilities, should it be any wonder that 66% = 96%?
Not to argue, but to point out that this is not necessarily a bad thing. It depends entirely on the basis of one's conclusion. Gut instincts are quite often correct about things we have no conscious evidence for - because our unconscious does have pretty good evidence filters. Which is one of the reasons I suggested rationalization is not necessarily a bad thing, as it can be used to construct a possible rational basis for conceptualizations developed without conscious thought, thus permitting us to judge the merit of those ideas.
Here is one way to change your mind. Think through something carefully, relying on strong connections. You may at some point walk right into a conclusion that contradicts a previous opinion. At this point something will give. The strength of this method is that it is strengthened by the very attachment to your ideas that it undermines. The more stubborn you are, the harder you push against your own stubbornness.
I agree with Adirian that not changing our minds is not necessarily a bad thing.
The problem, I guess, as with most things, is that we can't be sure which way to go. Gut feelings are often quite correct. But how do we know when we are having a bias that's bad for us and when it's a gut feeling? Gut feelings aren't inherently questionable. Biases need to be kept in check.
If we run through the standard biases and logical fallacies like a checklist and what we think doesn't fall in any of them, we can go with our gut instinct. Else, give whatever we have in...
It probably doesn't help to live in a society where changing one's positions in response to evidence is considered "waffling", and is considered to show a lack of conviction.
Divorce is a lot more common than 4%, so people do admit mistakes when given enough evidence.
I wonder if the act of answering the question actually causes the decision to firm up. Kind of the OvercomingBias Uncertainty Principle.
It is nice to have a clear example of where people are consistently underconfident. Are there others? Michael, good point about divorce.
I'd also like to learn whether the experimental finding holds for a wide variety of decisions. (Eliezer mentioned only picking a job offer.)
Aren't people consistently underconfident when it comes to their money? Everybody invests in something but isn't really sure about it, even after they've done it. It's at its most extreme in the stock market.
Another instance is when people approach members of the opposite sex whom they find attractive. They consistently underestimate themselves.
Otherwise it depends on what they're used to; for example, people in technology are underconfident when it comes to negotiation and so forth.
In the case of divorce, the reasons cannot always be taken as evidence that the marriage was a mistake to begin with.
Things happen and people change.
This is an interesting idea and doesn't surprise me given thin-slicing behavior and the like. But the research itself seems a little thin. Where is the actual testing versus a control group? What about other decisions that don't involve jobs?
Also, I think probably we know what we will choose 99% of the time because we make the decision instantaneously. The real question is whether we do this even on decisions that we don't consciously know what we are going to choose. Are we as accurate in those decisions?
It is nice to have a clear example of where people are consistently underconfident. Are there others?
People tend to take into account the magnitude of evidence (how extreme is the value?) while ignoring its reliability, and they also tend to be bad at combining multiple pieces of evidence. So another good way to generate underconfidence is to give people lots of small pieces of reliable evidence. (I believe it's in the same paper, "The Weighing of Evidence and the Determinants of Confidence".)
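The strength-versus-weight point can be made concrete with a toy Bayesian calculation (my own illustration, not taken from the paper): many individually weak but reliable pieces of evidence, combined in log-odds form, yield a far stronger posterior than intuition suggests.

```python
# Toy illustration (not from Griffin & Tversky): independent evidence
# combines by summing log-odds, so many weak-but-reliable pieces of
# evidence add up to high confidence.
import math

def posterior(prior, likelihood_ratios):
    """Combine independent pieces of evidence via Bayes' rule in log-odds form."""
    log_odds = math.log(prior / (1 - prior))
    log_odds += sum(math.log(lr) for lr in likelihood_ratios)
    odds = math.exp(log_odds)
    return odds / (1 + odds)

# Ten pieces of evidence, each only weakly favoring hypothesis H
# (likelihood ratio 1.5), starting from a 50/50 prior:
p = posterior(0.5, [1.5] * 10)
print(f"{p:.3f}")  # 0.983: weak but reliable evidence accumulates
```

Someone attending only to the "strength" of each piece (a mere 1.5 likelihood ratio) would report low confidence, while the combined evidence actually warrants about 98%.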
I recall having an argument over dinner with a friendly acquaintance about an unimportant but interesting problem. I thought about it for a few days and decided he was right. I've hated him ever since.
Are they as available, in your heuristic estimate of your competence?
I'm unable to parse this sentence.
I used to have a button that said "If you haven't changed your mind lately, how do you know you've still got one?" I really liked that sentiment.
It's very easy to get comfortable with our opinions and beliefs, and uncomfortable about any challenge to them. As I've posted elsewhere, we often identify our "selves" with our "beliefs", as if they "were" us. Once we can separate our idea of "self" as different from "that which our self currently believes", it becomes easier to entertain other thoughts...
A lot of people probably already know that, it's a familiar "deep wisdom", but anyway: you can use this not-changing of your mind to help you with seemingly complicated decisions that you ponder over for days. Simply assign the possible answers to the sides and flip a coin (or roll a die, if you need more than 2). It doesn't matter what the result is, but depending on whether it matches your already-made decision you will either immediately reject the coin's "answer" or not. That tells you what your first decision was, unclouded by any attempts to j...
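The coin-flip trick can be sketched as a tiny helper (a playful illustration of the comment's procedure; the option names are hypothetical, and the gut reaction must of course come from you, not from the program):

```python
# Playful sketch of the coin-flip trick: the flip itself carries no
# information; your reaction to it reveals the decision you had
# already made.
import random

def decided_option(coin_result, other_option, relieved):
    """Relief means the coin matched your already-made decision;
    disappointment means your real choice was the other option."""
    return coin_result if relieved else other_option

options = ["take the job", "stay put"]   # hypothetical dilemma
coin = random.choice(options)            # the result is irrelevant
# Suppose the coin says "stay put" and you feel disappointed:
print(decided_option("stay put", "take the job", relieved=False))
# -> take the job: that was your decision all along
```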
That's true. Matters are not helped by the value society places on commitment and consistency. When we do, in fact, change our minds, we are more often than not labeled as "wishy-washy," or some similarly derogatory term.
This article reminds me of the movie "Inception"... once an idea is planted it is hard to get it out.
As Eliezer says, on short time scales (days, weeks, months) we change our minds less often than we expect to. However, it's worth noting that, on larger time scales (years, decades) the opposite seems to be true. Also, our emotional state changes more frequently than we expect it to, even on short time scales. I can't seem to recall my exact source on this second point at the moment (I think it was some video we watched in my high school psychology class), though, anecdotally, I've observed it to be true in my own life. Like, when I'm feeling good, I m...
I would say that the study by Griffin and Tversky is incomplete. The way I see it, we have an inner "scale" of the validity of evidence and decide based on that. As was pointed out in one of the previous posts, we should bet on an event 100% of the time if the event is more likely than the alternatives. Something similar is happening here, where if we are more than 50% sure that job A is better than job B, we should pick job A. Given that the participants were 66% sure, this would mean that there is a low a priori probability for them to change their minds...
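The betting logic this comment leans on is easy to check with a little arithmetic (a toy calculation of my own, assuming the 66% figure): always picking the option you judge more likely beats hedging in proportion to your confidence.

```python
# If job A is the better choice with probability p, compare two
# strategies: always pick A, versus "probability matching"
# (pick A with probability p and B with probability 1 - p).
p = 0.66

always_a = p                          # right whenever A is in fact better
matching = p * p + (1 - p) * (1 - p)  # right only when pick and truth agree

print(f"always pick A:        {always_a:.4f}")  # 0.6600
print(f"probability matching: {matching:.4f}")  # 0.5512
```

So a 66%-confident participant who follows the betting logic should simply pick job A, which is consistent with a low prior probability of changing their mind afterwards.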
BLUF: The cited paper doesn't support the claim that we change our minds less often than we think, and overall it and a paper it cites point the other way. A better claim is that we change our minds less often than we should.
The cited paper is freely downloadable: The weighing of evidence and the determinants of confidence. Here is the sentence immediately following the quote:
...It is noteworthy that there are situations in which people exhibit overconfidence even in predicting their own behavior (Vallone, Griffin, Lin, & Ross, 1990). The key variable,
I think we're looking at different dictionaries, so I'll abandon the word impulsive and try with a more object-level phrase.
Hilarious, the point you have abandoned has +2, whilst my point that forced the abandoning still has -1. anyways...
They can drive less carefully while maintaining the same beliefs about risk.
and if those same beliefs are already an underestimation of risk? strike 1, just clipped the outside of the plate.
Let's unpack that last quote in the context of driving... a-yawn-gain. They can drive less carefully. "Less carefully" is about less care - so what is "care"? That's about:
Care = Feel concern or interest; attach importance to something: "they don't care about human life". (dictionary.com)
So they feel less concern, they attach less importance to driving. What's the key word there, hmmm? "Less" - well, that's a term that, in context, goes with "under"-estimate. Do you think? I do. Strike 2 - straight down the middle of the plate. Batter says, "I didn't see that." Too bad, says the ump.
Let's examine the opposite side, to include a process for minimising disconfirmation bias. They drive less carefully. Ok, I'm flipping my brain. The less carefully has nothing to do with underestimating risk, actually in this flip it's about overestimating risk... why do I say overestimate - well apparently that's part of the argument opposing my viewpoint, check above.
Well, what does the dictionary say "overestimate" means?
o·ver·es·ti·mate/ˌōvərˈestəˌmāt/ Verb:
Estimate (something) to be better, larger, or more important than it really is. (dictionary.com)
hang on, hang on - overestimate = estimate something to be more important than it really is. Does overestimate sound at all like "less care"? No it doesn't, contradiction found, conclusion is Driving less carefully is about underestimating risk. Strike 3. Yer outta here!
Now, here's a thing. When the teenagers judge the reward highly, sufficiently highly to outweigh the risk of death - they have underestimated the risk. Perception of reward and risk are not in opposition, they go hand in hand.
Now let's look at the rest of the sentence.
They can drive less carefully while maintaining the same beliefs about risk.
The implication in context is that reward is driving the behaviour, supposedly being its entire reason, and one significant context of the reward perception was peer involvement (see article). Let's try that one.
They can drive less carefully with more people in the car, while maintaining the same beliefs about risk, because they perceive the rewards are higher.
That fits the counter-argument to my viewpoint... but hang on: with more people in the car, the risk of death is multiplied. So factually the risk has increased, yet the behaviour is supposedly all due to the reward. Now, if the behaviour is truly all to do with the reward, then yes, the teen has discounted the risk, for the risk increased and it's not changing the behaviour.
So in that situation we've got another example where a teen has underestimated the risk due to a perception of a higher reward.
Am I being too anecdotal for you guys? Of course: discount outgroup behaviour whilst permitting the same from the ingroup. The article is itself filled with anecdotes... maybe we should just dismiss the entire article... stop press, no no, don't do that, there's no counter to my op then; let's just pick and choose the parts of it that support the counter and dismiss those that don't - both in the research and the anecdotes.
Please by all means, chuck up the -1, I'm considering them badges of honour now.
Parts of the parent comment that are particularly wrong:
Hilarious, the point you have abandoned has +2, whilst my point that forced the abandoning still has -1. anyways...
paper-machine fairly well handled that one in terms of "Rule 1 of karma is you do not talk about karma". Also, it was not a point that was abandoned, but a word. It is a common technique here to taboo a word whose definition is under dispute, since arguing about definitions is a waste of time.
Since you do not seem to understand, what happened there is that your 'unpacking'...
When I first read the words above—on August 1st, 2003, at around 3 o’clock in the afternoon—it changed the way I thought. I realized that once I could guess what my answer would be—once I could assign a higher probability to deciding one way than another—then I had, in all probability, already decided. We change our minds less often than we think. And most of the time we become able to guess what our answer will be within half a second of hearing the question.
How swiftly that unnoticed moment passes, when we can’t yet guess what our answer will be; the tiny window of opportunity for intelligence to act. In questions of choice, as in questions of fact.
The principle of the bottom line is that only the actual causes of your beliefs determine your effectiveness as a rationalist. Once your belief is fixed, no amount of argument will alter the truth-value; once your decision is fixed, no amount of argument will alter the consequences.
You might think that you could arrive at a belief, or a decision, by non-rational means, and then try to justify it, and if you found you couldn’t justify it, reject it.
But we change our minds less often—much less often—than we think.
I’m sure that you can think of at least one occasion in your life when you’ve changed your mind. We all can. How about all the occasions in your life when you didn’t change your mind? Are they as available, in your heuristic estimate of your competence?
Between hindsight bias, fake causality, positive bias, anchoring/priming, et cetera, et cetera, and above all the dreaded confirmation bias, once an idea gets into your head, it’s probably going to stay there.
1Dale Griffin and Amos Tversky, “The Weighing of Evidence and the Determinants of Confidence,” Cognitive Psychology 24, no. 3 (1992): 411–435.