When you hear powerful evidence or arguments that should get you to revise your beliefs, not only do all sorts of cognitive biases fight the changes, but so do the social factors of status and face-saving. Perhaps I've long been a vocal proponent of X, which implies Y, and you show me that Y isn't always true. It's very hard to just straight up admit "ok, I'm not a hardcore Xist anymore." There's a status loss in letting yourself be convinced.

For a long time I thought that I was stronger than this, that saving face only mattered as much as I let it matter. I wish I could freely admit when I've been convinced, but I often can't manage to. [1] Instead I'll finish a conversation defending my earlier beliefs and only later start acting on my new ones.

After a discussion where someone didn't admit to any change of mind, I'll often see them later having changed their behavior. So now if I'm trying to persuade someone I don't focus on securing verbal agreement. Instead I just try to be as convincing as possible, and notice if they come around later. [2]

(I also posted this on my blog)


[1] This is not a helpful trait: I'd like other people to let me know when I'm wrong or when they have evidence I'm not considering, but if they never get the satisfaction of knowing they've convinced me they may just feel like they've wasted their time, and not try in the future. So I'm working on it.

[2] Keeping people from feeling personally invested in one side or the other of an argument is probably also helpful: I understand discussions are much more likely to convince bystanders than participants.


I understand discussions are much more likely to convince bystanders than participants.

I've always seen this as the main purpose of conducting a discussion in public. The real target is the audience.

Group social norms about the appropriate status changes applied to someone who changes their mind are important. Personally I respect someone more if I notice they change their mind publicly, but I'm sure I'm in the minority, and it was only through deliberate self modification that I acquired this trait.

Also, a classic overcoming bias blog post on these topics:

http://www.overcomingbias.com/2008/06/overcoming-disa.html

Personally I respect someone more if I notice they change their mind publicly, but I'm sure I'm in the minority, and it was only through deliberate self modification that I acquired this trait.

Notice, too, that public figures with a reputation for frequently changing their minds tend to be the subject of mockery and ridicule even in academic circles. When, for instance, the name of Hilary Putnam comes up in philosophy discussions, it is not uncommon to read or hear a sarcastic comment noting that he has successively embraced and abandoned several different positions on a given question over the course of his career.

I take it as a rule to admit being wrong or confused frequently (when that happens), including on insignificant matters where correctness is of no consequence. This sets up a convention where it's seen as normal for me to admit being wrong, which makes admitting of errors on more important occasions feel more natural, and allows (in discussion) focusing more on the implications of updated beliefs, rather than on the fact of someone having been wrong.

Abandoning your previous position can also be a way of saving face, in at least two ways:

  • Being wrong is embarrassing, yes; but being wrong for a short period of time is less embarrassing than being wrong for an extended period of time. Best to stop the bleeding as quickly as possible.

  • You signal that you are a reasonable person who does not let emotional attachment to a position cloud his judgment. If you're dealing with someone of higher status, you show that your mistake doesn't matter that much because you corrected yourself quickly. If you are dealing with someone who is lower status than you, you come off appearing magnanimous.

You signal that you are a reasonable person who does not let emotional attachment to a position cloud his judgment. If you're dealing with someone of higher status, you show that your mistake doesn't matter that much because you corrected yourself quickly. If you are dealing with someone who is lower status than you, you come off appearing magnanimous.

In many cases this is true, but someone could also interpret this as your being loose with your morals, someone who betrays their own ideals in a flash (and so is untrustworthy). Or they might interpret it as your being a follower, who only thinks what people tell you to think.

[anonymous]:

I frequently hear people (especially public figures) criticized for "changing their opinions too much." Obviously, taking on the opinion of whoever you happen to be talking to at the moment is a bad thing, but genuinely updating on evidence is not, and it's difficult enough to distinguish between the two at a glance that I think such criticism is problematic.

Ideally, public figures who changed their minds about something would state that they had done so and explain what information changed their opinion. This would let observers gauge whether the person had really changed their mind or was just saying whatever was most expedient. The problem with this is that it takes a long time, and would ideally involve questioning by others ("You used to say X was good because of Y - do you still believe Y?"). If you're making speeches where your goal is to say lots of things that make you look good in the amount of time people will listen, you don't want to spend time on this sort of thing because most of the audience will tune out before they've heard you say much that sounded impressive.

[anonymous]:

Sure. As you say though, that would be a difficult sell.

I think part of the problem is that "they're changing their opinions too much" is usually a snap judgment. It tends to be applied to everyone that doesn't have a firmly fixed campaign platform (and isn't protected by party affiliation).

People who actually sit down and look over the available history tend to, just as a trend, come back with more concrete issues. More along the lines of "this guy's had a different opinion on foreign policy to go with every speech he's ever made, and it always lines up with the majority opinion of the audience."

Or worse ... a flip-flopper!

I assume that we're talking about opinions on factual matters, not personal values. Yes, one's fundamental (terminal) values I would expect to be pretty stable. Instrumental values are more fluid because they are a function of both one's terminal values and one's state of information about factual matters. It seems to me that one's morals and ideals are tied more closely to terminal values than to instrumental values.

I assume that we're talking about opinions on factual matters, not personal values. Yes, one's fundamental (terminal) values I would expect to be pretty stable.

To my thinking, this stance forfeits rational reflection where it really counts most. You're saying, if I understand you, that you respect people who change their opinions on factual matters, but not on questions of fundamental ethics. This seems to assume, among other things, that people's values are much more coherent than they are (leaving little leverage for change).

You lose much more status, it is true, when you re-evaluate your terminal values than your factual contentions. That just means the problems of self-confirmation are compounded in ethics, not that they should be ignored there. You can't be rational yet rigidly maintain your terminal values' immunity to rational argument.

You can't be rational yet rigidly maintain your terminal values' immunity to rational argument.

Any argument that my terminal values should be one thing or another will itself be founded on certain assumed values. You can't start from a value-neutral position and get to a value system from there.

If rational argument alone is enough to cause a change in one's values, I can see only a few possibilities:

  • The changed values were instrumental values rather than terminal values. It makes perfect sense to modify instrumental values if one no longer believes that they serve the attainment of one's terminal values.

  • The values were incoherent. The rational argument has shown that they are in conflict with each other, making it clear that a choice among them is necessary.

I was going to add the possibility of a value whose subject matter is found not to exist, such as religious values founded on a belief in a god. Some of those values may evaporate after one becomes convinced that there is no god. But even in that case I think one can argue that the religious values really served a more fundamental value -- the desire for self-respect.

[anonymous]:

Though it is remarkable how few philosophers of ethics have understood terminal values to be subject to rational argument or change on the basis of such argument. Plato is the only one I can think of.

[anonymous]:

After a discussion where someone didn't admit to any change of mind, I'll often see them later having changed their behavior.

This is an important observation.

This is a good point, but the post doesn't meet my standards of form and content for Main. I think it should have been expanded with examples (real or hypothetical) or evidence, or alternately posted to Discussion.

A sneaky move is to say something like, "I don't know, umm, yeah... thinking about it from this viewpoint, it would be interesting if..." and then suddenly list previously unstated (and usually more tangential) arguments that go against your earlier position and support your pal's statement. The point is that you start to act like a judge of your own discussion, accumulating even more status while retaining control and the image of a reasonable man.

You completely steal the show. Although it usually works only in one-on-one dialogues; if others are present, they will assure your puzzled pal that the credit indeed belongs to him. ;)

Another good flip (or reframing) is to say, "Oh yeah, let's continue just for the sake of the argument, to see how far this point can be taken," and then defend it with your original intention. If it works, you have it; if not, it's no longer attached to you.

But sometimes you just feel your insignificance to the universe and let it go with a smile.

I've observed the same behavior, and I've also found that if you give someone an "out" they're more likely to agree:

"I support X!" "Why?" "Well, A, B, C." "Oh! You mean you support Y? Because A+B+C implies Y, so obviously you support Y" "Right, of course. What did you think I meant?"

I've done this a fair few times, and it often produces some absurd about-faces in people. Being confrontational doesn't produce the same changes, so I don't think this is just a matter of them having actually meant Y. I'll often spend some time clarifying that they really mean X, not Y, and then do this, and then clarify that yes, they really mean Y, not X, now. It's baffling at times.

I haven't followed up a ton on it, but the changes seem to generally "stick" -- they don't just agree with Y to get me to leave them alone.

Basically, the trick is to lead the conversation such that you're not implying that they were ever wrong. It's a weird bit of double-think at first, and can be tricky to adjust to, but I find it's fairly useful quite frequently. If nothing else, by avoiding being confrontational, I seem to leave myself more open to hearing that I'm wrong and really meant Y all along :)

Came across this in Bad Science; thought it gave a nice counterexample.

[addressing an academic] Where you have made errors, perhaps you could simply acknowledge that, and correct them. I will always happily do so myself, and indeed have done so many times, on many issues, and felt no great loss of face. (Bad Science, p. 180)

I understand why people would see admitting that you are wrong as a loss of status, but it doesn't even begin to make sense to me.