How facts backfire (previous discussion) discusses the phenomenon where correcting people's mistaken beliefs about political issues doesn't actually make them change their minds. In fact, telling them the truth might even reinforce their opinions and entrench them even more firmly in their previous views. "The general idea is that it's absolutely threatening to admit you're wrong", says one of the researchers quoted in the article.

This should come as no surprise to the people here. But the interesting bit is that the article suggests a way to make people evaluate information in a less biased manner. They mention that one's willingness to accept contrary information is related to one's self-esteem: Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t.

I suspect that the beliefs that are hardest to change, even if the person has generally good self-esteem, are those which are central to their identity. If someone's identity is built around capitalism being evil, or socialism being evil, then any arguments about the benefits of the opposite economic system are going to fall on deaf ears. Not only will that belief color their view of the world, but it's likely that they're deriving a large part of their self-esteem from that identity. Say something that challenges the assumptions built into their identity, and you're attacking their self-esteem.

Keith Stanovich tells us that simply being intelligent isn't enough to avoid bias. Intelligent people might be better at correcting for bias, but there's no strong correlation between intelligence and the disposition to actually correct for your own biases. Building on his theory, we can expect that threatening opinions will push even non-analytical people into thinking critically, but non-threatening ones won't. Stanovich believes that spreading awareness of biases might be enough to help a lot of people, and to some degree it might. But we also know about the tendency to only use your awareness of bias to attack arguments you don't like. In the same way that telling people facts about politics sometimes only polarizes opinions, telling people about biases might similarly only polarize the debate, as everyone concludes that their opposition is hopelessly deluded and biased.

So we need to create a new thinking disposition, one aimed not just at actively attacking perceived threats, but at critically evaluating one's own opinions. That's hard. I've found for a number of years now that the main reason I try to actively re-evaluate my opinions and update them as necessary is that doing so is part of my identity. I pride myself on not clinging to ideology and on changing my beliefs when it seems they should be changed. Admitting that somebody else is right and I am wrong does hurt, but it also feels good that I was able to do so despite the pain. And when I'm in a group where everyone seems to agree about something as if it were self-evident, that frequently works as a warning sign that makes me question the group consensus. Part of the reason I do that is that I enjoy the feeling of knowing that I'm actively on guard against my mind just adopting whatever belief happens to be fashionable in the group I'm in.

It seems to me that if we want to actually raise the sanity waterline and make people evaluate things critically, and not just conform to a different set of groups than they currently do, a crucial part of that is getting people to adopt an identity of critical thinking. This way, the concept of identity ceases to be something that makes rational thinking harder and starts to actively aid it. I don't really know how one can effectively promote a new kind of identity, but we should probably take lessons from marketers and other people who appeal strongly to emotions. You don't usually pick your identity based on logical arguments. (On the upside, this provides a valuable hint about how to raise rationalist children.)

26 comments

Kaj_Sotala:

It seems to me that if we want to actually raise the sanity waterline and make people evaluate things critically, and not just conform to a different set of groups than they currently do, a crucial part of that is getting people to adopt an identity of critical thinking. This way, the concept of identity ceases to be something that makes rational thinking harder and starts to actively aid it.

I don't think that could ever work. Just look at the present situation. A great many people nowadays feel that "critical thinking," "open-mindedness," "questioning authority," etc. are important parts of their identity, and will take offense if you suggest otherwise. Modern culture strongly encourages such attitudes. Yet, in practice, this nearly always results in cargo-cult "critical thinking" where one is merely supposed to display the correct shibboleths, accept the prevailing respectable beliefs, and avoid like the plague any actual critical thinking about the truly sacrosanct taboos, values, and moral and intellectual authorities.

The old "We are all individuals!" sketch comes to mind.

Yes, it is true that the people who talk the most about questioning authority, independent thinking, and open-mindedness are conspiracy theorists and other crackpots.

On the other hand, I suspect the reason for that is partly that we are taught that it is good to think critically, while nobody actually explains how to think critically. For example, "question authority!" is pretty poor advice when not explained in greater detail, and that is how we usually get it. No wonder a lot of people interpret it as an endorsement of indiscriminate questioning, which may be translated as "believe whatever you want".

I suspect that if biases and rational thinking were taught in schools, fewer people would probably describe themselves as "rational", but a lot of people would be far better thinkers than they are today.

Yup. I've often seen "critical thinking" and "questioning authority" as codewords for criticizing politically incorrect things.

I completely agree that promoting a "critical thinker" identity alone could never work. The hard part is in promoting it in such a way that as few people as possible end up with the identity of a cargo-cult rationalist. But as prase points out below, giving people the tools that will actually allow them to be critical thinkers should help out a lot.

Your general point, that superficial critical thinking makes raising the sanity waterline even more difficult than it otherwise would be, is well taken.

I don't think that could ever work. Just look at the present situation.

Isn't this, all by itself, an example of improper reasoning?

Isn't this, all by itself, an example of improper reasoning?

Yes, you are right -- we can't draw such a blanket conclusion just from observing the present situation. My choice of words wasn't very good there.

However, my conclusion is actually based on more than that, namely a more detailed consideration of both human nature and the wider historical precedent. Unfortunately, it's a topic too complex to be discussed satisfactorily in a single comment, so I just wanted to draw attention to these unpleasant facts, which are undoubtedly relevant to the point of the original post.

ISTM we should encourage a default norm of respect for people who change their minds on serious topics, even if we disagree with the switch or the reasoning (within bounds). Yes, this norm could be gamed if it goes too far, but right now the norm is that changing one's mind is an admission of weakness, and that depresses the sanity waterline considerably.

ETA: Sorry for the ambiguity -- as you guys point out, we already do this regularly on LW. It's in real life that this needs to start happening.

Really? I've seen (disproportionately?) large amounts of karma heaped on those who change their mind or admit they were mistaken. Do you have any examples otherwise?

Added: I was assuming you were talking about LW; if you're talking about society generally, then never mind. :)

I got that impression too -- what orthonormal describes applies to society in general, but I don't think it applies to Less Wrong.

We already seem to do this. To use an example that stands out in my mind, here I got upvoted to +10 for saying I was wrong.

I don't really know how one can effectively promote a new kind of identity, but we should probably take lessons from marketers and other people who appeal strongly to emotions.

I can't find the blog post right now, but I'm pretty sure economist Bryan Caplan has written that when he's lecturing on some point of economics that counters a widely-held economically-illiterate bias, he frames it like "people in this room are smart enough to understand X, unlike the economically illiterate laypeople outside". So it helps to have an out-group. The Robbers Cave Experiment is similar, as is the story from Dan Ariely's TED talk, where people cheat less if they see people from a rival university cheating.

So a way to encourage people to embrace a rationalist identity might be to choose some out-group we already hate (maybe terrorists) and emphasize that we are not like those dangerously irrational lunatics who like blind faith and persecuting heretics; we like science and evidence and listening carefully to intellectual criticism.

So a way to encourage people to embrace a rationalist identity might be to choose some out-group we already hate (maybe terrorists) and emphasize that we are not like those dangerously irrational lunatics who like blind faith and persecuting heretics; we like science and evidence and listening carefully to intellectual criticism.

This might in the long run have bad side effects if it demonizes the out-groups too much. Also, it requires making potentially false statements about the out-group. Finally, note that this strategy has been more or less implicitly tried in American politics, with some trying to portray the left wing as "pro-science" and pro-critical-thinking, yet many forms of irrational belief, such as a lot of alternative medicine claims and anti-vaccine claims, are primarily on the left (anti-vaccine issues have become more common among the religious right since the HPV vaccine, but it is still a primarily left-wing phenomenon). So in practice this doesn't seem to work well.

a lot of alternative medicine claims and anti-vaccine claims [...] are primarily on the left (anti-vaccine issues have become more common among the religious right since the HPV vaccine, but it is still a primarily left-wing phenomenon).

Well, in fairness, the religious right opposes the HPV vaccine based on a value difference, not a factual difference. They oppose it because they think it will promote sex, but they don't challenge the scientific facts about the vaccine.

Actually, while they do in part have a value difference here, they've also used it as an opportunity to pick up a lot of the standard anti-vaccine claims. See, for example, this World Net Daily piece, this Conservapedia article, and this Conservapedia HPV FAQ. In this particular case, as frequently happens, people try to adjust their perception of reality so that their ideological beliefs and reality happen to conveniently agree.

Whether the vaccine will promote sex is an empirical matter.

Good point. Let me fix that:

Well, in fairness, the religious right opposes the HPV vaccine based on a value difference, not a factual difference. That is, even if all the facts about the vaccine and its effects on sexual behavior are agreed upon, there is still a value dispute about how society should deal with sex. Many on the religious right oppose the vaccine because they don't believe in treating the risks of sex as medical problems that can be prevented, even though they don't challenge the scientific facts about the vaccine.

[anonymous]:

I don't necessarily think that "an identity of critical thinking" is the answer.

I've renounced beliefs, several times, that I thought were central to my identity but weren't backed up very well by the evidence. The pattern was that I would think about what I actually wanted to protect -- what really mattered to me -- and find that it wasn't quite the same as the (false) belief in question, and that I could be truer to myself without illusions.

For example, someone who thinks capitalism is evil may, at root, be someone who really cares about preventing human suffering. You can still do that -- you can do it even better -- if you don't subscribe to inaccurate information. Same thing for someone who believes that socialism is evil -- it might be a person who really values autonomy. You can still do that without swallowing malarkey.

I don't deny that people can often renounce important beliefs even if they didn't have a "critical thinker" identity. But I do think that having such an identity will make it far more likely for them to renounce beliefs once they realize the beliefs don't work, and also make them more likely to actively go out questioning their beliefs (as opposed to just renouncing incorrect ones once confronted).

That works well for idealistic folk -- what about people who are more selfish or more apathetic?

"a crucial part of that is getting people to adopt an identity of critical thinking"

What to do about folks who believe/trumpet that they have adopted such an identity, and use this as a shield against conflicting arguments?

"an identity of critical thinking" sounds awfully like the skeptics community to me. So I think it has been tried. Maybe "actively re-evaluate my opinions" is a new slogan, but I'm skeptical of the power of slogans to create actions, rather than just as clique-labels. In particular, labels of the form "I'm better than them because I do X" sound particularly dangerous.

main reason I try to actively re-evaluate my opinions and update them as necessary is that doing so is part of my identity. I pride myself on not clinging to ideology and on changing my beliefs when it seems they should be changed.

In the course of my life, I have often had to eat my words, and I must confess that I have always found it a wholesome diet. -- Winston Churchill

Great quote ... but I'm having some trouble sourcing it. Could be apocryphal.

[anonymous]:

There are three goals that are discussed here a lot that I would like to see untangled. (Or else I'd like to be convinced that they can't be untangled.)

  1. The goal of believing true things and disbelieving false things.
  2. The goal of causing others to believe true things.
  3. The goal of promoting, in the general public, belief-in-true-things as a value.

I can restate 3. with even worse prose but possibly more clarity: The goal of causing others to want to believe true things.

I am quite interested in 1. for what I consider idiosyncratic reasons, and not so interested in 2. or 3. Am I making a mistake?

About 2. and 3., a probably-incoherent thought experiment: what would be the result of causing, by magic, a person to discard their false beliefs without altering their values? If you have a Robin-Hanson-style model of human beings as giant hypocrites, you might expect to replace happy hypocrites who lie to themselves with unhappy hypocrites who don't lie to themselves. I'd be interested to hear what less cynical rationalists would expect.

Wish I had time to read the comments, but for me, I started to incorporate a rationalist identity after reading the Beisutsukai stories; they made rationality cool.
