[crossposted at Measure of Doubt]
What is the Curse of Knowledge, and how does it apply to science education, persuasion, and communication? No, it's not a reference to the Garden of Eden story. I'm referring to a particular psychological phenomenon that can make our messages backfire if we're not careful.
Communication isn't a solo activity; it involves both you and the audience. Writing a diary entry is a great way to sort out thoughts, but if you want to be informative and persuasive to others, you need to figure out what they'll understand and be persuaded by. A common habit is to use ourselves as a mental model - assuming that everyone else will laugh at what we find funny, agree with what we find convincing, and interpret words the way we use them. The model works to an extent - especially with people similar to us - but other times our efforts fall flat. You can present the best argument you've ever heard, only to have it fall on dumb - sorry, deaf - ears.
That's not necessarily your fault - maybe they're just dense! Maybe the argument is brilliant! But if we want to communicate successfully, pointing fingers and assigning blame is irrelevant. What matters is getting our point across, and we can't do it if we're stuck in our head, unable to see things from our audience's perspective. We need to figure out what words will work.
Unfortunately, that's where the Curse of Knowledge comes in. In 1990, Elizabeth Newton ran a fascinating psychology experiment. She paired participants into teams of two: one tapper and one listener. The tappers picked one of 25 well-known songs and would tap out the rhythm on a table. Their partner - the designated listener - was asked to guess the song. How do you think they did?
Not well. Of the 120 songs tapped out on the table, the listeners only guessed 3 of them correctly - a measly 2.5 percent. But get this: before the listeners gave their answer, the tappers were asked to predict how likely their partner was to get it right. Their guess? Tappers thought their partners would get the song 50 percent of the time. You know, only overconfident by a factor of 20. What made the tappers so far off?
They lost perspective because they were "cursed" with the additional knowledge of the song title. Chip and Dan Heath use the story in their book Made to Stick to introduce the term:
"The problem is that tappers have been given knowledge (the song title) that makes it impossible for them to imagine what it's like to lack that knowledge. When they're tapping, they can't imagine what it's like for the listeners to hear isolated taps rather than a song. This is the Curse of Knowledge. Once we know something, we find it hard to imagine what it was like not to know it. Our knowledge has 'cursed' us. And it becomes difficult for us to share our knowledge with others, because we can't readily re-create our listeners' state of mind."
So it goes with communicating complex information. Because we have all the background knowledge and understanding, we're overconfident that what we're saying is clear to everyone else. WE know what we mean! Why don't they get it? It's tough to remember that other people won't make the same inferences, have the same word-meaning connections, or share our associations.
It's particularly important in science education. The more time a person spends in a field, the more the field's obscure language becomes second nature. Without special attention, audiences might not understand the words being used - or worse yet, they might get the wrong impression.
Over at the American Geophysical Union blog, Callan Bentley gives a fantastic list of Terms that have different meanings for scientists and the public.
What great examples! Even though the scientific terms are technically correct in context, they're obviously the wrong ones to use when talking to the public about climate change. An inattentive scientist could know all the material but leave the audience walking away with the wrong message.
We need to make the effort to phrase ideas in a way the audience will understand. Is that the same as "dumbing down" a message? After all, complicated ideas require complicated words and nuanced answers, right? Well, no. A real expert on a topic can give a simple distillation of the material, identifying the core of the issue. Bentley did an outstanding job rephrasing technical, scientific terms in a way that conveys the intended message to the public.
That's not dumbing things down; it's showing a mastery of the concepts. And he was able to do it by overcoming the "curse of knowledge," seeing the issue from other people's perspective. Kudos to him - it's an essential part of science education, and something I really admire.
P.S. - By the way, I chose that image for a reason: I bet once you see the baby in the tree you won’t be able to ‘unsee’ it. (image via Richard Wiseman)
Imagine that the tapper is familiar with a song and expects the listener to be familiar with it, but he doesn’t know the song’s title, or any word that he would expect the listener to recognise as a name for it. If he taps out a rhythm from the song, he will have the song ‘playing’ in his head as he does so. This might lead him to overestimate how accurate the rhythm is, or how easy the song is to distinguish from that rhythm alone. When he hears the rhythm that he is tapping out, internally he also hears the song, making it very difficult to judge what it would be like for someone just to hear the rhythm.
Although it would be difficult to create an experiment in which knowledge of a popular song’s title and “musical knowledge” of the song are rigorously separated, we see that the tappers have two kinds of knowledge that the listener lacks: the song title, and musical knowledge of the song – i.e. the ability to replay the song with decent fidelity in one’s mind.
Chip and Dan Heath suggest that knowledge of the song’s title is the critical knowledge that causes the tappers to be overconfident about how likely the listeners are to recognise the song. This is a strong claim about how the human brain works, but we don’t yet understand the brain in that much detail; as far as I am aware, there is no particular neuroscientific evidence that should cause us to believe that knowledge of the song title, rather than musical knowledge of the song, is the crux of the problem. Therefore I am not inclined to view their explanation as authoritative, and (assuming that there isn’t just some problem with the experiment) on the strength of introspection I lean towards the idea that musical knowledge has the greater effect in causing this overconfidence.
The failure to try to correct for this bias could be regarded as an example of irrationally expecting short inferential distances, but I see that bias as relating more to the understanding of words and concepts - complex areas of the map, charted on one person’s map in a particular way but not on another’s. I think a better fit would be the mind projection fallacy: the error of projecting the properties of one’s own mind onto the external world. In this case the property being projected is the person’s internal soundtrack accompanying the rhythm that he is physically tapping.
Actually, expecting short inferential distances is a special case of the mind projection fallacy – the property being projected is the detailed structure of words and reductions in someone’s map of the territory; the person in question fails to realise that the reductions, the causal relationships and the subtle definitional changes that accompany words in his mental model do not accompany those words when they are processed by the brains of other people.
So, “the curse of knowledge” is essentially the problem of the mind projection fallacy as it applies to a given person’s level of knowledge (of various kinds) with respect to the knowledge possessed by others, and this knowledge need not be encoded verbally.