In response to comment by [deleted] on Overcoming the Curse of Knowledge
Comment author: Nornagest 18 October 2011 10:21:31PM 2 points [-]

A fine sentiment, but the emoticon takes something from it.

Comment author: JesseGalef 18 October 2011 11:48:05PM 8 points [-]

Indeed - when I was young, we didn't use emoticons. We typed "emote smile" and let the MUD client fill in the rest.

... Too nerdy?

Comment author: byrnema 18 October 2011 10:39:56PM 4 points [-]

They weren't playing the game right. The way to correctly play the game, especially among siblings, is for a person to always pick the same song. For example, if I'm tapping, I'm tapping 'Jingle Bells'. And we have a near 100% success rate. (It is not quite 100% due to the initial learning curve, but the success rate then steadily improves over time.)

That said... I'm tapping a tune as I type on this keyboard. Can you guess it??

Comment author: JesseGalef 18 October 2011 11:45:09PM *  3 points [-]

Love it!

That brings to mind a fantastic set of posts on Mind Your Decisions (game theory blog) about focal points and coordination problems. If there's anything identifying about one of the songs - even being first on the list - it's a good idea to choose that one.

... Man, I bet psych researchers hate people like us.

Comment author: gwern 18 October 2011 08:35:49PM 1 point [-]

Interesting post, but one of your commenters was right, I think - at least, I thought I knew all the active PBers, and you don't seem to be one of them.

Comment author: JesseGalef 18 October 2011 08:47:31PM 3 points [-]

I've mostly been using it to track my predictions about the winner of each football game, but I have my preferences set to keep predictions private.

As expected, I'm inappropriately confident at most levels of "confidence feeling" except the very high levels, where my accuracy is better attributed to luck and a small sample size.

Comment author: gwern 18 October 2011 06:50:50PM 15 points [-]

No links for 'inferential distance'? The phrase itself is an inferential distance...

Comment author: JesseGalef 18 October 2011 07:28:53PM *  4 points [-]

Inferential distance is an extremely handy phrase. I was actually unaware of it (an example of inferential distance?) until today, but it's definitely related!

(On an off-topic note, this is my first post on LW and my first chance to tell you that I mentioned you in a post I wrote when I found Prediction Book: "This site isn't new to rationalists: Eliezer and the LessWrong community noticed it a couple years ago, and LessWrong'er Gwern has been using it to – among other things – track inTrade predictions.")

Comment author: billswift 18 October 2011 06:13:50PM 4 points [-]

You mean overconfident by a factor of 20.

Comment author: JesseGalef 18 October 2011 06:21:18PM *  3 points [-]

Thanks, good catch!

[EDIT: For the record, I had accidentally written "by a factor of 40." I corrected it in the article for future readers.]

Overcoming the Curse of Knowledge

42 JesseGalef 18 October 2011 05:39PM

[crossposted at Measure of Doubt]

What is the Curse of Knowledge, and how does it apply to science education, persuasion, and communication? No, it's not a reference to the Garden of Eden story. I'm referring to a particular psychological phenomenon that can make our messages backfire if we're not careful.

Communication isn't a solo activity; it involves both you and the audience. Writing a diary entry is a great way to sort out thoughts, but if you want to be informative and persuasive to others, you need to figure out what they'll understand and be persuaded by. A common habit is to use ourselves as a mental model - assuming that everyone else will laugh at what we find funny, agree with what we find convincing, and interpret words the way we use them. The model works to an extent - especially with people similar to us - but other times our efforts fall flat. You can present the best argument you've ever heard, only to have it fall on dumb - sorry, deaf - ears.

That's not necessarily your fault - maybe they're just dense! Maybe the argument is brilliant! But if we want to communicate successfully, pointing fingers and assigning blame is irrelevant. What matters is getting our point across, and we can't do it if we're stuck in our head, unable to see things from our audience's perspective. We need to figure out what words will work.

Unfortunately, that's where the Curse of Knowledge comes in. In 1990, Elizabeth Newton did a fascinating psychology experiment: She paired participants into teams of two: one tapper and one listener. The tappers picked one of 25 well-known songs and would tap out the rhythm on a table. Their partner - the designated listener - was asked to guess the song. How do you think they did?

Not well. Of the 120 songs tapped out on the table, the listeners only guessed 3 of them correctly - a measly 2.5 percent. But get this: before the listeners gave their answer, the tappers were asked to predict how likely their partner was to get it right. Their guess? Tappers thought their partners would get the song 50 percent of the time. You know, only overconfident by a factor of 20. What made the tappers so far off?
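The arithmetic behind that "factor of 20" is worth making explicit. As a quick sanity check (using the figures from the study as reported above):

```python
# Newton's tapping study: actual listener success vs. tappers' predictions.
correct = 3        # songs guessed correctly
total = 120        # songs tapped out
actual_rate = correct / total       # 0.025, i.e. 2.5 percent

predicted_rate = 0.50               # tappers' average prediction

# How many times more confident were the tappers than they should have been?
overconfidence = predicted_rate * total / correct   # roughly 20
print(f"Actual: {actual_rate:.1%}, predicted: {predicted_rate:.0%}, "
      f"overconfident by a factor of {overconfidence:g}")
```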

They lost perspective because they were "cursed" with the additional knowledge of the song title. Chip and Dan Heath use the story in their book Made to Stick to introduce the term:

"The problem is that tappers have been given knowledge (the song title) that makes it impossible for them to imagine what it's like to lack that knowledge. When they're tapping, they can't imagine what it's like for the listeners to hear isolated taps rather than a song. This is the Curse of Knowledge. Once we know something, we find it hard to imagine what it was like not to know it. Our knowledge has "cursed" us. And it becomes difficult for us to share our knowledge with others, because we can't readily re-create our listeners' state of mind."

So it goes with communicating complex information. Because we have all the background knowledge and understanding, we're overconfident that what we're saying is clear to everyone else. WE know what we mean! Why don't they get it? It's tough to remember that other people won't make the same inferences, have the same word-meaning connections, or share our associations.

It's particularly important in science education. The more time a person spends in a field, the more the field's obscure language becomes second nature. Without special attention, audiences might not understand the words being used - or worse yet, they might get the wrong impression.

Over at the American Geophysical Union blog, Callan Bentley gives a fantastic list of Terms that have different meanings for scientists and the public.

What great examples! Even though the scientific terms are technically correct in context, they're obviously the wrong ones to use when talking to the public about climate change. An inattentive scientist could know all the material but leave the audience walking away with the wrong message.

We need to spend the effort to phrase ideas in a way the audience will understand. Is that the same as "dumbing down" a message? After all, complicated ideas require complicated words and nuanced answers, right? Well, no. A real expert on a topic can give a simple distillation of material, identifying the core of the issue. Bentley did an outstanding job rephrasing technical, scientific terms in a way that conveys the intended message to the public.

That's not dumbing things down, it's showing a mastery of the concepts. And he was able to do it by overcoming the "curse of knowledge," seeing the issue from other people's perspective. Kudos to him - it's an essential part of science education, and something I really admire.

P.S. - By the way, I chose that image for a reason: I bet once you see the baby in the tree you won’t be able to ‘unsee’ it. (image via Richard Wiseman)

Comment author: pedanterrific 18 October 2011 05:33:24AM 1 point [-]

Are you volunteering for the post of LessWrong's DADA professor? The space is open if you want it, though Yvain has previously submitted an application. It should also be noted that a certain someone doesn't seem interested in the job (probably a good thing, on balance).

Comment author: JesseGalef 18 October 2011 05:34:46AM 4 points [-]

That depends - would I die horribly and mysteriously after a year?

Comment author: Eliezer_Yudkowsky 18 October 2011 04:58:43AM *  20 points [-]

Some questions to ask:

  • Am I making people stronger, or weaker?
  • What would they think if they knew exactly what I was doing?
  • If lots of people used this technique, would the world be better off or worse off? Is that already happening and am I just keeping pace? Am I being substantially less evil than average?
  • Is this the sort of Dark Art that corrupts anything it touches (like telling people to have faith) or is it more neutral toward the content conveyed (like using colorful illustrations or having a handsome presenter speak a talk)?

(I've recently joked that SIAI should change its motto from "Don't be jerks" to "Be less evil than Google".)

Comment author: JesseGalef 18 October 2011 05:27:22AM 3 points [-]

Great questions!

Regarding the second one, "What would [people] think if they knew exactly what I was doing?" - I absolutely agree that it's important as a pragmatic issue. If someone will get upset by a technique - justified or not - we need to factor that into the decision to use it.

But do you think their discomfort is a sign that the technique is unethical in any meaningful sense, or merely socially frowned upon? Society tends to form its conventions for a reason, but those reasons aren't necessarily tied to a consistent conception of morality.

That said, I agree that if people get upset by a practice, it's a good warning sign that the practice could be unethical and merits careful thought. ...Which could be exactly what you meant by asking the question.

By the way, I'm looking forward to meeting you at Skepticon next month - I'll be moderating a panel you'll be on!

Comment author: lessdazed 18 October 2011 04:41:11AM *  0 points [-]

Particular persuasion techniques are called different things depending on whether they are used ethically.

Comment author: JesseGalef 18 October 2011 04:49:07AM 0 points [-]

That's one useful way to make the distinction - and honestly, probably the one I lean toward. It's likely how I'd use the words, but even so, I'm trying to figure out whether there's a sensible and coherent way to call a persuasion technique unethical as a reflection on the technique itself, rather than solely on its consequences.

I've thought about it another way: if a particular technique is far easier (and more likely) to use in a way that reduces utility than in a positive way, society should be wary of it, and perhaps call it an unethical practice. I'm thinking of some alleged pick-up artist techniques that are based on lowering a woman's self-esteem and sense of self-worth. (Disclaimer: this is second- or third-hand information about PUA, so I could be misrepresenting it. Regardless of whether it's actually practiced by PUAs, the hypothetical holds.)

Comment author: pedanterrific 18 October 2011 04:29:24AM *  2 points [-]

Bienvenidos, Jesse!

"Does it make sense to call a particular persuasion technique unethical? Or does it entirely depend on how it's used?"

You may or may not be aware, but this has been discussed at some length around these parts; Dark Arts is an okay summary. (Edit: A particularly good post on the subject is NTLing.) If you've already read it and think the topic could stand more elaboration, though, I'm with you.

Oh, and "professional atheist"? Totally awesome.

Comment author: JesseGalef 18 October 2011 04:41:48AM 0 points [-]

Thanks for the tip!

I've come across some of this material, but haven't read it in a systematic way. I very occasionally refer to persuasion as 'the dark arts' - I think that phrase/connection came from LW originally.

Earlier this year I gave a talk on the psychology of persuasion, synthesizing some of the fascinating studies that have been done. Rather than present the most blatant techniques as manipulation, I framed them as known weaknesses in our minds that could be exploited if we weren't wary and aware. Thus: defense against dark arts. Combining rationality and Harry Potter! Hey, that would be a great fanfiction! (Yes, I'm aware of Harry Potter and the Methods of Rationality and have done my best to spread it far and wide.)

Thanks for the support regarding my job: I've loved doing it and hope to do more for the secular movement!
