Something very weird happened to me today, after I had read this paragraph in the article yesterday:
Another particularly well-documented case of the persistence of mistaken beliefs despite extensive corrective efforts involves the decades-long deceptive advertising for Listerine mouthwash in the U.S. Advertisements for Listerine had falsely claimed for more than 50 years that the product helped prevent or reduce the severity of colds and sore throats.
I hadn't known before that Listerine had claimed to alleviate colds and sore throats. This morning, as I was using my Listerine mouthwash, I felt as though the Listerine was helping my sore throat. Not deliberately, of course, but instantaneously. And my mind also instantaneously constructed a picture of the mouthwash killing germs in my throat. This happened after I learned about the claim from a source whose only reason for mentioning it was that it was false. From a source about the dangers of misinformation.
Misinformation is more insidious than I suspected.
For example, Berinsky (2012) found that among Republicans, corrections of the death-panel myth were effective primarily when they were issued by a Republican politician. However, judgments of a source’s credibility are themselves a function of beliefs: If you believe a statement, you judge its source to be more credible (Fragale & Heath, 2004). This interaction between belief and credibility judgments can lead to an epistemic circularity, whereby no opposing information is ever judged sufficiently credible to overturn dearly held prior knowledge. For example, Munro (2010) has shown that exposure to belief-threatening scientific evidence can lead people to discount the scientific method itself: People would rather believe that an issue cannot be resolved scientifically, thus discounting the evidence, than accept scientific evidence in opposition to their beliefs.
Amusingly, after I had read this bit, I realized that I had been trusting this article and its authors more because the claims it made sounded credible. That's a pretty vicious cycle: if somebody says things that support your beliefs, you judge them to be more credible, and then your beliefs get stronger because credible sources support them... maybe I should try to correct for that by trusting this article less. :)
Caveats:
I haven't figured out a way to test this
Sample size of one
However, I believe I can often feel a characteristic affective state when my beliefs have just updated, and use that to trigger a manual review where I try to figure out what beliefs changed, what triggered the change, and whether I consciously endorse the change.
Provide an explicit warning before mentioning a myth ... Use clear language and graphs where appropriate. If the myth is simpler and more compelling than your debunking, it will be cognitively more attractive, and you will risk an overkill backfire effect.
This crystallised a few of the reasons I respect the Sequences as excellent writing.
http://psi.sagepub.com/content/13/3/106.full
This is a fascinating article with many, many interesting points. I'm excerpting some of them below, but mostly just to get you to read it: if I were to quote everything interesting, I'd have to pretty much copy the entire (long!) article.