In Techniques of the Selling Writer, Dwight W. Swain gives advice on receiving advice:
George Abercroft is an action writer. "Start with a fight!" is his motto. And for him, it works.
But Fred Friggenheimer's witch-cult yarn, as he conceives it, puts heavy emphasis on atmosphere. The fight he tries to stick in like a clove in a ham at the beginning, following George's rule, destroys the mood - and the story.
Even with your own rules, indeed, you must be careful. Because somehow, subtly, they may not apply to this explicit situation. [...]
How do you tell whether a rule is good or not, in terms of a specific problem?
Answer: Find out the reason the rule came into being. What idea or principle stands behind it? [...]
Take George's rule about starting every story with a fight. It's born of George's markets - men's magazines in which the emphasis is on fast, violent action, with blood on page one an absolute must.
If Fred only realized that fact, he'd ignore George's rule when he himself writes a mood-geared story.
One way to reduce the damage done by cached thoughts is to cultivate a habit of asking questions about a thought's origin. Do you remember where you heard it? Did it come from someone practicing good epistemic hygiene, or from someone who unthinkingly passes on anything they hear? If somebody offered advice based on their own experiences, how representative are those experiences? What kinds of experiences prompted that advice? Are there alternative ways of interpreting them? Or if you're the one offering advice you came up with yourself, what situation led you to come up with it? How generalizable is it?
So far I have mostly been framing this as a way to notice flaws in seemingly good advice. But there's also an opposite angle: finding gems in seemingly worthless information.
All outcomes are correlated with causes; most statements are evidence of something. Michael Vassar once gave the example of a tribe of people who thought that faeries existed, lived in a nearby forest, and you could see them once you became old enough. It later turned out that the tribe had a hereditary eye disease which caused them to see things from the corners of their eyes once they got old. The tribe's theory of what was going on was wrong, but it was still based on some true data about the real world. A scientifically minded person could have figured out what was going on, by being sufficiently curious about the data that generated that belief.
If you’re interested in being on the right side of disputes, you will refute your opponents’ arguments. But if you’re interested in producing truth, you will fix your opponents’ arguments for them. To win, you must fight not only the creature you encounter; you must fight the most horrible thing that can be constructed from its corpse. -- Black Belt Bayesian
Some people tend to stop reading a text as soon as they come across a blatantly incorrect statement. I mind much less: yes, the author may be generally mistaken, but they may still have worthwhile points mixed in. Folk theories can be useful even when their explanations are entirely wrong. What you're reading is somebody's interpretation of an event, and that interpretation provides information about the event even when it's wrong. Can you come up with a better interpretation?
Maybe you disagree with something that I've said here? In that case, what data do you think generated this advice? What conclusions would you derive instead?
I bet you could train yourself to be good at properly remembering "I heard negative evidence against X (from whatever source)", especially if X is something you've either got existing (properly remembered or summarized) evidence for or against, or have connected to other claims. In other words, probably part of that effect is that the subjects don't accurately understand or recall the sentence they read, and they think "that sounds familiar! wasn't that what I just read from the CDC?"
An inability to remember the strength of some evidence you've heard is already crippling. Misremembering its polarity (whether it should move you toward or away from the claim, relative to your prior) is just a particularly bad instance.
What do people with this handicap actually do when they want to weigh evidence properly? Do they write it all down so they can review it, the way people find a pros/cons list helpful?
I often remember how some fact or event made me feel at the time. For instance, I'll remember being moved by a film years later, while being quite fuzzy on even the broad strokes of its plot. I'd like to exploit this sort of memory to represent the direction and strength of evidence: not just remembering being excited to read some study, but remembering its evidential value.
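One way to make "direction and strength of evidence" concrete is to write each piece of evidence down with an explicit polarity and weight, then combine them in log-odds. The sketch below is purely illustrative: the sources, weights, and prior are invented assumptions, not anything from the discussion above.

```python
import math

# Hypothetical "evidence ledger": each entry records a source, a polarity
# (+1 supports the claim, -1 opposes it), and a strength expressed as a
# log-odds contribution. All names and numbers here are made up for
# illustration.

def logit(p):
    return math.log(p / (1 - p))

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

prior = 0.5  # prior probability that claim X is true

ledger = [
    ("well-run study",      +1, 1.0),  # fairly strong support for X
    ("anecdote on a forum", -1, 0.2),  # weak evidence against X
    ("expert blog post",    +1, 0.5),  # moderate support for X
]

# Summing polarity * strength in log-odds space means neither the
# direction nor the weight of any item has to be remembered later.
posterior_logodds = logit(prior) + sum(pol * s for _, pol, s in ledger)
posterior = sigmoid(posterior_logodds)
print(f"posterior P(X) = {posterior:.3f}")
```

The point of the ledger is exactly what the comments above ask for: the polarity and strength are stored externally, so reviewing the list reproduces the update even when memory of the individual items has faded.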
Another technique that seems useful for uncertain (but interesting or important) claims that get updated over a long period of time is to use fixed nametags for them (not much more complicated than the title of this excellent post, 'What data generated that thought?'), especially when writing or talking about them.