In Techniques of the Selling Writer, Dwight V. Swain gives advice on receiving advice:
George Abercroft is an action writer. "Start with a fight!" is his motto. And for him, it works.
But Fred Friggenheimer's witch-cult yarn, as he conceives it, puts heavy emphasis on atmosphere. The fight he tries to stick in like a clove in a ham at the beginning, following George's rule, destroys the mood - and the story.
Even with your own rules, indeed, you must be careful. Because somehow, subtly, they may not apply to this explicit situation. [...]
How do you tell whether a rule is good or not, in terms of a specific problem?
Answer: Find out the reason the rule came into being. What idea or principle stands behind it? [...]
Take George's rule about starting every story with a fight. It's born of George's markets - men's magazines in which the emphasis is on fast, violent action, with blood on page one an absolute must.
If Fred only realized that fact, he'd ignore George's rule when he himself writes a mood-geared story.
One way to reduce the damage done by cached thoughts is to cultivate the habit of asking questions about a thought's origin. Do you remember where you heard it? Did it come from someone practicing good epistemic hygiene, or from someone who unthinkingly passes on anything they hear? If somebody offered advice based on their own experiences, how representative are those experiences? What kinds of experiences prompted the advice? Are there alternative ways of interpreting them? Or if you're offering advice that you came up with yourself, what situation led you to it, and how far does it generalize?
So far I have mostly been framing this as a way to notice flaws in seemingly good advice. But there's also an opposite angle: finding gems in seemingly worthless information.
All outcomes are correlated with their causes; most statements are evidence of something. Michael Vassar once gave the example of a tribe who believed that faeries existed, that they lived in a nearby forest, and that you could see them once you grew old enough. It later turned out that the tribe had a hereditary eye disease which caused people to see things out of the corners of their eyes as they aged. The tribe's theory of what was going on was wrong, but it was still based on true data about the real world. A scientifically minded person could have figured out what was happening by being sufficiently curious about the data that generated the belief.
If you’re interested in being on the right side of disputes, you will refute your opponents’ arguments. But if you’re interested in producing truth, you will fix your opponents’ arguments for them. To win, you must fight not only the creature you encounter; you must fight the most horrible thing that can be constructed from its corpse. -- Black Belt Bayesian
Some people stop reading a text as soon as they come across a blatantly incorrect statement. I mind much less: the author may be generally mistaken, yet still have worthwhile points mixed in. Folk theories can be useful even when they're entirely wrong. What you're reading is somebody's interpretation of an event, and that interpretation carries information about the event even if it's mistaken. Can you come up with a better interpretation?
Maybe you disagree with something that I've said here? In that case, what data do you think generated this advice? What conclusions would you derive instead?
The "Black Belt Bayesian" website appears to have been hijacked into offering malware.
Cute story; citation needed.
No idea about the tribe, but the rest sounds like http://en.wikipedia.org/wiki/Charles_Bonnet_syndrome