A few examples (in approximately increasing order of controversy):
If you proceed anyway...
- Identify knowledge that may be dangerous. Forewarned is forearmed.
- Try to cut dangerous knowledge out of your decision network. Don’t let it influence other beliefs or your actions without your conscious awareness. You can’t succeed completely at this, but it might help.
- Deliberately lower dangerous priors: acknowledge that your brain may be contaminating your reasoning, then overcorrect, on the assumption that you are still too overconfident. (A rough numerical sketch of what this discounting looks like follows this list.)
- Spend a disproportionate amount of time seeking contradictory evidence. If believing something could have a great cost to your values, make a commensurately great effort to be right.
- Just don’t do it. It’s not worth it. And if I found out, I’d have to figure out where you live, track you down, and kill you.
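As a rough illustration of the prior-lowering suggestion above, here is a minimal sketch of how a deliberate discount on a suspect prior carries through a Bayesian update. All of it is assumed for illustration: the probabilities, the `discount` factor, and the `posterior` helper are hypothetical choices, not anything from the original discussion.

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Hypothetical numbers: the "dangerous" hypothesis H feels 30% likely,
# but you suspect motivated reasoning inflated that estimate.
naive_prior = 0.30
discount = 0.5  # assumed overcorrection factor, chosen arbitrarily
adjusted_prior = naive_prior * discount

# Identical evidence (P(E|H)=0.8, P(E|~H)=0.2) updated against both
# priors: the deliberate discount carries through to a lower posterior.
print(round(posterior(naive_prior, 0.8, 0.2), 2))     # 0.63
print(round(posterior(adjusted_prior, 0.8, 0.2), 2))  # 0.41
```

The only point is that an explicit discount applied to a prior you distrust propagates into every subsequent update, rather than being silently washed out by the evidence.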
WrongBot: Brendan Nyhan, the Robert Wood Johnson Scholar in health policy research at the University of Michigan, spoke today on NPR's "Talk of the Nation" about a bias that may be reassuring to you. He calls it the "backfire effect": new research suggests that misinformed people rarely change their minds when presented with the facts, and often become even more attached to their beliefs. The Boston Globe reviews the findings here as they pertain to politics.

If this is correct, then if you hold strong anti-bigot beliefs and are exposed to "dangerous factual thoughts" that might conceivably sway you toward bigotry, the backfire effect should make you defend your original views even more vigorously, acting as a protective bias. OTOH, while listening, I wondered: is Nyhan saying that the only factual positions one can adopt are those about which one had no previous opinion or knowledge?

Best wishes in overcoming your phobia.