A few examples (in approximately increasing order of controversy):
If you proceed anyway...
- Identify knowledge that may be dangerous. Forewarned is forearmed.
- Try to cut dangerous knowledge out of your decision network. Don’t let it influence other beliefs or your actions without your conscious awareness. You can’t succeed completely at this, but it might help.
- Deliberately lower dangerous priors, by acknowledging the possibility that your brain is contaminating your reasoning and then overcompensating, because you know that you’re still too overconfident.
- Spend a disproportionate amount of time seeking contradictory evidence. If believing something could have a great cost to your values, make a commensurately great effort to be right.
- Just don’t do it. It’s not worth it. And if I found out, I’d have to figure out where you live, track you down, and kill you.
"A little learning is a dang'rous thing;
Drink deep, or taste not the Pierian spring:
There shallow draughts intoxicate the brain,
And drinking largely sobers us again."
-- Pope
That sounds like my (provisional) resolution of the conflict between "using all you know" and "don't be a bigot": you should incorporate the likelihood ratio of traits a person can't control, so long as you also observe and incorporate evidence that could outweigh such statistical, aggregate, nonspecific knowledge.
So drink deep (use all the evidence), but if you won't, then avoiding "dangerous knowledge" is a second-best alternative. Apply only a low Bayes factor for something someone didn't choose, as long as you give them a chance to counteract it with other evidence.
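The proposal above can be sketched as a toy odds-form Bayesian update. All numbers here are invented for illustration: a deliberately modest Bayes factor for aggregate, unchosen-trait evidence, followed by individual-specific evidence that is allowed to outweigh it.

```python
# Toy illustration of the comment's proposal: apply a modest
# likelihood ratio for nonspecific group evidence, then let
# person-specific evidence dominate. All factors are made up.

def update_odds(prior_odds, *bayes_factors):
    """Multiply prior odds by each likelihood ratio in turn."""
    odds = prior_odds
    for bf in bayes_factors:
        odds *= bf
    return odds

def odds_to_prob(odds):
    """Convert odds back to a probability."""
    return odds / (1 + odds)

prior_odds = 1.0     # 50/50 before any evidence
group_bf = 1.5       # weak, aggregate, unchosen-trait evidence (kept low)
specific_bf = 0.1    # strong individual evidence pointing the other way

p_group_only = odds_to_prob(update_odds(prior_odds, group_bf))
p_with_specific = odds_to_prob(update_odds(prior_odds, group_bf, specific_bf))

print(p_group_only)      # aggregate evidence alone nudges belief up slightly
print(p_with_specific)   # specific evidence swamps the aggregate factor
```

Because the group-level factor is kept close to 1, a single strong piece of individual evidence reverses the update, which is the "chance to counteract it" the comment asks for.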
(Poetry still sucks, though. I'm not yet changing my mind about that.)