A few examples (in approximately increasing order of controversy):
If you proceed anyway...
- Identify knowledge that may be dangerous. Forewarned is forearmed.
- Try to cut dangerous knowledge out of your decision network. Don’t let it influence other beliefs or your actions without your conscious awareness. You can’t succeed completely at this, but it might help.
- Deliberately lower dangerous priors, by acknowledging the possibility that your brain is contaminating your reasoning and then overcompensating, because you know that you’re still too overconfident.
- Spend a disproportionate amount of time seeking contradictory evidence. If believing something could have a great cost to your values, make a commensurately great effort to be right.
- Just don’t do it. It’s not worth it. And if I found out, I’d have to figure out where you live, track you down, and kill you.
I think we are seeing this among the (for now, fortunately) small group of relatively intelligent nonconformist people who change their opinion on this subject once they look at the data.
It biases them towards unduly sympathetic judgements of everyone else who happens to hold the same opinion.
Or, eventually, leaking nonconformist, driven, principled (as in truth-seeking even when it costs them status), intelligent people to otherwise unworthy causes? This may prove dangerous in the long term.
One can't overestimate the propaganda value of calling a well-intentioned lie out as a lie and then proving that it actually is, you know, a lie. Our biases make us very vulnerable to becoming overly suspicious of someone who has been shown to be a liar. This is doubly true of our tendency to question their motives.