A few examples (in approximately increasing order of controversy):
If you proceed anyway...
- Identify knowledge that may be dangerous. Forewarned is forearmed.
- Try to cut dangerous knowledge out of your decision network. Don’t let it influence other beliefs or your actions without your conscious awareness. You can’t succeed completely at this, but it might help.
- Deliberately lower dangerous priors, by acknowledging the possibility that your brain is contaminating your reasoning and then overcompensating, because you know that you’re still too overconfident.
- Spend a disproportionate amount of time seeking contradictory evidence. If believing something could have a great cost to your values, make a commensurately great effort to be right.
- Just don’t do it. It’s not worth it. And if I found out, I’d have to figure out where you live, track you down, and kill you.
The rest of the post was good, but these claims seem far too anecdotal and too reliant on the availability heuristic to justify walling yourself off from an entire area of inquiry.
When well-meaning, intelligent people like yourself refuse to examine certain areas of controversy, you consign those discourses to people with less-enlightened social attitudes. When certain beliefs are outlawed, only outlaws will hold those beliefs.
SarahC has raised some alternative ideas about how people may respond to dangerous knowledge.
As for:
Why are you so comfortable with such a hasty generalization? I'm not extremely widely read on the subject of group differences, but I've run into some writing on the subject by people who don't seem to be bigots. See Gender, Nature, and Nurture by Richard Lippa, for instance.
Why would you make a hasty generalization and then shut yourself off from evidence that could disconfirm it?
Your post itself demonstrates this. You are accepting certain empirical and moral beliefs that have not been justified, such as the notion of cognitive equality between groups. Regardless of whether this hypothesis is true or not, it seems to get inordinately privileged for ideological reasons. (In my view, suspended judgment on group differences is a more rational initial attitude.)
Privileging certain hypotheses for mainly ideological reasons is not rationality, even when your ideology is really warm and fuzzy.
If you are comfortable freezing your belief system in certain areas, that's a strong symptom that your mind got hacked somewhere, and that the virus is so bad it has disabled your epistemic immune system.
Personally, like simplicio, I'm not comfortable pulling an ostrich maneuver and basing my values on empirical notions that could turn out to be false. What a great way to destroy my own conviction in my values! I would prefer to investigate these subjects, even at the risk of shaking up my values. So far, like SarahC, I haven't found my values to be shaken up all that much (though maybe I'm biased in that perception).
I think it may be helpful to clearly distinguish between epistemic and instrumental rationality. The idea proposed in this post is actively detrimental to the pursuit of epistemic rationality; I should have acknowledged that more clearly up front.
But if one is more concerned with instrumental rationality ("winning"), then perhaps there is more value here. If you've designated a particular goal state as a winning one and then, after playing for a while, unconsciously decided to change which goal state counts as a win, then from the perspective of the you that began the game, you've lost.
I do agree that my last example was massively under-justified, especially considering the breadth of the claim.