A few examples (in approximately increasing order of controversy) of what to do if you proceed anyway:
- Identify knowledge that may be dangerous. Forewarned is forearmed.
- Try to cut dangerous knowledge out of your decision network. Don’t let it influence other beliefs or your actions without your conscious awareness. You can’t succeed completely at this, but it might help.
- Deliberately lower dangerous priors: acknowledge the possibility that your brain is contaminating your reasoning, then overcompensate, since even after correcting you're probably still too confident. (A toy numerical sketch follows this list.)
- Spend a disproportionate amount of time seeking contradictory evidence. If believing something could have a great cost to your values, make a commensurately great effort to be right.
- Just don’t do it. It’s not worth it. And if I found out, I’d have to figure out where you live, track you down, and kill you.
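
For the quantitatively inclined, here is a minimal sketch of what "deliberately lowering a prior" amounts to in Bayesian terms. Every number in it (the 0.6 felt prior, the 0.2 discount, the likelihoods) is an illustrative assumption of mine, not anything from the advice above:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H | E) from the prior P(H) and the two likelihoods."""
    numerator = prior * p_e_given_h
    return numerator / (numerator + (1 - prior) * p_e_given_not_h)

# A "dangerous" hypothesis your brain may be motivated to believe.
raw_prior = 0.6    # how plausible the idea feels from the inside
discount = 0.2     # assumed overcompensation factor, chosen for illustration
adjusted_prior = raw_prior * discount

# Identical evidence, two starting points.
posterior_raw = bayes_update(raw_prior, p_e_given_h=0.8, p_e_given_not_h=0.3)
posterior_adj = bayes_update(adjusted_prior, p_e_given_h=0.8, p_e_given_not_h=0.3)

print(f"posterior from the felt prior:       {posterior_raw:.2f}")  # 0.80
print(f"posterior from the discounted prior: {posterior_adj:.2f}")  # 0.27
```

The point of the toy numbers: the same evidence that would push the felt prior to 0.80 leaves the discounted prior at 0.27, so the overcompensation survives an ordinary amount of confirming evidence.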
Bryan Caplan argues against the "corrupted by power" idea with an alternative view: the powerful were corrupt from the start, which is why they were willing to go to such extremes to attain power.
Around the time I stopped believing in God and objective morality, I came around to Stirner's view: such values are "geists" haunting the mind, often distracting us from factual truths. Just as I had stopped reading fiction for reasons of epistemic hygiene, I decided that chucking morality would serve a similar purpose. I certainly wouldn't trust myself to selectively filter factual information. How can the uninformed know what to be uninformed about?