A few examples (in approximately increasing order of controversy):
If you proceed anyway...
- Identify knowledge that may be dangerous. Forewarned is forearmed.
- Try to cut dangerous knowledge out of your decision network. Don’t let it influence other beliefs or your actions without your conscious awareness. You can’t succeed completely at this, but it might help.
- Deliberately lower dangerous priors: acknowledge the possibility that your brain is contaminating your reasoning, then overcompensate, since you know you're still too overconfident.
- Spend a disproportionate amount of time seeking contradictory evidence. If believing something could have a great cost to your values, make a commensurately great effort to be right.
- Just don’t do it. It’s not worth it. And if I found out, I’d have to figure out where you live, track you down, and kill you.
Thanks to Heisenberg's uncertainty principle, your information is also always incomplete. And in real life you lack the mathematical and computational ability to simulate the interactions of many systems.
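For the computational half of that claim, here is a minimal sketch (my illustration, not part of the original discussion; the function name and the component counts are invented for the example). Even when each subsystem is trivially simple, the number of joint configurations grows exponentially with the number of interacting components, which is why brute-force simulation of many-system interactions is hopeless:

```python
# Illustrative only: the joint state space of interacting systems grows
# exponentially, so exact simulation quickly becomes infeasible.

def joint_states(n_components: int, states_each: int = 2) -> int:
    """Number of joint configurations of n components,
    each with `states_each` possible states."""
    return states_each ** n_components

for n in (10, 50, 100):
    # Even with only two states per component, 100 components already
    # have ~1.3e30 joint configurations -- far beyond brute force.
    print(f"{n} components -> {joint_states(n):.3e} joint states")
```

And a real social interaction involves components with far more than two states each, so the explosion is even faster than this toy calculation suggests.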
Whether weak reductionism is true doesn't matter much for this debate. People who believe in strong reductionism find PUA theory appealing.
They believe that they have sufficient mental resources and information to calculate complex social interactions well enough to optimize them.
Because of that belief in strong reductionism, they accept PUA on the basis of anecdotal evidence while rejecting acupuncture, even though it rests on the same kind of anecdotal evidence.