Martial arts can be good training for ensuring your personal security, if you assume the worst about your tools and environment. If you expect to find yourself unarmed in a dark alley, or fighting hand to hand in a war, it makes sense. But most people do far better at ensuring their personal security by coordinating to live in peaceful societies and neighborhoods; they pay someone else to learn martial arts. Similarly, while "survivalists" plan and train to stay warm, dry, and fed given worst-case assumptions about the world around them, most people achieve these goals by participating in a modern economy.
The martial arts metaphor for rationality training seems popular at this website, and most discussions here about how to believe the truth seem to assume an environmental worst case: how to figure out everything for yourself given fixed info and assuming the worst about other folks. In this context, a good rationality test is a publicly-visible personal test, applied to your personal beliefs when you are isolated from others' assistance and info.
I'm much more interested in how we can join together to believe truth, and it actually seems easier to design institutions which achieve this end than to design institutions to test individual isolated general tendencies to discern truth. For example, with subsidized prediction markets, we can each specialize on the topics where we contribute best, relying on market consensus on all other topics. We don't each need to train to identify and fix each possible kind of bias; each bias can instead have specialists who look for where that bias appears and then correct it.
Perhaps martial-art-style rationality makes sense for isolated survivalist Einsteins forced by humanity's vast stunning cluelessness to single-handedly block the coming robot rampage. But for those of us who respect the opinions of enough others to want to work with them to find truth, it makes more sense to design and field institutions which give each person better incentives to update a common consensus.
Eliezer, to the extent that any epistemic progress has been made at all, was it not ever thus?
To give one example: the scientific method is an incredibly powerful tool for generating knowledge, and has been very widely accepted as such for the past two centuries.
But even a cursory reading of the history of science reveals that scientists themselves, despite having great taste in rationalist institutions, often had terrible taste in personal rationality. They were frequently petty, biased, determined to believe their own theories regardless of evidence, defamatory and aggressive towards rival theorists, etc.
Ultimately, their taste in rationalist institutions coexisted with a frequent lack of taste in personal rationality (certainly, a lack of Eliezer-level taste in personal rationality). It would have been better, no doubt, if they had had both tastes. But they didn't, and in the end, it wasn't necessary that they did.
I would also make some other points:
1. People tend to have stronger emotive attachments - and hence stronger biases - in relation to concrete issues (e.g. "is the theory I believe correct") than epistemic institutions (e.g. "should we do an experiment to confirm the theory"). One reason is that such object level issues tend to be more politicised. Another is that they tend to have a more direct, concrete impact on individual lives (N.B. the actual impact of epistemic institutions is probably much greater, but for triggering our biases, the appearance of direct action is more important (cf thought experiments about sacrificing a single identifiable child to save faceless millions)).
2. Even very object-level biased people can be convinced to follow the same institutional epistemic framework. After all, if they are convinced that the framework is a truth-productive one, they will believe it will ultimately vindicate their theory. I think this is a key reason why competing ideologies agree to free speech, why competing scientists agree to the scientific method, why (by analogy) competing companies agree to free trade, etc.
[The question of what happens when one person's theory begins to lose out under the framework is a different one, but by that stage, if enough people are following the epistemic framework, opting out may be socially impossible (e.g. if a famous scientist said "my theory has been falsified by experiment, so I am abandoning the scientific method!", they would be a laughing stock)]
3. I really worry that "everyone on Earth is irrational, apart from me and my mates" is an incredibly gratifying and tempting position to hold. The romance of the lone point of light in an ocean of darkness! The drama of leading the fight to begin civilisation itself! The thrill of the hordes of Dark Side Epistemologists surrounding the besieged outpost of reason! Who would not be tempted? I certainly am. But that is precisely why I am suspicious.