So you think you want to be rational, to believe what is true even when sirens tempt you? Great, get to work; there's lots you can do. Do you want to justifiably believe that you are more rational than others, smugly knowing your beliefs are more accurate? Hold on; this is hard.
Humans nearly universally find excuses to believe that they are more correct than others, at least on the important things. They point to others' incredible beliefs, to biases afflicting others, and to estimation tasks where they are especially skilled. But they forget that almost everyone can point to such things.
But shouldn't you get more rationality credit if you spend more time studying common biases, statistical techniques, and the like? Well, this would be good evidence of your rationality if you were in fact pretty rational about your rationality, i.e., if you knew that when you read or discussed such issues your mind would then systematically, broadly, and reasonably incorporate those insights into your reasoning processes.
But what if your mind is far from rational? What if your mind is likely to just go through the motions of studying rationality to allow itself to smugly believe it is more accurate, or to bond you more closely to your social allies?
It seems to me that if you are serious about actually being rational, rather than just believing in your rationality or joining a group that thinks itself rational, you should try hard and often to test your rationality. But how can you do that?
To test the rationality of your beliefs, you could sometimes declare beliefs, and later score those beliefs via tests where high scoring beliefs tend to be more rational. Better tests are those where scores are more tightly and reliably correlated with rationality. So, what are good rationality tests?
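One concrete way to run such a test, as a rough sketch, is to record each declared belief as a probability and later score it against the actual outcome with a proper scoring rule such as the Brier score. The function and sample numbers below are just illustrative assumptions, not a prescribed procedure:

```python
# A minimal sketch: score declared beliefs with the Brier score.
# Assumes each belief was recorded as a probability and the outcome
# is later learned; lower scores indicate better calibration.

def brier_score(predictions):
    """predictions: list of (probability_assigned, outcome) pairs,
    where outcome is True or False."""
    return sum((p - (1.0 if outcome else 0.0)) ** 2
               for p, outcome in predictions) / len(predictions)

# Example: three declared beliefs scored after the fact.
beliefs = [(0.9, True), (0.7, False), (0.6, True)]
print(brier_score(beliefs))  # 0.22 -- lower is better
```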
Karma score (and voting up/down) could also be a measure of rationality versus affiliation. Whether your score moves at all may matter more than which direction it moves as a clue to your affiliation drive versus your rationality drive, given a little context and some scrupulous introspection.
I don't know if karma is itself a good measure of rationality, but it might be a good subject to train calibration on. E.g., whenever you make a post or comment, there could be an optional field where you enter your expected value and SD for what its score will be one week later.
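As a sketch of how that calibration could later be checked, assuming each prediction is stored as a (mean, SD) pair alongside the eventual score (the function names and sample numbers are hypothetical), the standardized errors should average near zero with a spread near one if your predictions are well calibrated:

```python
# A minimal sketch of checking calibration on karma predictions,
# assuming each prediction is an (expected_score, sd) pair and the
# actual score is observed a week later.
import math

def standardized_errors(predictions):
    """predictions: list of (expected_score, sd, actual_score) tuples."""
    return [(actual - mean) / sd for mean, sd, actual in predictions]

def calibration_summary(predictions):
    errs = standardized_errors(predictions)
    n = len(errs)
    mean_err = sum(errs) / n
    spread = math.sqrt(sum((e - mean_err) ** 2 for e in errs) / n)
    return mean_err, spread  # well calibrated: mean near 0, spread near 1

# Example: three predicted (mean, SD) pairs vs. scores one week later.
history = [(10, 5, 14), (3, 2, 1), (20, 10, 12)]
print(calibration_summary(history))
```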