So you think you want to be rational, to believe what is true even when sirens tempt you? Great, get to work; there's lots you can do. Do you want to justifiably believe that you are more rational than others, smugly knowing your beliefs are more accurate? Hold on; this is hard.
Humans nearly universally find excuses to believe that they are more correct than others, at least on the important things. They point to others' incredible beliefs, to biases afflicting others, and to estimation tasks where they are especially skilled. But they forget that nearly everyone can point to such things.
But shouldn't you get more rationality credit if you spend more time studying common biases, statistical techniques, and the like? Well, this would be good evidence of your rationality if you were in fact pretty rational about your rationality, i.e., if you knew that when you read or discussed such issues your mind would then systematically, broadly, and reasonably incorporate those insights into your reasoning processes.
But what if your mind is far from rational? What if your mind is likely to just go through the motions of studying rationality to allow itself to smugly believe it is more accurate, or to bond you more closely to your social allies?
It seems to me that if you are serious about actually being rational, rather than just believing in your rationality or joining a group that thinks itself rational, you should try hard and often to test your rationality. But how can you do that?
To test the rationality of your beliefs, you could sometimes declare beliefs, and later score those beliefs via tests where high scoring beliefs tend to be more rational. Better tests are those where scores are more tightly and reliably correlated with rationality. So, what are good rationality tests?
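For concreteness, here is a minimal sketch of one way such belief-scoring could work; the post doesn't specify a scoring rule, so the Brier score used here is my own illustrative choice. You record probabilistic predictions now, then score them once outcomes are known; lower scores indicate better-calibrated beliefs.

```python
# Illustrative sketch only: score declared probabilistic beliefs with the
# Brier score after outcomes are known. 0.0 is a perfect score; 0.25 is
# what you'd get from always saying 50%.

def brier_score(predictions: list[tuple[float, bool]]) -> float:
    """Mean squared error between stated probabilities and actual outcomes."""
    return sum((p - float(outcome)) ** 2 for p, outcome in predictions) / len(predictions)

# Example: three declared beliefs, scored after the fact.
declared = [
    (0.9, True),   # "90% sure candidate X wins" -- X won
    (0.7, False),  # "70% sure the project ships by June" -- it slipped
    (0.2, False),  # "20% sure it rains tomorrow" -- it didn't
]
print(f"Brier score: {brier_score(declared):.3f}")
```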
This almost seems too obvious to mention in one of Robin's threads, but I'll go ahead anyway: success on prediction markets would seem to be an indicator of rationality and/or luck. Your degree of success in a game like HubDub may give some indication as to the accuracy of your beliefs, and so (one would hope) the effectiveness of your belief-formation process.
I would expect success in a prediction market to be more correlated with the amount of time spent researching than with rationality. At best, rationality would be a multiplier on the benefit gained per hour of research; alternatively, it could be an upper bound on the total benefit gained from research.
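To make that distinction concrete, here is a toy sketch contrasting the two models; the functional forms and parameters are entirely my own assumptions, not anything the commenter specifies.

```python
# Toy illustration of the two hypotheses above: research hours h,
# rationality r in [0, 1], and the resulting "benefit" of market
# participation under each assumption.

def benefit_multiplier(hours: float, rationality: float, rate: float = 1.0) -> float:
    """Rationality multiplies the value of each hour of research."""
    return rationality * rate * hours

def benefit_capped(hours: float, rationality: float, rate: float = 1.0,
                   cap_scale: float = 10.0) -> float:
    """Research helps linearly, but rationality caps the total achievable benefit."""
    return min(rate * hours, rationality * cap_scale)

# Under the multiplier model, more research always helps; under the capped
# model, extra research stops helping once the rationality ceiling is hit.
for h in (1, 5, 20):
    print(h, benefit_multiplier(h, 0.6), benefit_capped(h, 0.6))
```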