So you think you want to be rational, to believe what is true even when sirens tempt you? Great, get to work; there's lots you can do. Do you want to justifiably believe that you are more rational than others, smugly knowing your beliefs are more accurate? Hold on; this is hard.
Humans nearly universally find excuses to believe that they are more correct than others, at least on the important things. They point to others' incredible beliefs, to biases afflicting others, and to estimation tasks where they are especially skilled. But they forget that almost everyone can point to such things.
But shouldn't you get more rationality credit if you spend more time studying common biases, statistical techniques, and the like? Well, this would be good evidence of your rationality if you were in fact pretty rational about your rationality, i.e., if you knew that when you read or discussed such issues your mind would then systematically, broadly, and reasonably incorporate those insights into your reasoning processes.
But what if your mind is far from rational? What if your mind is likely to just go through the motions of studying rationality to allow itself to smugly believe it is more accurate, or to bond you more closely to your social allies?
It seems to me that if you are serious about actually being rational, rather than just believing in your rationality or joining a group that thinks itself rational, you should try hard and often to test your rationality. But how can you do that?
To test the rationality of your beliefs, you could sometimes declare beliefs, and later score those beliefs via tests where high scoring beliefs tend to be more rational. Better tests are those where scores are more tightly and reliably correlated with rationality. So, what are good rationality tests?
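One concrete instance of such a test (my illustration, not one the post prescribes) is calibration scoring: write down probability estimates for claims that will later be settled, then compute a Brier score once the outcomes are known. Lower scores mean your stated probabilities tracked reality more closely. A minimal sketch in Python, with made-up forecasts:

```python
# Sketch: score declared beliefs with the Brier score.
# Each forecast pairs a stated probability that a claim is true
# with the actual outcome. Lower mean score = better calibration;
# always answering 0.5 ("no idea") would earn 0.25.

forecasts = [
    (0.9, True),   # "I'm 90% sure X will happen" -- X happened
    (0.7, False),  # confident prediction that turned out wrong
    (0.2, False),  # correctly judged unlikely
]

def brier_score(forecasts):
    """Mean squared error between stated probabilities and outcomes."""
    return sum((p - float(outcome)) ** 2 for p, outcome in forecasts) / len(forecasts)

print(f"Brier score: {brier_score(forecasts):.3f}")
```

The Brier score is just one possible scoring rule; in the post's terms, a better rationality test would be one whose scores correlate more tightly and reliably with rationality, but this at least rewards assigning high probabilities to truths.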
I think there is likely a distinction between being rational at games and rational at life. In my experience, those who are rational in one way are very often not rational in the other. I think it highly unlikely that there is a strong correlation between "good at prediction markets" or "good at poker" and "good at life". Do we think the best poker players are good models for rational existence? I don't think I do, and I don't even think THEY do.
A suggestion:
List your goals. Then give each goal a deadline, along with a probability of success and an estimated utility (on some kind of scale, not necessarily numerical). At each deadline, tally whether or not the goal was completed and give an estimate of the utility actually realized.
From this information you can extract at least three things: how well-calibrated your success predictions are, how accurate your utility estimates were, and how often you actually complete the goals you set.
The process could probably be improved if it were done transparently and cooperatively. Others looking on would help prevent you from cheating yourself.
Not terribly rigorous, but that's the idea.
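A minimal sketch of this bookkeeping in Python (the structure and names here are my own rendering of the suggestion, not something the commenter specified):

```python
# Sketch of the goal ledger described above: each goal gets a deadline,
# a predicted probability of success, and an estimated utility; at the
# deadline you record what actually happened and compare predictions
# against outcomes.

from dataclasses import dataclass
from datetime import date

@dataclass
class Goal:
    description: str
    deadline: date
    p_success: float                      # predicted probability of completing it
    est_utility: float                    # expected value, on any consistent scale
    completed: bool | None = None         # filled in at the deadline
    realized_utility: float | None = None # filled in at the deadline (0 if failed)

def review(goals):
    """At tally time: calibration of success predictions and utility-estimate error."""
    done = [g for g in goals if g.completed is not None]
    if not done:
        return
    brier = sum((g.p_success - float(g.completed)) ** 2 for g in done) / len(done)
    utility_gap = sum(g.realized_utility - g.est_utility for g in done) / len(done)
    print(f"Completed {sum(g.completed for g in done)}/{len(done)} goals")
    print(f"Brier score on success predictions: {brier:.3f}")
    print(f"Mean realized-minus-estimated utility: {utility_gap:+.2f}")

goals = [
    Goal("Finish draft", date(2025, 3, 1), 0.8, 5.0, completed=True, realized_utility=4.0),
    Goal("Run a 10k", date(2025, 6, 1), 0.5, 3.0, completed=False, realized_utility=0.0),
]
review(goals)
```

The transparency point above could be as simple as keeping this ledger somewhere others can see it, so the predictions are committed to before the deadlines arrive.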
I'm not sure if this works, since a lazy person could score very low on utility yet still be quite rational.