So you think you want to be rational, to believe what is true even when sirens tempt you? Great, get to work; there's lots you can do. Do you want to justifiably believe that you are more rational than others, smugly knowing your beliefs are more accurate? Hold on; this is hard.
Humans nearly universally find excuses to believe that they are more correct than others, at least on the important things. They point to others' incredible beliefs, to biases afflicting others, and to estimation tasks where they are especially skilled. But they forget that almost everyone can point to such things.
But shouldn't you get more rationality credit if you spend more time studying common biases, statistical techniques, and the like? Well, this would be good evidence of your rationality if you were in fact pretty rational about your rationality, i.e., if you knew that when you read or discussed such issues your mind would then systematically, broadly, and reasonably incorporate those insights into your reasoning processes.
But what if your mind is far from rational? What if your mind is likely to just go through the motions of studying rationality to allow itself to smugly believe it is more accurate, or to bond you more closely to your social allies?
It seems to me that if you are serious about actually being rational, rather than just believing in your rationality or joining a group that thinks itself rational, you should try hard and often to test your rationality. But how can you do that?
To test the rationality of your beliefs, you could sometimes declare beliefs, and later score those beliefs via tests where high-scoring beliefs tend to be more rational. Better tests are those where scores are more tightly and reliably correlated with rationality. So, what are good rationality tests?
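To make the idea concrete, here is a minimal sketch of one such scoring test, assuming you state beliefs as probabilities and later check them against outcomes. The Brier score used here is my own illustrative choice of scoring rule, not something the post specifies; with it, lower scores indicate more accurate declared beliefs.

```python
# Minimal sketch: declare probabilistic beliefs now, score them later.
# The Brier score is one illustrative proper scoring rule (lower is better);
# the post does not commit to any particular rule.

def brier_score(predictions):
    """Mean squared error between stated probabilities and actual outcomes.

    predictions: list of (probability, outcome) pairs, where outcome is
    1 if the event happened and 0 if it did not. Always answering 50%
    earns 0.25; well-calibrated, informative beliefs score lower.
    """
    return sum((p - o) ** 2 for p, o in predictions) / len(predictions)

# Hypothetical declared beliefs and how they later turned out.
my_beliefs = [
    (0.9, 1),  # "90% sure it rains Tuesday" -- it did
    (0.7, 0),  # "70% sure the bill passes" -- it did not
    (0.6, 1),  # "60% sure the paper is accepted" -- it was
]
print(brier_score(my_beliefs))  # 0.22, slightly better than the 0.25 of always guessing 50%
```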
"Do you want to justifiably believe that you are more rational than others, smugly knowing your beliefs are more accurate?"
Is this what people want? To me it would make more sense to cultivate the belief that one is NOT more rational than others, and that one's beliefs are no more likely than theirs to be accurate, a priori. Try to overcome the instinct that a belief is probably correct merely because it is yours.
Now I can understand that for people at the cutting edge of society, pushing into new frontiers like Robin and Eliezer, this would not work. If someone came up to Robin and criticized idea futures, or to Eliezer and said that friendly AI would not work, and they responded, "oh, I guess maybe you're right, thanks" - well, then, they wouldn't get anything done.
But for most of us, this is not an issue. Factual disagreements in my experience are seldom about things that would keep us from being productive and successful in our lives. People tend to disagree most vociferously on things that don't have the slightest impact on their lives, like political and sports questions. Isn't that right?
Even for researchers, in a way it doesn't matter because we are paying them to push the boundaries. It is their job to adopt opinions and fight for them. They are obligated to assume that just because an idea is theirs, it is probably right. Researchers are paid to be irrational in this way, and indeed it is hard to see how a rational person could be successful in science.