I usually define "rationality" as accuracy-seeking whenever decisional considerations do not enter. These days I sometimes also use the phrase "epistemic rationality".
It would indeed be more complicated if we began conducting the meta-argument that (a) an ideal Bayesian not faced with various vengeful gods inspecting its algorithm should not decide to rewrite its memories to something calibrated away from what it originally believed to be accurate, or that (b) human beings ought to seek accuracy in a life well-lived according to goals that include both explicit truth-seeking and other goals not about truth.
But unless I'm specifically focused on this argument, I usually go so far as to talk as if it resolves in favor of epistemic accuracy, that is, that pragmatic rationality is unified with epistemic rationality rather than implying two different disciplines. If truth is a bad idea, it's not clear what the reader is doing on Less Wrong; and indeed, the "pragmatic" reader who somehow knows that it's a good idea to be ignorant will at once flee as far as possible...
You started off using the word "rationality" on this blog/forum, and though I had misgivings, I tried to continue with your language. But most of the discussion of this post seems to be distracted by my having tried to clarify that in the introductory sentence. I predict we won't be able to get past this, and so from now on I will revert to my usual policy of avoiding overloaded words like "rationality."
The word "rational" is overloaded with associations, so let me be clear: to me [here], more "rational" means better believing what is true, given one's limited info and analysis resources.
Rationality certainly can have instrumental advantages. There are plenty of situations where being more rational helps one achieve a wide range of goals. In those situations, "winners", i.e., those who better achieve their goals, should tend to be more rational. In such cases, we might even estimate someone's rationality by looking at his or her "residual" belief-mediated success, i.e., after explaining that success via other observable factors.
But note: we humans were designed in many ways not to be rational, because believing the truth often got in the way of achieving goals evolution had for us. So it is important for everyone who intends to seek truth to clearly understand: rationality has costs, not only in time and effort to achieve it, but also in conflicts with other common goals.
Yes, rationality might help you win that game or argument, get promoted, or win her heart. Or more rationality for you might hinder those outcomes. If what you really want is love, respect, beauty, inspiration, meaning, satisfaction, or success, as commonly understood, we just cannot assure you that rationality is your best approach toward those ends. In fact we often know it is not.
The truth may well be messy, ugly, or dispiriting; knowing it may make you less popular, loved, or successful. These are actually pretty likely outcomes in many identifiable situations. You may think you want to know the truth no matter what, but how sure can you really be of that? Maybe you just like the heroic image of someone who wants the truth no matter what; or maybe you only really want to know the truth if it is the bright shining glory you hope for.
Be warned: the truth just is what it is. If just knowing the truth is not reward enough, perhaps you'd be better off not knowing. Before you join us in this quixotic quest, ask yourself: do you really want to be generally rational, on all topics? Or might you be better off limiting your rationality to the usual practical topics where rationality is respected and welcomed?