There is quite a gap between wanting to be rational and wanting to know how unbiased you are. Since the test is self-administered, pursuing the first desire could easily lead to a favourable, biased, yet seemingly rational test result. Such a result would be shaped by personal expectations, and its reliability is null, in the spirit of Löb's Theorem: a system cannot vouch for its own soundness from the inside. The latter desire implies being open to the possibility of one's own bias and states a clear purpose: assessing some sort of bias/rationality balance. This endeavour is more profitable than the former because, hopefully, it yields actionable information.
Perhaps one could have a good shot at finding out more about one's biases by making quick judgements and later revisiting the various aspects and steps of each judgement, accounting for seemingly absurd alternatives and paying attention to the smallest details. The result should be expressed as a percentage of correct versus faulty conclusions. Apart from revealing some rational/biased ratio in a line of thought, this process should naturally bring one closer to rationality by committing judgement flaws, their sources and patterns, to memory, and by developing a habit of sound thinking from a rationality point of view.
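To make the "percentage of correct/faulty conclusions" concrete, here is a minimal sketch of how one might keep score: log each quick judgement, later record whether it survived a slow, detailed review, and compute the hit rate. The names and structure are purely illustrative assumptions, not part of any existing tool or the original proposal.

```python
# Illustrative sketch of self-scoring quick judgements:
# log each snap conclusion, mark whether it held up after
# careful review, and report the fraction that survived.

from dataclasses import dataclass

@dataclass
class Judgement:
    topic: str            # what the quick conclusion was about
    quick_conclusion: str # the snap verdict
    held_up: bool         # did it survive the later, detailed review?

def rationality_ratio(log: list[Judgement]) -> float:
    """Fraction of quick conclusions that survived later review."""
    if not log:
        return 0.0
    return sum(j.held_up for j in log) / len(log)

log = [
    Judgement("new gadget", "dismiss it as useless", held_up=False),
    Judgement("colleague's plan", "object to it", held_up=True),
    Judgement("news headline", "believe it", held_up=False),
]

print(f"correct/total: {rationality_ratio(log):.0%}")  # -> 33%
```

Over time, such a log would also expose the sources and patterns of recurring flaws, which is the habit-building part of the exercise.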
This test could yield a much more reliable result when administered to someone else: provide all the information needed to reach the correct conclusion, together with vague, inconclusive information pointing toward incorrect conclusions, and strong incentives for reaching some of those wrong conclusions.
Speaking of incentives, I believe anyone trying to be as rational as possible within a group will be influenced by that group's values and beliefs. Therefore, one's willingness to find biases in the group's or its members' judgements is likely correlated with one's affinity for that group. Rationality should be neutral, but neutrality is seldom a group value, so chances are high that instinctive rationalists will be outliers. The tendency to agree with a group's beliefs is probably as misleading as the tendency to find biases in them, both depending on one's degree of sympathy for that particular group.
Identifying biases in others will be an unreliable measure of one's own rationality, both because of the incentives at play when interacting with others and because there is usually little information about the thought processes that led them to specific outcomes. Also, beliefs widely spread across a social system can have consequences that seemingly confirm those beliefs even when they are not rational; in that case, comparing one's judgement to the facts would be an indicator of power rather than rationality.
I'm inclined to believe that rationality is more an instrument than a goal, as you seem to describe it. Being attached to material trinkets (or not) will be a rational choice for someone who has developed his rationality and thought the choice through, whereas dismissing the utility of mundane gadgetry out of hand, or wholeheartedly embracing it, most likely as the result of an induced bias, exposes one to unconsidered, not-yet-evaluated risks, hence the label "irrational".
There is a seed of truth in what you're saying: the balance between the effort of developing the rational art and the likely impact of that development on one's goals deserves proper attention.
To go with the example provided (the body-builder [the rationalist jedi]): going straight towards the final goal (obtaining an Adonis physique [becoming a rational jedi]) will help him develop more muscle mass [more powerful rational skills], which means more fat-burning cells in his body [more chances to make the right decisions when day-to-day challenges arise] to deal with the extra 200 pounds [whatever skewed perception or behavioural pattern one has]. In my opinion, this is closer to optimal than a simple diet [a blunt choice of "what is right" based on commonly accepted opinion].