All of johnbr's Comments + Replies

johnbr

Another test.

  1. Find out the general ideological biases of the test subject

  2. Find two studies: one (Study A) that supports the test subject's ideological biases but is methodologically flawed, and another (Study B) that refutes those biases but is methodologically sound.

  3. Have the subject read/research information about the studies, and then ask them which study is more correct.

If you randomize this a bit (sometimes the study is both correct and in line with one's bias) and run this multiple times on a person, you...
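A minimal sketch of how that randomized pairing and scoring could look, assuming a Python harness where the hypothetical `pick_study` callback stands in for the human subject's judgment:

```python
import random

def run_trials(pick_study, n_trials=40, seed=0):
    """Score a subject over randomized trials.

    pick_study stands in for the human subject: it receives a
    (quality, alignment) tuple for Study A and for Study B, and
    returns 'A' or 'B' for the study it judges more correct.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        # One study is sound, the other flawed; which slot (A or B)
        # gets the sound one is randomized each trial.
        qual_a, qual_b = rng.sample(["sound", "flawed"], 2)
        # Alignment with the subject's ideology is randomized
        # independently, so the sound study sometimes flatters
        # the subject's bias too.
        align_a = rng.choice(["aligned", "opposed"])
        align_b = rng.choice(["aligned", "opposed"])
        pick = pick_study((qual_a, align_a), (qual_b, align_b))
        hits += pick == ("A" if qual_a == "sound" else "B")
    return hits / n_trials

# A purely bias-driven subject ignores methodology entirely,
# while a rational one tracks it:
biased = lambda a, b: "A" if a[1] == "aligned" else "B"
rational = lambda a, b: "A" if a[0] == "sound" else "B"
print(run_trials(biased), run_trials(rational))  # roughly 0.5 vs. 1.0
```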

j03
  1. How do you know your determination of "ideological bias" isn't biased itself?

  2. All experiments are flawed in one way or another, to some degree. Are you saying one study is more methodologically flawed than the other? How do you measure the degree of the flaws? How do you know your determination of flaws isn't biased?

  3. Again, you've already decided which study is "correct" based on your own biased interpretations. How do you prove it's the other person who is wrong, and not you who is biased?

I agree with the randomize-and-repeat bit, though. Still, I would propose that this test methodology for rationality is deeply flawed.
johnbr

Keep track of when you change your mind about important facts based on new evidence.

a) If you rarely change your mind, you're probably not rational.

b) If you always change your mind, you're probably not very smart.

c) If you sometimes change your mind, and sometimes not, I think that's a pretty good indication that you're rational.

Of course, I feel that I fall into category (c), which is my own bias. I could test this if there were a database of how often other people had changed their minds, cross-referenced with IQ.
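If such a database existed, the check itself would be simple. A rough sketch, with invented cut-offs and entirely made-up numbers, just to show the shape of the test:

```python
from statistics import mean

# Entirely made-up (mind-changes-per-year, IQ) records; a real test
# would need the cross-referenced database described above.
records = [(0.2, 98), (0.5, 105), (1.5, 125), (2.0, 118),
           (2.5, 121), (3.0, 130), (6.0, 112), (8.0, 102)]

def category(rate):
    # Crude cut-offs for "rarely" / "sometimes" / "always"; pure guesses.
    if rate < 1.0:
        return "(a) rarely"
    if rate > 5.0:
        return "(b) always"
    return "(c) sometimes"

for label in ("(a) rarely", "(c) sometimes", "(b) always"):
    iqs = [iq for rate, iq in records if category(rate) == label]
    print(label, "mean IQ:", round(mean(iqs), 1))
```

Per hypothesis (b), the "always" bucket should show the lowest mean IQ; testing (a) and (c) would need a rationality measure rather than IQ alone.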

Here are some examples from my own past...

adamisom
I'm not sure about this. The words are vague enough that I think we'll usually see ourselves as only "sometimes" changing our minds. That becomes the new happy medium we all think we've achieved, simply because we're too ignorant of what it actually means to change our beliefs the right amount. I'm having a hard time seeing how I could decide whether I'm changing my beliefs the right amount; since that would be a (very rough) estimate of an indirect indicator, I have to disagree with the potential of this idea.
johnbr

The most important thing for me is the near-far bias. Even though that's a relatively recent "discovery" here, it still resonates very well with why I argue with people about things, and why people I respect argue with each other.

johnbr

Most frequently useful: the realization that my interest in being unbiased can become a sort of bias of its own. When I hear arguments from others, I can easily spot the biases, and I've worked hard to recognize that I have built-in biases as well, which I can't discount.