johnbr

johnbr · 30

Another test:

  1. Find out the general ideological biases of the test subject

  2. Find two studies: one (Study A) that supports the ideological biases of the test subject but is methodologically flawed, and another (Study B) that refutes those biases but is methodologically sound.

  3. Have the subject read/research information about the studies, and then ask them which study is more correct.

If you randomize this a bit (sometimes the methodologically sound study is also in line with the subject's bias) and run it multiple times on a person, you should get a pretty good read on how rational they are; a rough sketch of the scoring is below.
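Here is a minimal Python sketch of how that scoring might work. The subject model, the `bias_strength` parameter, and the 50/50 randomization are invented for illustration; they are not part of the proposal itself:

```python
import random

def simulate_subject(bias_strength: float, n_trials: int = 20, seed: int = 0):
    """Return the rate of picking the sound study on bias-aligned
    vs. bias-opposed trials."""
    rng = random.Random(seed)
    aligned = [0, 0]  # [picked sound, total] when the sound study agrees with the bias
    opposed = [0, 0]  # same counts when the sound study contradicts the bias
    for _ in range(n_trials):
        # Randomize whether the methodologically sound study also happens
        # to agree with the subject's ideological bias.
        sound_agrees_with_bias = rng.random() < 0.5
        # Toy model of the subject: with probability `bias_strength` they
        # pick whichever study flatters their bias; otherwise they judge
        # on methodology and pick the sound one.
        follows_bias = rng.random() < bias_strength
        picked_sound = sound_agrees_with_bias if follows_bias else True
        bucket = aligned if sound_agrees_with_bias else opposed
        bucket[0] += picked_sound
        bucket[1] += 1
    return (aligned[0] / max(aligned[1], 1), opposed[0] / max(opposed[1], 1))

# A rational subject scores near 1.0 in both buckets; a biased one
# only when the sound study happens to agree with them.
print(simulate_subject(bias_strength=0.7))
```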

Some people might decide, "Because I want to show off how rational I am, I'll accept that Study X is more methodologically sound, but I'll still believe in my secret heart that Study Y is correct."

I'm not sure any amount of testing can handle that much self-deception, although I'm willing to be convinced otherwise :)

johnbr · 30

Keep track of when you change your mind about important facts based on new evidence.

a) If you rarely change your mind, you're probably not rational.

b) If you always change your mind, you're probably not very smart.

c) If you sometimes change your mind, and sometimes not, I think that's a pretty good indication that you're rational.

Of course, I feel that I fall into category (c), which is my own bias. I could test this if there were a database of how often other people had changed their minds, cross-referenced with IQ; a sketch of the bookkeeping follows.
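A hypothetical Python sketch of that bookkeeping; the 10% and 90% thresholds are invented here just to make the three buckets concrete:

```python
def classify(change_log: list[bool]) -> str:
    """change_log[i] is True if new evidence changed your mind on belief i."""
    if not change_log:
        return "no data"
    rate = sum(change_log) / len(change_log)
    if rate < 0.10:   # hypothetical threshold for "rarely"
        return "(a) rarely changes mind: probably not rational"
    if rate > 0.90:   # hypothetical threshold for "always"
        return "(b) always changes mind: probably not very smart"
    return "(c) sometimes changes mind: a decent sign of rationality"

print(classify([True, False, False, True, False, True]))  # -> (c)
```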

Here are some examples from my own past:

  1. I used to completely discount AGW. Now I think it is occurring, but I also think that the negative feedbacks are being ignored/downplayed.

  2. I used to think that the logical economic policy was always the right one. Now, I (begrudgingly) accept that if enough people believe an economic policy is good, it will work, even though it's not logical. And, concomitantly, a logical economic policy will fail if enough people hate it.

  3. Logic is our fishtank, and we are the fish swimming in it. It is all we know. But there is a possibility that there's something outside the fishtank, that we are unable to see because of our ideological blinders.

  4. The two great stresses in ancient tribes were A) "having enough to eat" and B) "being large enough to defend the tribe from others". Those are more or less contradictory goals, but both are incredibly important. People who want to punish rulebreakers and free-riders are generally more inclined to weigh A) over B). People who want to grow the tribe by being more inclusive and accepting of others are more inclined to weigh B) over A).

  5. None of the modern economic theories seem to be any good at handling crises. I used to think that Chicago and Austrian schools had better answers than Keynesians.

  6. I used to think that banks should have just been allowed to die; now I'm not so sure. I see a fair amount of evidence that the logical process there would have caused a significant panic. Not sure either way.

johnbr · 30

The most important thing for me is the near-far bias. Even though that's a relatively recent "discovery" here, it still resonates very well with why I argue with people about things, and why people I respect argue with each other.

johnbr · 70

Most frequently useful: the realization that my interest in being unbiased can become a sort of bias of its own. When I hear arguments from others, I can easily spot the biases, and I've worked hard to recognize that I have built-in biases as well that I can't discount.