What quick-and-easy rules of thumb do you tend to use to gauge how rational someone else is? How accurate do you think those rules are, and can you think of any ways they might be improved?


For some examples of what I mean, one of the benchmarks I use is the basic skeptics' list: astrology, chiropractic, little green men abducting cattle and performing anal probes, Nessie. Another is the denialist checklist: holocaust denial, moon landing denial, global warming denial. Another is supernaturalism in general: creationism, intercessory prayer, magick, psychics, curses, ghosts, and such. If I find out that anyone I know believes in any of that, then my estimation of how well they can consider things rationally goes down. Theism... well, I've gotten used to pretty much everyone around me being theistic, so that's kind of the baseline I assume; when I learn someone is an atheist, my estimation of their rationality tends to go /up/.

Do you have any items that make you think someone is even further along the path of rationality than simply not holding such obviously mistaken beliefs?


Here's a big one for me: whether or not someone shows a rudimentary understanding of how their own brain can mislead them.

It's easy enough to see it working in teenagers. They're the ones who realize that their emotions are going to be completely out of whack and that their judgment may not in fact be 10x better than every adult's around them, and who compensate for it where they can.

It's not the same as knowing when to shout "Anchoring!" or "Sunk Cost Fallacy!". That's just knowing the password. It's a matter of being aware that your brain can think and feel things without consulting you, and that not all of the things it thinks for you are good or right.

A fascinating thing about teenagers: growing up, they are exposed to many success stories, and the successful people in them often brag about the mistakes they made as teenagers. I wonder how many teens update on this sort of evidence? I know I certainly did.

A necessary condition is having an intuitive understanding of how to properly reply to hypotheticals or thought experiments in a discussion. If someone keeps focusing on convenient possible worlds, they probably have a long way to go.

Having such understanding is not a sufficient condition for rationality, because many philosophy majors internalize these habits of thought, and I don't think that being a philosophy major is strongly indicative of rationality.

I was hoping this would be about judging one's own rationality. Judging others seems to be a lot easier.

Stuff that's just identification with an established subculture like New Agers or conspiracy cranks mostly shows that the person likes to identify with that subculture. It's a lot trickier to find stuff that you'd hope people would get right but which isn't strongly tied to a specific subculture's attire.

Well, beyond the list given in the OP (whose items, while in fact necessary conditions for rationality, seem to me to not even constitute a "lowest standard"; they're the surface-level attributes adopted almost automatically upon entering the skeptic community), I tend to use their reaction when I say "everyone should be immortal". Strangely enough it does seem like you need an abnormal clarity of thought to reliably come to the right conclusion about death.

Beware inferential distance. "Everyone should be immortal" includes a lot of unstated assumptions that the person you say it to may not be aware of. They could easily think you mean "Everyone should basically be as they are now, except live forever", which would mean either Malthusian misery or draconian restrictions on reproduction. Unless you have already discussed transhumanism with them, this is a terrible benchmark.

That's not the way I usually phrase it - I don't know how that would fit into a conversation anyway. I was just summarising the subject matter. Sorry for the confusion.

"Everyone should be immortal" is a claim about values, not facts. There's no such thing as "the right conclusion about death".

I tend to use their reaction when I say "everyone should be immortal". Strangely enough it does seem like you need an abnormal clarity of thought to reliably come to the right conclusion about death.

I see this general idea espoused by rationalists rather often. But despite my months on here, I have yet to come around to agreement on it.

Aumann's Agreement Theorem (sketched below the list) leaves us with three options:

  1. The vast majority of LW-ers are irrational (I rather doubt it)
  2. I am not as rational as I would like to be (I'm sure of it)
  3. We do not have common priors (I do think that most anti-deathists are very privileged in terms of intelligence, wealth, stability, etc.)
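
(For context, a rough statement of the theorem as I understand it: if two agents share a common prior and their posterior probabilities for an event E are common knowledge between them, then those posteriors must be equal.

\[
q_1 = P(E \mid \mathcal{I}_1)\ \text{and}\ q_2 = P(E \mid \mathcal{I}_2)\ \text{common knowledge, with a common prior}\ \Longrightarrow\ q_1 = q_2,
\]

where \(\mathcal{I}_1\) and \(\mathcal{I}_2\) stand for each agent's private information. A persistent disagreement therefore has to trace back to a failure of rationality on at least one side or to a failure of the common-prior assumption - hence the options above.)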

not even constitute a "lowest standard"

There are a great many people who don't meet the complete standard - in fact, the great majority of people don't - and it seems worthwhile to be able to differentiate between a reasonably rationalist deist and a Californian cloud-cuckoo-lander.

Of course, any way to differentiate amongst people who do meet the 'lowest standard' is valuable, as well.

I believe APMason's point is that your benchmarks are testing for anti-non-mainstreamism.

What they do when they're wrong about something immediately available to them (misreading a map, say, rather than being wrong about global warming).

1) Why was this post downvoted?

2) I've realized that I take the far easier path: I simply downgrade my model of someone's rationality based on what I judge to be irrationality. The benchmark of a rational person is thus simply an apparent absence of such things.

For example, I notice belief in belief a lot, leading to confused minds: I'm thinking of one person who rationalizes incessantly. Or I may notice insufficiently clear thinking, often related to the use of passwords, along with implicit disagreement with the creed that that which can be destroyed by the truth should be. I'm thinking of one of my professors, who is clearly opposed to reductionist cognitive psychology for what seems to me primarily wishful thinking rather than good reason, and who approvingly notes that some cognitive processes can be 'emergent', and so on.