Theism often serves as the default test of irrationality on Less Wrong, but I propose that global warming denial would make a much better candidate.
Theism is a symptom of excess compartmentalisation, of not realising that absence of evidence is evidence of absence, of belief in belief, of privileging the hypothesis, and similar failings. But these are not intrinsically huge problems. Indeed, someone with a mild case of theism can have the same anticipations as someone without, and update on evidence in the same way. If they have moved their belief beyond refutation, then in theory it fails to constrain their anticipations at all; and this is often the case in practice.
Contrast that with someone who denies the existence of anthropogenic global warming (AGW). This has all the signs of privileging the hypothesis, but also reeks of fake justification, motivated skepticism, massive overconfidence (if they are truly ignorant of the facts of the debate), and simply the raising of politics above rationality. If I knew someone was a global warming skeptic, I would expect them to be wrong in their beliefs and their anticipations, and to refuse to update when evidence went against them. I would expect their judgement to be much more impaired than a theist's.
Of course, reverse stupidity isn't intelligence: merely accepting AGW doesn't make one more rational. I work in England, in a university environment, so my acceptance of AGW is the default position and not a sign of rationality. But if someone is in a milieu that discourages belief in AGW (one stereotype being heavily Republican areas of the US) and has risen above this, then kudos to them: their acceptance of AGW is indeed a sign of rationality.
(Is there a set of conditions that would convince/enable you to write posts explaining to LessWrong how to engage in meta-level Hansonian/Schellingian analyses similar to the one you did in your comment? Alternatively, do you know of any public fora whose level of general intelligence and "rationality" is greater than LessWrong's? I can't immediately think of any better strategies for raising the sanity waterline than you or Steve Rayhawk writing a series of posts about signaling games, focal points, implicit decision policies, social psychology, &c., and how we should use those concepts when interpreting the social world. But of course I have no idea whether that would be a good use of your time or whether it'd actually have any noticeable impact. Anyway, it seems possible there'd be a way to raise funds to pay you to write at least a few posts, Kickstarter-style, or I could try to convince Anna and Julia from the new/upcoming Center for Modern Rationality to write up some grants for you.)
Thanks for the kind words, but I wouldn't be able to allocate enough time for such a project at the present moment. In fact, I've had plans to write something along these lines for quite a while, but original articles take much more time than comments. (And I've barely had any time even for comments in recent months.)
Also, realistically, I'm not sure how successful the product would be. I don't have much talent for writing in an engaging way, a problem exacerbated by the fact that English isn't my native language. So I think that even with the best possible outcome, not many people would end up reading it.