Here's the main thing that bothers me about this debate. There's a set of many different questions here:

- the degree of past and current warming;
- the degree to which such warming should be attributed to humans;
- the degree to which future emissions would cause more warming;
- the degree to which future emissions will happen given different assumptions;
- what good and bad effects future warming can be expected to have at different times and under what assumptions (specifically, what probability we should assign to catastrophic and even existential-risk damage);
- what policies will mitigate the problem, how much, and at what cost;
- how important the problem is relative to other problems;
- what ethical theory to use when deciding whether a policy is good or bad;
- and how much trust we should put in different aspects of the process that produced the standard answers to these questions, and in alternatives to the standard answers.

These are questions that empirical evidence, theory, and scientific authority bear on to different degrees, and a LessWronger ought to separate them out as a matter of habit; yet even here some vague combination of all these questions tends to get mashed together into one.
I really like this place. What a relief to have a cogent and rational comment about the global warming debate, and how encouraging to see it lavished with a pile of karma.
Unless we [...] have strong relevant information about the biases of experts, the rational thing to do is to defer to expert beliefs.
Well, yes, but the very fact that a question has strong ideological implications makes it highly probable that experts are biased about it. (I argued this point at greater length here.)
Contrast that with someone who denies the existence of anthropogenic global warming (AGW)
I don't have the knowledge of climatology to make a reasoned claim about AGW myself one way or another. Whether I believe or disbelieve in AGW would therefore currently have to be based entirely on trusting the positions of other people. Those positions are indeed Bayesian evidence, but "mistrusting the current climatological elite", even if someone places a wrong prior on how likely said climatological elite is to manufacture or misinterpret data, is not remotely similar to the logical hoops that your average theist has to jump through to explain and excuse the presence of evil in the world, the silence of the gods, the lack of material evidence, archaeological and geological discrepancies with their holy texts, etc., etc.
So your test isn't remotely as good. It effectively tests just one thing: one's prior on how likely climatologists are to lie or misinterpret data.
Endorsing notions like "global warming is a better test of irrationality than theism" is a better test of irrationality than theism. More generally, engagement in vague tribal politics is a better test of irrationality than any object level belief. Liquor is quicker but meta is betta! Meta meta meta meta... MEH TAH! Sing it with me now!
This may be connected to a more general problem: one is trying to extrapolate a continuum of how rational people are from a single bit. Whether that bit is theism or AGW, it's still not going to be very helpful. More bits of data are better.
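The "single bit" point can be made quantitative: whatever binary test you pick, it can convey at most one bit of information about a person's underlying rationality, and in practice much less. A minimal sketch, with illustrative numbers that are purely assumptions (not from the thread):

```python
import math

# Hypothetical model (numbers are illustrative assumptions):
# rationality R is one of 4 equally likely levels; a binary test X
# (e.g. "accepts AGW") is passed with a probability depending on R.
p_r = [0.25] * 4
p_pass_given_r = [0.3, 0.5, 0.7, 0.9]  # assumed pass rates per level

def entropy(ps):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

p_pass = sum(pr * pp for pr, pp in zip(p_r, p_pass_given_r))
h_x = entropy([p_pass, 1 - p_pass])  # entropy of the test outcome
h_x_given_r = sum(
    pr * entropy([pp, 1 - pp]) for pr, pp in zip(p_r, p_pass_given_r)
)
mutual_info = h_x - h_x_given_r  # bits the test reveals about R
# H(R) = 2 bits, but the test can transmit at most H(X) <= 1 bit,
# and under these assumed numbers only about 0.16 bits.
```

Under these made-up numbers the test reveals roughly 0.16 of the 2 bits needed to pin down the rationality level, which is the sense in which "more bits of data are better".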
Theism is a symptom of excess compartmentalisation, of not realising that absence of evidence is evidence of absence, of belief in belief, of privileging the hypothesis, and similar failings. But these are not intrinsically huge problems.
All of these are small problems when they come up only in a narrow context. How often does someone who privileges the hypothesis only do so in a single context?
As long as we're mindkilling let's use whether someone's a republican or a democrat to gauge their rationality!
http://lesswrong.com/lw/9n/the_uniquely_awful_example_of_theism/
Tests which were proposed in the comments include whether a person favours legalization of marijuana, and whether they believe in astrology. (Well, the one about marijuana also includes value judgements: two perfectly rational agents with identical priors and access to the same evidence would agree about the possible effects of marijuana legalization but disagree about whether they're good or bad because of different utility functions.)
I'm thinking about this, and right now I think belief in astrology is the best test:
It would amuse me if there was a sizable population that thought astrology was scientific and rejected it on that basis because they don't trust science.
'disruption and public safety' sounds like a kind of trouble an order of magnitude or two below trouble like 'the destroying of thousands of lives through courts & prisons'.
Well, I wonder how one's view on global warming correlates with correctly solving problems involving Bayesian reasoning, things like the Monty Hall puzzle, and a bunch of other problems that people get wrong through some fallacy. It may be more correlated than religiosity, in which case it would be a better test. Or it may be less correlated, in which case it would be a worse test. You know, we can test experimentally which is the better test.
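For what it's worth, the Monty Hall puzzle mentioned above is easy to settle by simulation. A minimal sketch (the function names are my own, not from the comment):

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """One round of Monty Hall; returns True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither the player's pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining closed door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

def win_rate(switch: bool, n: int = 100_000) -> float:
    random.seed(0)  # fixed seed so the estimate is reproducible
    return sum(monty_hall_trial(switch) for _ in range(n)) / n
```

Running `win_rate(True)` gives roughly 2/3 and `win_rate(False)` roughly 1/3, matching the standard Bayesian analysis that switching doubles your chances.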
When I opened this (already heavily downvoted) thread, I was actually expecting to read an argument that belief in global warming was a sign of irrationality.
IMHO, the best "test of irrationality" would be acceptance of alternative medicine.
It matters little whether you believe in global warming, but belief in homeopathy, faith healing, or anything else that makes you delay conventional treatment will make a difference in your life, and not for the better.
I looked into this issue and found no conclusive evidence of any global warming, let alone AGW or any catastrophic warming trends. Granted, this was several years ago. So where's the evidence? Links?
There's an entire climate blogosphere out there, full of people who know more and care more, and I see no reason for people to rehash the debate here.
Nordhaus's position seems stronger to me than you make it out to be. Here's the thing: even under Soviet repression, some academics risked their lives to speak out. You'd expect at least that much speaking out among academics in the relevant fields now, when all they have to risk is their academic careers. Yet in the relevant disciplines one doesn't see much of it at all.
The trouble is, the situation is fundamentally different here. If there existed some sort of crude open attempt to dictate official dogma, as in the Soviet Union, I have no doubt that a small but still non-zero minority would speak out against it, no matter what the consequences. However, in the modern academic system, there is no such thing -- rather, there is a complex system of subtle but strong perverse incentives that lead to systematic biases and a gradual drift of the academic mainstream away from reality. (Of course, the magnitude of these problems varies greatly across different fields.)
In this situation, a contrarian faces a different problem: making fundamental criticisms of the state of the field won't invite any open persecution or accusations of heresy, but it will lead to professional marginalization.
In medicine, John Ioannidis has basically built his career around exposing unpleasant truths that the perverse incentives have led the field away from. He has gotten several of his papers into various top journals, is currently a Professor of Medicine at Stanford, and has been cited over 30,000 times. Isn't that evidence that you can make fundamental criticisms of the state of the field without sacrificing your career?
My intuition, both in the case of Ioannidis and in other somewhat similar cases - such as the WEIRD paper, which seriously questioned the generalizability of pretty much all existing psychological research, and which has been cited almost 300 times since its publication in 2010 - is that when a field is drifting away from reality, most of the people working within the field are quite aware of the fact. When somebody finally makes a clear and persuasive argument that this is the case, everyone will start citing that argument.
(Is there a set of conditions that would convince/enable you to write posts explaining to LessWrong how to engage in meta-level Hansonian/Schellingian analyses similar to the one you did in your comment? Alternatively, do you know of any public fora whose level of general intelligence and "rationality" is greater than LessWrong's? I can't immediately think of any better strategies for raising the sanity waterline than you or Steve Rayhawk writing a series of posts about signaling games, focal points, implicit decision policies, social psychology, &c., and how we should use those concepts when interpreting the social world. But of course I have no idea if that would be a good use of your time or if it'd actually have any noticeable impact. Anyway it seems possible there'd be a way to raise funds to pay you to write at least a few posts, Kickstarter style, or I could try to convince Anna and Julia from the new/upcoming Center for Modern Rationality to write up some grants for you.)
I'm going to take Steven's advice below and not recap climate discussion here. However, if you want to do your own research and make a large-stakes bet about persuading some designated neutral judges on the extent of warming in the last 100 years, structured to express the disagreement, I would probably be keen to take it.
I work in England, in a university environment, so my acceptance of AGW is the default position and not a sign of rationality.
No confirmation bias here, I am sure.
Theism is often a default test of irrationality on Less Wrong, but I propose that global warming denial would make a much better candidate.
Theism is a symptom of excess compartmentalisation, of not realising that absence of evidence is evidence of absence, of belief in belief, of privileging the hypothesis, and similar failings. But these are not intrinsically huge problems. Indeed, someone with a mild case of theism can have the same anticipations as someone without, and update on evidence in the same way. If they have moved their belief beyond refutation, then in theory it fails to constrain their anticipations at all; and often this is the case in practice.
Contrast that with someone who denies the existence of anthropogenic global warming (AGW). This has all the signs of hypothesis privileging, but also reeks of fake justification, motivated skepticism, massive overconfidence (if they are truly ignorant of the facts of the debate), and simply the raising of politics above rationality. If I knew someone was a global warming skeptic, then I would expect them to be wrong in their beliefs and their anticipations, and to refuse to update when evidence worked against them. I would expect their judgement to be much more impaired than a theist's.
Of course, reverse stupidity isn't intelligence: simply because one accepts AGW, doesn't make one more rational. I work in England, in a university environment, so my acceptance of AGW is the default position and not a sign of rationality. But if someone is in a milieu that discouraged belief in AGW (one stereotype being heavily Republican areas of the US) and has risen above this, then kudos to them: their acceptance of AGW is indeed a sign of rationality.