TL;DR: Since it can be cheap and easy to attack everything your tribe doesn't believe, you shouldn't trust the rationality of just anyone who slams astrology and creationism; these beliefs aren't just false, they're also non-tribal among educated audiences. Test what happens when a "skeptic" argues for a non-tribal belief, or argues against a tribal one, before you decide they're a good general rationalist. This post is intended to be reasonably accessible to outside audiences.
I don't believe in UFOs. I don't believe in astrology. I don't believe in homeopathy. I don't believe in creationism. I don't believe there were explosives planted in the World Trade Center. I don't believe in haunted houses. I don't believe in perpetual motion machines. I believe that all these beliefs are not only wrong but visibly insane.
If you know nothing else about me but this, how much credit should you give me for general rationality?
Certainly anyone who was skillful at adding up evidence, considering alternative explanations, and assessing prior probabilities, would end up disbelieving in all of these.
But there would also be a simpler explanation for my views, a less rare factor that could explain them: I could just be anti-non-mainstream. I could be in the habit of hanging out in moderately educated circles, and know that astrology and homeopathy are not accepted beliefs of my tribe. Or I could just perceptually recognize them, on a wordless level, as "sounding weird". And I could mock anything that sounds weird and that my fellow tribesfolk don't believe, much as creationists who hang out with fellow creationists mock evolution for its ludicrous assertion that apes give birth to human beings.
You can get cheap credit for rationality by mocking wrong beliefs that everyone in your social circle already believes to be wrong. It wouldn't mean that I have any ability at all to notice a wrong belief that the people around me believe to be right, or vice versa - to further discriminate truth from falsity, beyond the fact that my social circle doesn't already believe in something.
Back in the good old days, there was a simple test for this syndrome that would get quite a lot of mileage: You could just ask me what I thought about God. If I treated the idea with deeper respect than I treated astrology, holding it worthy of serious debate even if I said I disbelieved in it, then you knew that I was taking my cues from my social surroundings - that if the people around me treated a belief as high-prestige, high-status, I wouldn't start mocking it no matter what the state of evidence.
On the other hand, suppose I said without hesitation that my epistemic state on God was similar to my epistemic state on psychic powers: no positive evidence, lots of failed tests, highly unfavorable prior, and if you believe it under those circumstances then something is wrong with your mind. Then you would have heard a bit of skepticism that might cost me something socially, and that not everyone around me would have endorsed, even in educated circles. You would know it wasn't just a cheap way of scoring points.
Today the God-test no longer works, because some people realized that the taking-it-seriously aura of religion is in fact the main thing left which prevents people from noticing the epistemic awfulness; there has been a concerted and, I think, well-advised effort to mock religion and strip it of its respectability. The upshot is that there are now quite wide social circles in which God is just another stupid belief that we all know we don't believe in, on the same list with astrology. You could be dealing with an adept rationalist, or you could just be dealing with someone who reads Reddit.
And of course I could easily go on to name some beliefs that others think are wrong and that I think are right, or vice versa, but would inevitably lose some of my audience at each step along the way - just as, a couple of decades ago, I would have lost a lot of my audience by saying that religion was unworthy of serious debate. (Thankfully, today this outright dismissal is at least considered a respectable, mainstream position even if not everyone holds it.)
I probably won't lose much by citing anti-Artificial-Intelligence views as an example of undiscriminating skepticism. I think a majority among educated circles are sympathetic to the argument that brains are not magic and so there is no obstacle in principle to building machines that think. But there are others, albeit in the minority, who recognize Artificial Intelligence as "weird-sounding" and "sci-fi", a belief in something that has never yet been demonstrated, hence unscientific - the same epistemic reference class as believing in aliens or homeopathy.
(This is technically a demand for unobtainable evidence. The asymmetry with homeopathy can be summed up as follows: First: If we learn that Artificial Intelligence is definitely impossible, we must have learned some new fact unknown to modern science - everything we currently know about neurons and the evolution of intelligence suggests that no magic was involved. On the other hand, if we learn that homeopathy is possible, we must have learned some new fact unknown to modern science; if everything else we believe about physics is true, homeopathy shouldn't work. Second: If homeopathy works, we can expect double-blind medical studies to demonstrate its efficacy right now; the absence of this evidence is very strong evidence of absence. If Artificial Intelligence is possible in theory and in practice, we can't necessarily expect its creation to be demonstrated using current knowledge - this absence of evidence is only weak evidence of absence.)
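To restate that asymmetry in rough Bayesian notation (my own back-of-the-envelope gloss, not the post's wording, with the magnitudes left qualitative): let E stand for "no successful demonstration observed so far".

```latex
% Illustrative restatement, not a precise model.
% For AI, no demonstration is expected yet even if AI is possible,
% so observing E barely moves the posterior:
\[ \frac{P(E \mid \text{AI possible})}{P(E \mid \text{AI impossible})} \approx 1 \]
% For homeopathy, positive double-blind trials are expected right now if it works,
% so observing E moves the posterior strongly toward "inert":
\[ \frac{P(E \mid \text{homeopathy works})}{P(E \mid \text{homeopathy inert})} \ll 1 \]
```

The nearer the likelihood ratio is to 1, the weaker the evidence of absence.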
I'm using Artificial Intelligence as an example, because it's a case where you can see some "skeptics" directing their skepticism at a belief that is very popular in educated circles, that is, the nonmysteriousness and ultimate reverse-engineerability of mind. You can even see two skeptical principles brought into conflict - does a good skeptic disbelieve in Artificial Intelligence because it's a load of sci-fi which has never been demonstrated? Or does a good skeptic disbelieve in human exceptionalism, since it would require some mysterious, unanalyzable essence-of-mind unknown to modern science?
It's on questions like these where we find the frontiers of knowledge, and everything now in the settled lands was once on the frontier. It might seem like a matter of little importance to debate weird non-mainstream beliefs; a matter for easy dismissals and open scorn. But if this policy is implemented in full generality, progress goes down the tubes. The mainstream is not completely right, and future science will not just consist of things that sound reasonable to everyone today - there will be at least some things in it that sound weird to us. (This is certainly the case if something along the lines of Artificial Intelligence is considered weird!) And yes, eventually such scientific truths will be established by experiment, but somewhere along the line - before they are definitely established and everyone already believes in them - the testers will need funding.
Being skeptical about some non-mainstream beliefs is not a fringe project of little importance, not always a slam-dunk, not a bit of occasional pointless drudgery - though I can certainly understand why it feels that way to argue with creationists. Skepticism is just the converse of acceptance, and so to be skeptical of a non-mainstream belief is to try to contribute to the project of advancing the borders of the known - to stake an additional epistemic claim that the borders should not expand in this direction, and should advance in some other direction instead.
This is high and difficult work - certainly much more difficult than the work of mocking everything that sounds weird and that the people in your social circle don't already seem to believe.
To put it more formally, before I believe that someone is performing useful cognitive work, I want to know that their skepticism discriminates truth from falsehood, making a contribution over and above the contribution of this-sounds-weird-and-is-not-a-tribal-belief. In Bayesian terms, I want to know that p(mockery|belief false & not a tribal belief) > p(mockery|belief true & not a tribal belief).
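To make the arithmetic concrete, here is a minimal sketch (mine, not the post's; all numbers are invented for illustration) of how the posterior should move when those two conditional probabilities differ:

```python
# Bayes' rule in odds form: how much a skeptic's mockery should update us
# depends on the ratio p(mockery | belief false & non-tribal) /
#                      p(mockery | belief true & non-tribal).

def posterior_odds_false(prior_odds_false, p_mock_given_false, p_mock_given_true):
    """Odds that the belief is false, after observing the skeptic mock it."""
    likelihood_ratio = p_mock_given_false / p_mock_given_true
    return prior_odds_false * likelihood_ratio

# An undiscriminating skeptic mocks every weird non-tribal belief, true or
# false alike; the likelihood ratio is 1 and the mockery tells us nothing:
print(posterior_odds_false(1.0, 0.95, 0.95))  # 1.0 -> no update

# A discriminating skeptic mocks false beliefs far more often than true ones:
print(posterior_odds_false(1.0, 0.90, 0.10))  # 9.0 -> strong update toward "false"
```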
If I recall correctly, the US Air Force's Project Blue Book, on UFOs, explained away what turned out to be an experimental aircraft as a sighting of the planet Venus. No, I don't believe in UFOs either; but if you're going to explain away experimental aircraft as Venus, then nothing else you say provides further Bayesian evidence against UFOs either. You are merely an undiscriminating skeptic. I don't believe in UFOs, but in order to credit Project Blue Book with additional help in establishing this, I would have to believe that if there were UFOs, then Project Blue Book would have turned in a different report.
And so if you're just as skeptical of a weird, non-tribal belief that turns out to have pretty good support, you just blew the whole deal - that is, if I pay any extra attention to your skepticism, it ought to be because I believe you wouldn't mock a weird non-tribal belief that was worthy of debate.
Personally, I think that Michael Shermer blew it by mocking molecular nanotechnology, and Penn and Teller blew it by mocking cryonics (justification: more or less exactly the same reasons I gave for Artificial Intelligence). Conversely, Richard Dawkins scooped up a huge truckload of actual-discriminating-skeptic points, at least in my book, for not making fun of the many-worlds interpretation when he was asked about it in an interview; indeed, Dawkins noted (correctly) that the traditional collapse postulate pretty much has to be incorrect. The many-worlds interpretation isn't just the formally simplest explanation that fits the facts; it also sounds weird and is not yet a tribal belief of the educated crowd. So whether someone makes fun of MWI is indeed a good test of whether they understand Occam's Razor or are just mocking everything that's not a tribal belief.
Of course you may not trust me about any of that. And so my purpose today is not to propose a new litmus test to replace atheism.
But I do propose that before you give anyone credit for being a smart, rational skeptic, you ask them to defend some non-mainstream belief. And no, atheism doesn't count as non-mainstream anymore, no matter what the polls show. It has to be something that most of their social circle doesn't believe, or something that most of their social circle does believe which they think is wrong. Dawkins endorsing many-worlds still counts for now, although its usefulness as an indicator is fading fast... but the point is not to endorse many-worlds; it is to see them take some sort of positive stance on where the frontiers of knowledge should change.
Don't get me wrong, there's a whole crazy world out there, and when Richard Dawkins starts whaling on astrology in "The Enemies of Reason" documentary, he is doing good and necessary work. But it's dangerous to let people pick up too much credit just for slamming astrology and homeopathy and UFOs and God. What if they become famous skeptics by picking off the cheap targets, and then use that prestige and credibility to go after nanotechnology? Who will dare to consider cryonics now that it's been featured on an episode of Penn and Teller's "Bullshit"? Under the current system you can gain high prestige in educated circles just by targeting beliefs like astrology that are widely believed to be uneducated; but then the same guns can be turned on new ideas like the many-worlds interpretation, even though it's being actively debated by physicists. And that's why I suggest, not any particular litmus test, but just that you ought to have to stick your neck out and say something a little less usual - say where you are not skeptical (and most of your tribemates are) or where you are skeptical (and most of the people in your tribe are not).
I am minded to pay attention to Robyn Dawes as a skillful rationalist, not because Dawes has slammed easy targets like astrology, but because he also took the lead in assembling and popularizing the evidence for the total lack of experimental support behind nearly all schools of psychotherapy, and for the persistence of multiple superstitions, such as Rorschach ink-blot interpretation, in the face of literally hundreds of experiments that tried and failed to find any evidence for them. It's not that psychotherapy seemed like a difficult target after Dawes got through with it, but that, at the time he attacked it, people in educated circles still thought of it as something that educated people believed in. It's not quite as useful today, but back when Richard Feynman published "Surely You're Joking, Mr. Feynman!" you could pick up evidence that he was actually thinking from the fact that he disrespected psychotherapists as well as psychics.
I'll conclude with some simple (though not individually conclusive) indicators that the skeptic is just filling in a cheap and largely automatic mockery template:
- The "skeptic" opens by remarking about the crazy true believers and wishful thinkers who believe in X, where there seem to be a surprising number of physicists making up the population of those wacky cult victims who believe in X. (The physicist-test is not an infallible indicator of rightness or even non-stupidity, but it's a filter that rapidly picks up on, say, strong AI, molecular nanotechnology, cryonics, the many-worlds interpretation, and so on.) Bonus point losses if the "skeptic" remarks on how easily physicists are seduced by sci-fi ideas. The reason why this is a particularly negative indicator is that when someone is in a mode of automatically arguing against everything that seems weird and isn't a belief of their tribe - of rejecting weird beliefs as a matter of naked perceptual recognition of weirdness - then they tend to perceptually fill-in-the-blank by assuming that anything weird is believed by wacky cult victims (i.e., people Not Of Our Tribe). And they don't backtrack, or wonder otherwise, even if they find out that the "cult" seems to exhibit a surprising number of people who go around talking about rationality and/or members with PhDs in physics. Roughly, they have an automatic template for mocking weird beliefs, and if this requires them to just swap in physicists for astrologers as gullible morons, that's what they'll do. Of course physicists can be gullible morons too, but you should be establishing that as a surprising conclusion, not using it as an opening premise!
- The "skeptic" offers up items of "evidence" against X which are not much less expected in the case that X is true than in the case that X is false; in other words, they fail to grasp the elementary Bayesian notion of evidence. I don't believe that UFOs are alien visitors, but my skepticism has nothing to do with all the crazy people who believe in UFOs - the existence of wacky cults is not much less expected in the case that aliens do exist, than in the case that they do not. (I am skeptical of UFOs, not because I fear affiliating myself with the low-prestige people who believe in UFOs, but because I don't believe aliens would (a) travel across interstellar distances AND (b) hide all signs of their presence AND THEN (c) fly gigantic non-nanotechnological aircraft over our military bases with their exterior lights on.)
- The demand for unobtainable evidence is a special case of the above, and of course a very common mode of skepticism gone wrong. Artificial Intelligence and molecular nanotechnology both involve beliefs in the future feasibility of technologies that we can't build right now, but (arguendo) seem to be strongly permitted by current scientific belief, i.e., the non-ineffability of the brain, or the basic physical calculations which seem to show that simple nanotechnological machines should work. To discard all the arguments from cognitive science and rely on the knockdown argument "no reliable reporter has ever seen an AI!" is blindly filling in the template from haunted houses.
- The "skeptic" tries to scare you away from the belief in their very first opening remarks: for example, pointing out how UFO cults beat and starve their victims (when this can just as easily happen if aliens are visiting the Earth). The negative consequences of a false belief may be real, legitimate truths to be communicated; but only after you establish by other means that the belief is factually false - otherwise it's the logical fallacy of appeal to consequences.
- They mock first and counterargue later or not at all. I do believe there's a place for mockery in the war on dumb ideas, but first you write the crushing factual counterargument, then you conclude with the mockery.
I'll conclude the conclusion by observing that poor skepticism can exist just as easily when a belief is wrong as when it is right, so pointing out these flaws in someone's skepticism can hardly serve to establish a positive belief about where the frontiers of knowledge should move.
I wouldn't answer the astrology/UFO question. Extraterrestrials visiting us in human-vehicle-sized flying ships at human-visible distances is so horribly anthropomorphic as to make it immeasurably improbable. Both propositions are far less likely than my winning the lottery, and that's the best I can get from my wetware. Anything further is like asking, "Which are you more certain is a European country, France or Spain?"
Also, I'm inclined to avoid questions of this form on principle. It's like Yudkowsky's "blue tentacle" in the Technical Explanation: being able to find "outs" for a theory that doesn't fit the evidence is anti-knowledge, and the more practice you get at it, the crazier you become.
I'm not quite sure what you mean by "anthropomorphic" here. One way to frame the comparison is to note that if intelligent extraterrestrials have visited us, we have to update strongly in favor of their intelligence playing an important role in our intelligence. In any universe that isn't completely teeming with intelligent life, this will hold for anthropic reasons; two intelligences are immeasurably more likely to encounter each other if one had a causal role in the other's coming to existence (via panspermia and/or guided evolution).