Less Wrong is a community blog devoted to refining the art of human rationality.
Baboons... literally have been the textbook example of a highly aggressive, male-dominated, hierarchical society. Because these animals hunt, because they live in these aggressive troops on the savanna... they have a constant baseline level of aggression which inevitably spills over into their social lives.
Scientists have never observed a baboon troop that wasn't highly aggressive, and they have compelling reasons to think this is simply baboon nature, written into their genes. Inescapable.
Or at least, that was true until the 1980s, when Kenya experienced a tourism boom.
Sapolsky was a grad student, studying his first baboon troop. A new tourist lodge was built at the edge of the forest where his baboons lived. The owners of the lodge dug a hole behind the lodge and dumped their trash there every morning, after which the males of several baboon troops — including Sapolsky's — would fight over this pungent bounty.
Before too long, someone noticed the baboons didn't look too good. It turned out they had eaten some infected meat and developed tuberculosis, which kills baboons in weeks. Their hands rotted away, so they hobbled around on their elbows. Half the males in Sapolsky's troop died.
This had a surprising effect. There was now almost no violence in the troop. Males often reciprocated when females groomed them, and males even groomed other males. To a baboonologist, this was like watching Mike Tyson suddenly stop swinging in a heavyweight fight to start nuzzling Evander Holyfield. It never happened.
Razib summarized my entire talk on cognitive biases at the Singularity Summit 2009 as: "Most people are stupid."
Hey! That's a bit unfair. I never said during my talk that most people are stupid. In fact, I was very careful not to say, at any point, that people are stupid, because that's explicitly not what I believe.
And in the closing sentence of my talk on cognitive biases and existential risk, I did not say that humanity was devoting more resources to football than existential risk prevention because we were stupid.
There's an old joke that runs as follows:
A motorist is driving past a mental hospital when he gets a flat tire.
He goes out to change the tire, and sees that one of the patients is watching him through the fence.
Nervous, trying to work quickly, he jacks up the car, takes off the wheel, puts the lugnuts into the hubcap -
And steps on the hubcap, sending the lugnuts clattering into a storm drain.
The mental patient is still watching him through the fence.
The motorist desperately looks into the storm drain, but the lugnuts are gone.
The patient is still watching.
The motorist paces back and forth, trying to think of what to do -
And the patient says,
"Take one lugnut off each of the other three tires, and you'll have three lugnuts on each."
"That's brilliant!" says the motorist. "What's someone like you doing in an asylum?"
"I'm here because I'm crazy," says the patient, "not because I'm stupid."
Previously in series: Epistemic Viciousness
Robyn Dawes, author of one of the original papers from Judgment Under Uncertainty and of the book Rational Choice in an Uncertain World—one of the few who try really hard to import the results to real life—is also the author of House of Cards: Psychology and Psychotherapy Built on Myth.
From House of Cards, chapter 1:
The ability of these professionals has been subjected to empirical scrutiny—for example, their effectiveness as therapists (Chapter 2), their insight about people (Chapter 3), and the relationship between how well they function and the amount of experience they have had in their field (Chapter 4). Virtually all the research—and this book will reference more than three hundred empirical investigations and summaries of investigations—has found that these professionals' claims to superior intuitive insight, understanding, and skill as therapists are simply invalid...
Remember Rorschach ink-blot tests? It's such an appealing argument: the patient looks at the ink-blot and says what he sees, and the psychotherapist interprets his psychological state based on this. There have been hundreds of experiments looking for evidence that it actually works. Since you're reading this, you can guess the answer is simply "No." Yet the Rorschach is still in use. It's just such a good story that psychotherapists can't bring themselves to believe the vast mound of experimental evidence saying it doesn't work—
—which tells you what sort of field we're dealing with here.
And the experimental results on the field as a whole are commensurate. Yes, patients who see psychotherapists have been known to get better faster than patients who simply do nothing. But there is no statistically discernible difference between the many schools of psychotherapy. There is no discernible gain from years of expertise.
And there's also no discernible difference between seeing a psychotherapist and spending the same amount of time talking to a randomly selected college professor from another field. It's just talking to anyone that helps you get better, apparently.
In the entire absence of the slightest experimental evidence for their effectiveness, psychotherapists became licensed by states, their testimony accepted in court, their teaching schools accredited, and their bills paid by health insurance.
And there was also a huge proliferation of "schools", of traditions of practice, in psychotherapy; despite—or perhaps because of—the lack of any experiments showing that one school was better than another...
To teach people about a topic you've labeled "rationality", it helps for them to be interested in "rationality". (There are less direct ways to teach people how to attain the map that reflects the territory, or optimize reality according to their values; but the explicit method is the course I tend to take.)
And when people explain why they're not interested in rationality, one of the reasons most commonly proffered is something like: "Oh, I've known a couple of rational people and they didn't seem any happier."
That's really not a whole lot of rationality, as I have previously said.
Even if you limit yourself to people who can derive Bayes's Theorem—which is going to eliminate, what, 98% of the above personnel?—that's still not a whole lot of rationality. I mean, it's a pretty basic theorem.
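To make "pretty basic" concrete, here is the standard two-line derivation (not part of the original post): Bayes's Theorem falls directly out of the definition of conditional probability, applied in both directions.

```latex
% Definition of conditional probability, applied both ways to the joint event:
P(A \mid B)\, P(B) \;=\; P(A \cap B) \;=\; P(B \mid A)\, P(A)

% Divide through by P(B), assuming P(B) > 0:
P(A \mid B) \;=\; \frac{P(B \mid A)\, P(A)}{P(B)}
```

That's the whole theorem — which is the point: being able to derive it is a low bar, not a mark of deep rationality.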
Since the beginning I've had a sense that there ought to be some discipline of cognition, some art of thinking, the studying of which would make its students visibly more competent, more formidable: the equivalent of Taking a Level in Awesome.
But when I look around me in the real world, I don't see that. Sometimes I see a hint, an echo, of what I think should be possible, when I read the writings of folks like Robyn Dawes, Daniel Gilbert, Tooby & Cosmides. A few very rare and very senior researchers in psychological sciences, who visibly care a lot about rationality—to the point, I suspect, of making their colleagues feel uncomfortable, because it's not cool to care that much. I can see that they've found a rhythm, a unity that begins to pervade their arguments—
Yet even that... isn't really a whole lot of rationality either.
To paraphrase the Black Belt Bayesian: Behind every exciting, dramatic failure, there is a more important story about a larger and less dramatic failure that made the first failure possible.
If every trace of religion were magically eliminated from the world tomorrow, then—however much improved the lives of many people would be—we would not even have come close to solving the larger failures of sanity that made religion possible in the first place.
We have good cause to spend some of our efforts on trying to eliminate religion directly, because it is a direct problem. But religion also serves the function of an asphyxiated canary in a coal mine—religion is a sign, a symptom, of larger problems that don't go away just because someone loses their religion.
Consider this thought experiment—what could you teach people that is not directly about religion, which is true and useful as a general method of rationality, which would cause them to lose their religions? In fact—imagine that we're going to go and survey all your students five years later, and see how many of them have lost their religions compared to a control group; if you make the slightest move at fighting religion directly, you will invalidate the experiment. You may not make a single mention of religion or any religious belief in your classroom, you may not even hint at it in any obvious way. All your examples must center on real-world cases that have nothing to do with religion.
If you can't fight religion directly, what do you teach that raises the general waterline of sanity to the point that religion goes underwater?