The Sin of Underconfidence
There are three great besetting sins of rationalists in particular, and the third of these is underconfidence. Michael Vassar regularly accuses me of this sin, which makes him unique among the entire population of the Earth.
But he's actually quite right to worry, and I worry too, and any adept rationalist will probably spend a fair amount of time worrying about it. When subjects know about a bias or are warned about a bias, overcorrection is not unheard of as an experimental result. That's what makes a lot of cognitive subtasks so troublesome—you know you're biased but you're not sure how much, and you don't know if you're correcting enough—and so perhaps you ought to correct a little more, and then a little more, but is that enough? Or have you, perhaps, far overshot? Are you now perhaps worse off than if you hadn't tried any correction?
You contemplate the matter, feeling more and more lost, and the very task of estimation begins to feel increasingly futile...
And when it comes to the particular questions of confidence, overconfidence, and underconfidence—being interpreted now in the broader sense, not just calibrated confidence intervals—then there is a natural tendency to cast overconfidence as the sin of pride, out of that other list which never warned against the improper use of humility or the abuse of doubt. To place yourself too high—to overreach your proper place—to think too much of yourself—to put yourself forward—to put down your fellows by implicit comparison—and the consequences of humiliation and being cast down, perhaps publicly—are these not loathsome and fearsome things?
To be too modest—seems lighter by comparison; it wouldn't be so humiliating to be called on it publicly, indeed, finding out that you're better than you imagined might come as a warm surprise; and to put yourself down, and others implicitly above, has a positive tinge of niceness about it; it's the sort of thing that Gandalf would do.
So if you have learned a thousand ways that humans fall into error and read a hundred experimental results in which anonymous subjects are disabused of their overconfidence—heck, even if you've just read a couple of dozen—and you don't know exactly how overconfident you are—then yes, you might genuinely be in danger of nudging yourself a step too far down.
Great Books of Failure
Followup to: Unteachable Excellence
As previously observed, extraordinary successes tend to be considered extraordinary precisely because they are hard to teach (relative to the then-current level of understanding and systematization). On the other hand, famous failures are much more likely to contain lessons on what to avoid next time.
Books about epic screwups have constituted some of my more enlightening reading. Do you have any such books to recommend?
Please break up multiple recommendations into multiple comments, one book per comment, so they can be voted on and discussed separately. And please say at least a little about the book's subject and what sort of lesson you learned from it.
Bayesians vs. Barbarians
Previously in series: Collective Apathy and the Internet
Followup to: Helpless Individuals
Let's say we have two groups of soldiers. In group 1, the privates are ignorant of tactics and strategy; only the sergeants know anything about tactics and only the officers know anything about strategy. In group 2, everyone at all levels knows all about tactics and strategy.
Should we expect group 1 to defeat group 2, because group 1 will follow orders, while everyone in group 2 comes up with better ideas than whatever orders they were given?
In this case I have to question how much group 2 really understands about military theory, because it is an elementary proposition that an uncoordinated mob gets slaughtered.
Suppose that a country of rationalists is attacked by a country of Evil Barbarians who know nothing of probability theory or decision theory.
Now there's a certain viewpoint on "rationality" or "rationalism" which would say something like this:
"Obviously, the rationalists will lose. The Barbarians believe in an afterlife where they'll be rewarded for courage; so they'll throw themselves into battle without hesitation or remorse. Thanks to their affective death spirals around their Cause and Great Leader Bob, their warriors will obey orders, and their citizens at home will produce enthusiastically and at full capacity for the war; anyone caught skimming or holding back will be burned at the stake in accordance with Barbarian tradition. They'll believe in each other's goodness and hate the enemy more strongly than any sane person would, binding themselves into a tight group. Meanwhile, the rationalists will realize that there's no conceivable reward to be had from dying in battle; they'll wish that others would fight, but not want to fight themselves. Even if they can find soldiers, their civilians won't be as cooperative: So long as any one sausage almost certainly doesn't lead to the collapse of the war effort, they'll want to keep that sausage for themselves, and so not contribute as much as they could. No matter how refined, elegant, civilized, productive, and nonviolent their culture was to start with, they won't be able to resist the Barbarian invasion; sane discussion is no match for a frothing lunatic armed with a gun. In the end, the Barbarians will win because they want to fight, they want to hurt the rationalists, they want to conquer and their whole society is united around conquest; they care about that more than any sane person would."
Collective Apathy and the Internet
Previously in series: Beware of Other-Optimizing
Followup to: Bystander Apathy
Yesterday I covered the bystander effect, aka bystander apathy: given a fixed problem situation, a group of bystanders is actually less likely to act than a single bystander. The standard explanation for this result is in terms of pluralistic ignorance (if it's not clear whether the situation is an emergency, each person tries to look calm while darting their eyes at the other bystanders, and sees other people looking calm) and diffusion of responsibility (everyone hopes that someone else will be first to act; being part of a crowd diminishes the individual pressure to the point where no one acts).
Which may be a symptom of our hunter-gatherer coordination mechanisms being defeated by modern conditions. You didn't usually form task-forces with strangers back in the ancestral environment; it was mostly people you knew. And in fact, when all the subjects know each other, the bystander effect diminishes.
So I know this is an amazing and revolutionary observation, and I hope that I don't kill any readers outright from shock by saying this: but people seem to have a hard time reacting constructively to problems encountered over the Internet.
Perhaps because our innate coordination instincts are not tuned for:
- Being part of a group of strangers. (When all subjects know each other, the bystander effect diminishes.)
- Being part of a group of unknown size, of strangers of unknown identity.
- Not being in physical contact (or visual contact); not being able to exchange meaningful glances.
- Not communicating in real time.
- Not being much beholden to each other for other forms of help; not being codependent on the group you're in.
- Being shielded from reputational damage, or the fear of reputational damage, by your own apparent anonymity; no one is visibly looking at you, before whom your reputation might suffer from inaction.
- Being part of a large collective of other inactives; no one will single you out for blame.
- Not hearing a voiced plea for help.
Bystander Apathy
The bystander effect, also known as bystander apathy, is that larger groups are less likely to act in emergencies - not just individually, but collectively. Put an experimental subject alone in a room and let smoke start coming up from under the door. 75% of the subjects will leave to report it. Now put three subjects in the room - real subjects, none of whom know what's going on. On only 38% of the occasions will anyone report the smoke. Put the subject with two confederates who ignore the smoke, and they'll only report it 10% of the time - even staying in the room until it becomes hazy. (Latane and Darley 1969.)
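To see how far the group result falls below what independent action would predict, here is a minimal back-of-the-envelope check; the 75% and 38% figures are those quoted above, and the independence assumption is mine, introduced purely for illustration:

    # If each of n bystanders independently reported the smoke at the solo
    # rate p, the chance that at least one person reports is 1 - (1 - p)**n.
    solo_rate = 0.75            # a single subject reports the smoke
    group_size = 3              # three naive subjects together

    independent_baseline = 1 - (1 - solo_rate) ** group_size
    observed_group_rate = 0.38  # what Latane and Darley actually observed

    print(f"Expected if independent: {independent_baseline:.0%}")  # ~98%
    print(f"Observed with 3 subjects: {observed_group_rate:.0%}")  # 38%

So the group is not merely failing to add up individual propensities; being together actively suppresses them.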
On the standard model, the two primary drivers of bystander apathy are:
- Diffusion of responsibility - everyone hopes that someone else will be first to step up and incur any costs of acting. When no one does act, being part of a crowd provides an excuse and reduces the chance of being held personally responsible for the results.
- Pluralistic ignorance - people try to appear calm while looking for cues, and see... that the others appear calm.
Cialdini (2001):
Very often an emergency is not obviously an emergency. Is the man lying in the alley a heart-attack victim or a drunk sleeping one off? ... In times of such uncertainty, the natural tendency is to look around at the actions of others for clues. We can learn from the way the other witnesses are reacting whether the event is or is not an emergency. What is easy to forget, though, is that everybody else observing the event is likely to be looking for social evidence, too. Because we all prefer to appear poised and unflustered among others, we are likely to search for that evidence placidly, with brief, camouflaged glances at those around us. Therefore everyone is likely to see everyone else looking unruffled and failing to act.
Cialdini suggests that if you're ever in emergency need of help, you point to one single bystander and ask them for help - making it very clear to whom you're referring. Remember that the total group, combined, may have less chance of helping than one individual.
How Much Thought
We have many built-in heuristics, and most of them are trouble. The absurdity heuristic makes us reject reasonable things out of hand, so we should take the time to fully understand things that seem absurd at first. Some of our beliefs are not reasoned, but inherited; we should sniff those out and discard them. We repeat cached thoughts, so we should clear and rethink them. The affect heuristic is a tricky one; to work around it, we have to take the outside view. Everything we see and do primes us, so for really important decisions, we should never leave our rooms. We fail to attribute agency to things which should have it, like opinions, so if less drastic means don't work, we should modify English to make ourselves do so.
All of these articles bear the same message, the same message that can be easily found in the subtext of every book, treatise and example of rationality. Think more. Look for the third alternative. Challenge your deeply held beliefs. Drive through semantic stop signs. Prepare a line of retreat. If you don't understand, you should make an extraordinary effort. When you do find cause to change your beliefs, complete a checklist, run a script and follow a ritual. Recheck your answers, because thinking helps; more thought is always better.
The problem is, there's only a limited amount of time in each day. To spend more time thinking about something, we must spend less time on something else. The more we think about each topic, the fewer topics we have time to think about at all. Rationalism gives us a long list of extra things to think about, and angles to think about them from, without guidance on where or how much to apply them. This can make us overthink some things and disastrously underthink others. Our worst mistakes are not those where our thoughts went astray, but those we failed to think about at all. The time between when we learn rationality techniques and when we learn where to apply them is the valley.
Sunk Cost Fallacy
Related to: Just Lose Hope Already, The Allais Paradox, Cached Selves
In economics we have this concept of sunk costs, referring to costs that have already been incurred, but which cannot be recouped. Sunk cost fallacy refers to the fallacy of honoring sunk costs, which decision-theoretically should just be ignored. The canonical example goes something like this: you have purchased a nonrefundable movie ticket in advance. (For the nitpickers in the audience, I will also specify that the ticket is nontransferable and that you weren't planning on meeting anyone.) When the night of the show comes, you notice that you don't actually feel like going out, and would actually enjoy yourself more at home. Do you go to the movie anyway?
A lot of people say yes, to avoid wasting the ticket. But on further consideration, it would seem that these people are simply getting it wrong. The ticket is a sunk cost: it's already paid for, and you can't do anything with it but go to the movie. But we've stipulated that you don't want to go to the movie. The theater owners don't care whether you go; they already have their money. The other theater-goers, insofar as they can be said to have a preference, would actually rather you stayed home, making the theater marginally less crowded. If you go to the movie to satisfy your intuition about not wasting the ticket, you're not actually helping anyone. Of course, you're entitled to your values, if not your beliefs. If you really do place terminal value on using something because you've paid for it, well, fine, I guess. But we should all try to notice exactly what it is we're doing, in case it turns out to not be what we want. Please, think it through.
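To make the structure of the decision explicit, here is a minimal sketch; the utility numbers are invented purely for illustration:

    # Hypothetical utilities, chosen only to show the structure of the choice.
    ticket_cost = -10   # already paid; a sunk cost
    enjoy_movie = 4     # how much you'd enjoy going out tonight
    enjoy_home = 7      # how much you'd enjoy staying in

    # Totals including the sunk cost:
    go_total = ticket_cost + enjoy_movie    # -6
    stay_total = ticket_cost + enjoy_home   # -3

    # ticket_cost appears in both totals, so the comparison reduces to
    # enjoy_movie vs. enjoy_home; the sunk cost cannot affect the answer.
    print("Go to the movie" if go_total > stay_total else "Stay home")

The ticket price cancels out of every option still available to you, which is exactly why decision theory says to ignore it.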
Dearest reader, if you're now about to scrap your intuition against wasting things, I implore you: don't! The moral of the parable of the movie ticket is not that waste is okay; it's that you should implement your waste-reduction interventions at a time when they can actually help. If you can anticipate your enthusiasm waning on the night of the show, don't purchase the nonrefundable ticket in the first place!
It's okay to be (at least a little) irrational
Caused by: Purchase Fuzzies and Utilons Separately
As most readers will know by now, if you're donating to a charity, it doesn't make sense to spread your donations across several charities (assuming you're primarily trying to maximize the amount of good done). You'll want to pick the charity where your money does the most good, and then donate as much as possible to that one. Most readers will also be aware that this isn't intuitive to most people - many will instinctively try to spread their money across several different causes.
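The argument is just that, for donations too small to change any charity's marginal impact, the total good done is linear in each donation, and a linear objective over a fixed budget is maximized by putting everything on the best option. A minimal sketch, using the 30 USD monthly budget mentioned below and impact figures made up purely for illustration:

    # Hypothetical constant marginal impact (good done per dollar) per charity.
    impact_per_dollar = {"charity_a": 3.0, "charity_b": 2.5, "charity_c": 1.0}
    budget = 30.0  # dollars per month

    # Splitting the budget evenly across all three:
    split_good = sum(impact * budget / len(impact_per_dollar)
                     for impact in impact_per_dollar.values())    # 65.0

    # Concentrating everything on the highest-impact charity:
    best = max(impact_per_dollar, key=impact_per_dollar.get)
    concentrated_good = impact_per_dollar[best] * budget          # 90.0

    print(split_good, concentrated_good)

Any split other than all-on-the-best moves dollars from a higher marginal impact to a lower one, and so does strictly less good.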
I'm spending part of my income on charity, too. Admittedly, this isn't much - 30 USD each month - but then neither is my income as a student. Previously I had been spreading that sum to three different charities, each of them getting an equal amount. On at least two different venues, people had (not always knowingly) tried to talk me out of it, and I did feel that their arguments were pretty strong. Still, I didn't change my ways, even though there was mental pressure building up, trying to push me in that direction. There were actually even some other charities I was considering also donating to, even though I knew I probably shouldn't.
Then I read Eliezer's Purchase Fuzzies and Utilons Separately. Here was a post saying, in essence, that it's okay to spend some of your money in what amounted to an irrational way. Yes, go ahead and spread your money, and go ahead and use some of it just to purchase warm fuzzies. You're just human, after all. Just try to make sure you still donate more to a utilon maximizer than to purchasing the fuzzies.
Here I was, with a post that allowed me to stop rationalizing reasons for why spreading money was good, and instead spread it because I was honestly selfish and just buying a good feeling. Now, I didn't need to worry about being irrational in having diversified donations. So since it was okay, I logged in to PayPal, cancelled the two monthly donations I had going to the other organizations, and tripled the amount of money that I was giving to the Institute Which Shall Not Be Named.
Not exactly the outcome one might have suspected.
The uniquely awful example of theism
When an LW contributor is in need of an example of something that (1) is plainly, uncontroversially (here on LW, at least) very wrong but (2) an otherwise reasonable person might get lured into believing by dint of inadequate epistemic hygiene, there seems to be only one example that everyone reaches for: belief in God. (Of course there are different sorts of god-belief, but I don't think that makes it count as more than one example.) Eliezer is particularly fond of this trope, but he's not alone.
How odd that there should be exactly one example. How convenient that there is one at all! How strange that there isn't more than one!
In the population at large (even the smarter parts of it) god-belief is sufficiently widespread that using it as a canonical example of irrationality would run the risk of annoying enough of your audience to be counterproductive. Not here, apparently. Perhaps we-here-on-LW are just better reasoners than everyone else ... but then, again, isn't it strange that there aren't a bunch of other popular beliefs that we've all seen through? In the realm of politics or economics, for instance, surely there ought to be some.
Also: it doesn't seem to me that I'm that much better a thinker than I was a few years ago when (alas) I was a theist; nor does it seem to me that everyone on LW is substantially better at thinking than I am; which makes it hard for me to believe that there's a certain level of rationality that almost everyone here has attained, and that makes theism vanishingly rare.
I offer the following uncomfortable conjecture: We all want to find (and advertise) things that our superior rationality has freed us from, or kept us free from. (Because the idea that Rationality Just Isn't That Great is disagreeable when one has invested time and/or effort and/or identity in rationality, and because we want to look impressive.) We observe our own atheism, and that everyone else here seems to be an atheist too, and not unnaturally we conclude that we've found such a thing. But in fact (I conjecture) LW is so full of atheists not only because atheism is more rational than theism (note for the avoidance of doubt: yes, I agree that atheism is more rational than theism, at least for people in our epistemic situation) but also because
Extreme Rationality: It's Not That Great
Related to: Individual Rationality is a Matter of Life and Death, The Benefits of Rationality, Rationality is Systematized Winning
But I finally snapped after reading: Mandatory Secret Identities
Okay, the title was for shock value. Rationality is pretty great. Just not quite as great as everyone here seems to think it is.
For this post, I will be using "extreme rationality" or "x-rationality" in the sense of "techniques and theories from Overcoming Bias, Less Wrong, or similar deliberate formal rationality study programs, above and beyond the standard level of rationality possessed by an intelligent science-literate person without formal rationalist training." It seems pretty uncontroversial that there are massive benefits from going from a completely irrational moron to the average intelligent person's level. I'm coining this new term so there's no temptation to confuse x-rationality with normal, lower-level rationality.
And for this post, I use "benefits" or "practical benefits" to mean anything not relating to philosophy, truth, winning debates, or a sense of personal satisfaction from understanding things better. Money, status, popularity, and scientific discovery all count.
So, what are these "benefits" of "x-rationality"?
A while back, Vladimir Nesov asked exactly that, and made a thread for people to list all of the positive effects x-rationality had on their lives. Only a handful responded, and most responses weren't very practical. Anna Salamon, one of the few people to give a really impressive list of benefits, wrote:
I'm surprised there are so few apparent gains listed. Are most people who benefited just being silent? We should expect a certain number of headache-cures, etc., just by placebo effects or coincidences of timing.
There have since been a few more people claiming practical benefits from x-rationality, but we should generally expect more people to claim benefits than to actually experience them. Anna mentions the placebo effect, and to that I would add cognitive dissonance - people spent all this time learning x-rationality, so it MUST have helped them! - and the same sort of confirmation bias that makes Christians swear that their prayers really work.
I find my personal experience in accord with the evidence from Vladimir's thread. I've gotten countless clarity-of-mind benefits from Overcoming Bias' x-rationality, but practical benefits? Aside from some peripheral disciplines[1], I can't think of any.
Looking over history, I do not find any tendency for successful people to have made a formal study of x-rationality. This isn't entirely fair, because the discipline has expanded vastly over the past fifty years, but the basics - syllogisms, fallacies, and the like - have been around much longer. The few groups who made a concerted effort to study x-rationality didn't shoot off an unusual number of geniuses - the Korzybskians are a good example. In fact as far as I know the only follower of Korzybski to turn his ideas into a vast personal empire of fame and fortune was (ironically!) L. Ron Hubbard, who took the basic concept of techniques to purge confusions from the mind, replaced the substance with a bunch of attractive flim-flam, and founded Scientology. And like Hubbard's superstar followers, many of this century's most successful people have been notably irrational.
There seems to me to be approximately zero empirical evidence that x-rationality has a large effect on your practical success, and some anecdotal empirical evidence against it. The evidence in favor of the proposition right now seems to be its sheer obviousness. Rationality is the study of knowing the truth and making good decisions. How the heck could knowing more than everyone else and making better decisions than them not make you more successful?!?
This is a difficult question, but I think it has an answer. A complex, multifactorial answer, but an answer.