I just finished reading a history of Enron’s downfall, The Smartest Guys in the Room, which hereby wins my award for “Least Appropriate Book Title.”
An unsurprising feature of Enron’s slow rot and abrupt collapse was that the executive players never admitted to having made a large mistake. When catastrophe #247 grew to such an extent that it required an actual policy change, they would say, “Too bad that didn’t work out—it was such a good idea—how are we going to hide the problem on our balance sheet?” As opposed to, “It now seems obvious in retrospect that it was a mistake from the beginning.” As opposed to, “I’ve been stupid.” There was never a watershed moment, a moment of humbling realization, of acknowledging a fundamental problem. After the bankruptcy, Jeff Skilling, the former COO and brief CEO of Enron, declined his own lawyers’ advice to take the Fifth Amendment; he testified before Congress that Enron had been a great company.
Not every change is an improvement, but every improvement is necessarily a change. If we only admit small local errors, we will only make small local changes. The motivation for a big change comes from acknowledging a big mistake.
As a child I was raised on equal parts science and science fiction, and from Heinlein to Feynman I learned the tropes of Traditional Rationality: theories must be bold and expose themselves to falsification; be willing to commit the heroic sacrifice of giving up your own ideas when confronted with contrary evidence; play nice in your arguments; try not to deceive yourself; and other fuzzy verbalisms.
An upbringing in Traditional Rationality tries to produce arguers who will concede to contrary evidence eventually; there should be some mountain of evidence sufficient to move you. This is not trivial; it distinguishes science from religion. But there is less focus on speed, on giving up the fight as quickly as possible, on integrating evidence so efficiently that it takes only a minimum of contrary evidence to destroy your cherished belief.
I was raised in Traditional Rationality, and thought myself quite the rationalist. I switched to Bayescraft (Laplace / Jaynes / Tversky / Kahneman) in the aftermath of . . . well, it’s a long story. Roughly, I switched because I realized that Traditional Rationality’s fuzzy verbal tropes had been insufficient to prevent me from making a large mistake.
After I had finally and fully admitted my mistake, I looked back upon the path that had led me to my Awful Realization. And I saw that I had made a series of small concessions, minimal concessions, grudgingly conceding each millimeter of ground, realizing as little as possible of my mistake on each occasion, admitting failure only in small tolerable nibbles. I could have moved so much faster, I realized, if I had simply screamed “OOPS!”
And I thought: I must raise the level of my game.
There is a powerful advantage to admitting you have made a large mistake. It’s painful. It can also change your whole life.
It is important to have the watershed moment, the moment of humbling realization. To acknowledge a fundamental problem, not divide it into palatable bite-size mistakes.
Do not indulge in drama and become proud of admitting errors. It is surely superior to get it right the first time. But if you do make an error, better by far to see it all at once. Even hedonically, it is better to take one large loss than many small ones. The alternative is stretching out the battle with yourself over years. The alternative is Enron.
Since then I have watched others making their own series of minimal concessions, grudgingly conceding each millimeter of ground; never confessing a global mistake where a local one will do; always learning as little as possible from each error. What they could fix in one fell swoop voluntarily, they transform into tiny local patches they must be argued into. Never do they say, after confessing one mistake, I’ve been a fool. They do their best to minimize their embarrassment by saying I was right in principle, or It could have worked, or I still want to embrace the true essence of whatever-I’m-attached-to. Defending their pride in this passing moment, they ensure they will again make the same mistake, and again need to defend their pride.
Better to swallow the entire bitter pill in one terrible gulp.
And then there are the legions of people who do not admit to even the tiniest mistake. To these people, incongruent information is to be ignored at all costs. And I do mean all costs: when my unvaccinated uncle died of Covid, my unvaccinated dad did not consider this to be evidence that Covid was dangerous, because my uncle also showed signs of having had a stroke around the same time, and we can be 100% certain this was the sole reason he was put on a ventilator and died. (Of course, this is not how he phrased it; he seems to have an extreme self-blinding technique, such that if a stroke could have killed his brother, there is nothing more to say or think about the matter and We Will Not Discuss It Further.) It did not sway him, either, when his favorite anti-vax pastor Marcus Lamb died of Covid, though he had no other cause of death to propose.
I think this type of person includes some of the most popular and extreme figures in politics. And their followers, such as my dad, do the same thing.
But they never admit it. They may even use the language of changing their mind: "I was wrong... it turns out the conspiracy is even bigger than I thought!" And I think a lot of people who can change their mind get roped in by those who can't. Myself, for instance: my religion taught me it was important to tell the truth, but eventually I found out that key information was hidden from me, filtered out by leaders who taught "tell the truth" and "choose the right". The hypocrisy was not obvious, and it took me far too long to detect it.
I'm so glad there's a corner of the internet for people who can change their minds faster than scientists do, even when the information comes from the "wrong" side. Like the time a climate-science denier told me that the warming effect of CO2 grows only logarithmically with concentration, and within a day or two I figured out he was right. Some more recent flip-flops of mine: Covid origin (natural origin => likely lab leak => natural origin); Russia's invasion of Ukraine (Kyiv will fall => Russia's losing => stalemate).
But it's not enough; we need to scale rationality up. Eliezer mainly preached individual rationality, with "rationality dojos" and such, but figuring out the truth is very hard in a media environment where nearly two thirds of everybody give up each centimetre of ground grudgingly, and the other third won't give up even a single millimetre (at least not until the rest of the tribe has given up a few metres first). And maybe it's worse; maybe it's half-and-half. In this environment it's often a lot of work even for aspiring rationalists to reach even a rough approximation of the truth. I think we can do better, and I've proposed a technological solution, but after seven months no one has upvoted the idea or even tried to criticize it.
Not knocking your idea, but when you're tempted to complain that "no one has upvoted me," it's usually worth reconsidering whether you really want to blame other people.
I can guess at one reason people may not have read the post you linked. I found it long-winded, like a page out of your diary in which you're still developing the idea, thinking aloud by writing. That's an excellent thing to do, but it doesn't read like something written from the start for other people, so it's hard to follow. At least, I'm still puzzled about what you wanted to put forward in it.