"No," you say, "I'm talking about how startup founders strike it rich by believing in themselves and their ideas more strongly than any reasonable person would. ..."
It's important to realize that this is another myth perpetuated by the media and our ignorance of the statistics. Most startups fail; I think the statistics are that 80% die in the first 5 years. But the ones that get written up in glowing articles are the ones that succeeded. Of course all those founders who struck it rich believed strongly in their ideas, but so did many of those that failed. That irrational belief may be a crucial ingredient for success, but it doesn't supply a guarantee. Most of the people who held that irrational belief worked for businesses that failed--but they didn't get their name in the paper, so they're relatively invisible.
Still, if everyone who does succeed has an irrational belief in their own success, then it's not wrong to conclude that such a belief is probably a prerequisite (though certainly not a "guarantee") for success.
Some historical context:
Rationalists from the 16th through 19th centuries advocated views something like those Eliezer is advocating. This outlook was eventually reflected in the art of the day, as exemplified by Bach and, later, by the strict formalisms of classical music.
In the 19th century, romanticism was an artistic reaction against rationalism. We're talking Goethe, Beethoven, Byron, and Blake. In painting, it was also a reaction against photography, a search for a justification for continuing to paint.
During the romantic period, Nietzsche used romantic artistic ideas to criticize rationality, saying that life is worth living when we commit to values, and that rationality undermines our commitment to our values. He offered as an alternative the culture/value creator, who leads his culture to greatness. This greatness, he says, can only be attained if we reject rationalism. There is some happiness theory in there as well, including the idea that war isn't justified by values; rather, war justifies values. This seems to be a riff on the idea that the striving and drama are themselves what we value.
In the 20th century, Max Weber rephrased it this way: societies are legitimized by tradition, rational-legal authority, or charisma.
I'd like to see it as its own post, illustrated with quotes from Nietzsche or quotes from those interpreting Nietzsche.
Oh no, more grandeur.
A rationalist can take a small concrete problem, reduce it to essentials, figure out a good strategy, and follow it. No need to brainf*ck yourself and reevaluate your whole life - people have built bridges and discovered physical laws without it. For examples of what I want, see Thomas Schelling's "The Strategy of Conflict": no mystique, just clear mathematical analysis of many real-life problems. It starts out from toys, e.g. bargaining games and the Prisoner's Dilemma (PD), and culminates in lots of useful tactics for nuclear deterrence that were actually adopted by the US military after the book's publication. How's that for "something to protect"?
I for one would be happy if you just wrote up, mathematically, your solution concept for Newcomb's and PD. Is it an extension of superrationality for asymmetric games, or something else entirely? If we slowly modify one player's payoffs in PD, at what precise moment do you stop cooperating?
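To make that last question concrete, here is a minimal sketch (payoff numbers invented) of one naive reading, in which superrational players are simply assumed to choose identically; whether a better solution concept agrees with its threshold is exactly the question:

```python
# A naive "identical choices" reading of superrationality: the only reachable
# outcomes are (C,C) and (D,D), so cooperate iff both players prefer (C,C).
# R = payoff for mutual cooperation, P = payoff for mutual defection.

def naive_superrational_choice(R1, P1, R2, P2):
    return "C" if R1 > P1 and R2 > P2 else "D"

# Hold player 2 at classic PD values (R2=3, P2=1) and slide player 1's
# mutual-cooperation payoff R1 down toward and past P1=1:
for R1 in [3.0, 2.0, 1.5, 1.01, 0.99, 0.5]:
    print(R1, naive_superrational_choice(R1, 1.0, 3.0, 1.0))

# This criterion flips from C to D exactly where R1 crosses P1; whether the
# right solution concept flips there too is what's being asked above.
```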
When there is a conventional wisdom it usually pays for most people to become more rational just so they can better anticipate, assimilate, remember and use that conventional wisdom. But once your rationality becomes so strong that it leads you to often reject conventional wisdom, then you face a tougher tradeoff; there can be serious social costs from rejecting conventional wisdom.
Things are actually a bit worse than this, because there is also no theorem that says there is only one valley, so there's no guarantee that even after you climb out of this valley, your next step won't cause you to go off a precipice.
BTW, there's a very similar issue in economics, which goes under the name of the Theory of the Second Best. Markets will allocate resources efficiently if they are perfectly competitive and complete, but there is no guarantee that any incremental progress toward that state, such as creating some markets that were previously missing, or making some markets more competitive, will improve social welfare.
An incremental step can be a loss where you have two errors reversing each other. You have error A that causes suffering a and error B that causes anti-a. You cure B, and suddenly you experience a. The anti-rationalist says "quick, reinstate B". I say "no, work back from a to A and cure A".
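Here's a toy model of that (all numbers invented): error A doubles an agent's probability estimates, error B halves its payoff estimates, so its decisions come out right; cure B alone and A is suddenly exposed.

```python
import random

# Each round the agent may pay 1 to take a gamble that pays v with
# probability p. It takes the gamble iff its *estimated* expected value
# exceeds the cost. Error A doubles p; error B halves v; together they cancel.

random.seed(0)

def average_winnings(prob_bias, payoff_bias, trials=100_000):
    total = 0.0
    for _ in range(trials):
        p = random.uniform(0.0, 1.0)   # true win probability
        v = random.uniform(0.0, 2.0)   # true payoff on a win
        if (p * prob_bias) * (v * payoff_bias) > 1.0:  # estimated EV > cost
            total += (v if random.random() < p else 0.0) - 1.0
    return total / trials

print("errors A and B (canceling):", average_winnings(2.0, 0.5))  # ~ +0.05
print("error B cured, A remains: ", average_winnings(2.0, 1.0))   # ~ -0.02
print("both errors cured:        ", average_winnings(1.0, 1.0))   # ~ +0.05
```

One step "toward" rationality leaves the agent worse off than before; only curing A as well climbs out of the valley.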
Example: pessimists make better-calibrated estimates but are worse off for happiness and health. IMO the pessimists are probably not accepting the reality they predict; they are railing against it, which is a variety of magical thinking.
Even perfectly rational agents can lose. They just can't know in advance that they'll lose. They can't expect to underperform any other performable strategy, or they would simply perform it.
I think your formulation in this post is the clearest, and I agree with it. In previous posts, you may have said things which confused your point, such as this:
Said I: "If you fail to achieve a correct answer, it is futile to protest that you acted with propriety."
The strong interpretation of this quote is that if you lose, you weren't being rational...
But if you don't care about the truth - and you have nothing to protect - and you're not attracted to the thought of pushing your art as far as it can go - and your current life seems to be going fine - and you have a sense that your mental well-being depends on illusions you'd rather not think about -
...then it may already be too late, since the seed of doubt is already planted.
Most people are not signed up for cryonics, so if you postulate that the "benefit" to an individual of cryonics is massive compared to the increment in quality of life that being irrationally comforted brings, then almost everyone ought to be epistemically rational.
Rationality does not guarantee results at the single human scale.
Making a decision that is statistically correct only works out in the long run, over a number of such decisions.
You can make a decision that was correct given the information you had, and it can still fail to work out.
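A toy illustration of this, with made-up numbers:

```python
import random

# A bet that wins 60% of the time, paying +1 on a win and -1 on a loss.
# Taking it is statistically correct (expected value +0.2 per bet),
# yet short runs of such bets still lose outright quite often.

random.seed(0)

def run(n_bets):
    return sum(1 if random.random() < 0.6 else -1 for _ in range(n_bets))

five_bet_runs = [run(5) for _ in range(10_000)]
print(sum(r < 0 for r in five_bet_runs) / len(five_bet_runs))  # ~0.32 end at a loss
print(run(100_000))  # close to +20,000: the long run vindicates the decision
```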
This is really important.
From a statistical standpoint, lottery winners don't exist - you would never encounter one in your lifetime, if it weren't for the selective reporting.
Well... one of my grandmothers' neighbors, whose son I played with as a child, did indeed win the lottery. (AFAIK, it was a relatively modest jackpot, but he did win!)
Also, re: cryonics: My current understanding is that being an organ donor is incompatible with cryonic preservation. Is this correct? (Myself, I think I'd rather be an organ donor...)
IAWYC, but am confused by the phrase
If you don't prefer truth to happiness with false beliefs...
Does it make sense to talk about preferring something over happiness? I know what you mean if we take a folk definition of happiness as something like "bubbly feelings". But I don't think you mean folk happiness; for this statement to have impact, it has to mean Happiness, defined to include all of your values.
I think what I'm trying to ask is: Isn't it by definition irrational (failing to maximize your happiness) to prefer truth to happiness?
And only now do I finally get why some of the people I know kept telling me, again and again, "okay, but rationality is not enough for everyone to get through their lives; people need something to believe in...": they were just picturing the step of being "realistic".
It has dawned on me that nearly all the illusions I was wrapped in were making my life considerably unhappier.
I guess that's why I've never experienced anything close to finding myself worse off because of studying rationality, not even after the first steps.
Eliezer said: "Even the surveys are comparing the average religious person to the average atheist, not the most advanced theologians to the most advanced rationalists."
Very true. Wouldn't it be a kicker if that were done and we found out that the most advanced theologians ARE the most advanced rationalists? I suspect the chances of something like this being true are higher than most of us think.
I think that it is very important to look at how much work the commenter put into their comment.
One thing that kills discussion boards is that the conversations become too cliched. Mr. A makes the standard comment. Mr. B makes the standard rebuttal. Mr. A makes the standard defence. Mr. B makes the traditional follow-up.
When Mr. A makes the standard comment, is that for real, or is it just trolling? Tough question. I think that there comes a point at which one has to get tough and do drive-by downvoting on valid, on-topic comments, because they are commonplace and threaten to destroy the discussion by making it too familiar, swamping the discussion with the banal.
The other side to this is if Mr. A makes a three-paragraph comment: 1) his point, 2) the standard rebuttal, 3) why he thinks his point survives the standard rebuttal. At this point we know that Mr. A is not a troll. He has put in too much work to count coup on getting a bite. He is making an effort to move the discussion on briskly so that it can reach unbroken ground. He has earned an explanation of why his comment is crap, and I would say that he has earned the right to an actual typed-in criticism instead of a downvote...
Yesterday I said: "Rationality is systematized winning."
"But," you protest, "the reasonable person doesn't always win!"
What do you mean by this? Do you mean that every week or two, someone who bought a lottery ticket with negative expected value, wins the lottery and becomes much richer than you? That is not a systematic loss; it is selective reporting by the media. From a statistical standpoint, lottery winners don't exist—you would never encounter one in your lifetime, if it weren't for the selective reporting.
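To put rough numbers on that (assuming, say, jackpot odds of 1 in 175 million per ticket, in the ballpark of a real national lottery):

```python
# Back-of-the-envelope, with assumed numbers: jackpot odds of 1 in 175
# million per ticket, one ticket a week for an 80-year adult lifetime.
p_jackpot = 1 / 175_000_000
tickets = 52 * 80
print(1 - (1 - p_jackpot) ** tickets)  # ~2.4e-5: about 1 lifetime in 42,000
```

Even a lifetime of weekly play wins the jackpot in roughly one lifetime out of 42,000; the winners you hear about are sampled from hundreds of millions of such lifetimes.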
Even perfectly rational agents can lose. They just can't know in advance that they'll lose. They can't expect to underperform any other performable strategy, or they would simply perform it.
"No," you say, "I'm talking about how startup founders strike it rich by believing in themselves and their ideas more strongly than any reasonable person would. I'm talking about how religious people are happier—"
Ah. Well, here's the thing: An incremental step in the direction of rationality, if the result is still irrational in other ways, does not have to yield incrementally more winning.
The optimality theorems that we have for probability theory and decision theory are for perfect probability theory and decision theory. There is no companion theorem which says that, starting from some flawed initial form, every incremental modification of the algorithm that takes the structure closer to the ideal must yield an incremental improvement in performance. This has not yet been proven, because it is not, in fact, true.
"So," you say, "what point is there then in striving to be more rational? We won't reach the perfect ideal. So we have no guarantee that our steps forward are helping."
You have no guarantee that a step backward will help you win, either. Guarantees don't exist in the world of flesh; but contrary to popular misconceptions, judgment under uncertainty is what rationality is all about.
"But we have several cases where, based on either vaguely plausible-sounding reasoning, or survey data, it looks like an incremental step forward in rationality is going to make us worse off. If it's really all about winning—if you have something to protect more important than any ritual of cognition—then why take that step?"
Ah, and now we come to the meat of it.
I can't necessarily answer for everyone, but...
My first reason is that, on a professional basis, I deal with deeply confused problems that make huge demands on precision of thought. One small mistake can lead you astray for years, and there are worse penalties waiting in the wings. An unimproved level of performance isn't enough; my choice is to try to do better, or give up and go home.
"But that's just you. Not all of us lead that kind of life. What if you're just trying some ordinary human task like an Internet startup?"
My second reason is that I am trying to push some aspects of my art further than I have seen done. I don't know where these improvements lead. The loss of failing to take a step forward is not that one step, it is all the other steps forward you could have taken, beyond that point. Robin Hanson has a saying: The problem with slipping on the stairs is not falling the height of the first step, it is that falling one step leads to falling another step. In the same way, refusing to climb one step up forfeits not the height of that step but the height of the staircase.
"But again—that's just you. Not all of us are trying to push the art into uncharted territory."
My third reason is that once I realize I have been deceived, I can't just shut my eyes and pretend I haven't seen it. I have already taken that step forward; what use to deny it to myself? I couldn't believe in God if I tried, any more than I could believe the sky above me was green while looking straight at it. If you know everything you need to know in order to know that you are better off deceiving yourself, it's much too late to deceive yourself.
"But that realization is unusual; other people have an easier time of doublethink because they don't realize it's impossible. You go around trying to actively sponsor the collapse of doublethink. You, from a higher vantage point, may know enough to expect that this will make them unhappier. So is this out of a sadistic desire to hurt your readers, or what?"
Then I finally reply that my experience so far—even in this realm of merely human possibility—does seem to indicate that, once you sort yourself out a bit and you aren't doing quite so many other things wrong, striving for more rationality actually will make you better off. The long road leads out of the valley and higher than before, even in the human lands.
The more I know about some particular facet of the Art, the more I can see this is so. As I've previously remarked, my essays may be unreflective of what a true martial art of rationality would be like, because I have only focused on answering confusing questions—not fighting akrasia, coordinating groups, or being happy. In the field of answering confusing questions—the area where I have most intensely practiced the Art—it now seems massively obvious that anyone who thought they were better off "staying optimistic about solving the problem" would get stomped into the ground. By a casual student.
When it comes to keeping motivated, or being happy, I can't guarantee that someone who loses their illusions will be better off—because my knowledge of these facets of rationality is still crude. If these parts of the Art have been developed systematically, I do not know of it. But even here I have gone to some considerable pains to dispel half-rational half-mistaken ideas that could get in a beginner's way, like the idea that rationality opposes feeling, or the idea that rationality opposes value, or the idea that sophisticated thinkers should be angsty and cynical.
And if, as I hope, someone goes on to develop the art of fighting akrasia or achieving mental well-being as thoroughly as I have developed the art of answering impossible questions, I do fully expect that those who wrap themselves in their illusions will not begin to compete. Meanwhile—others may do better than I, if happiness is their dearest desire, for I myself have invested little effort here.
I find it hard to believe that the optimally motivated individual, the strongest entrepreneur a human being can become, is still wrapped up in a blanket of comforting overconfidence. I think they've probably thrown that blanket out the window and organized their mind a little differently. I find it hard to believe that the happiest we can possibly live, even in the realms of human possibility, involves a tiny awareness lurking in the corner of your mind that it's all a lie. I'd rather stake my hopes on neurofeedback or Zen meditation, though I've tried neither.
But it cannot be denied that this is a very real issue in very real life. Consider this pair of comments from Less Wrong:
And:
So—in practice, in real life, in sober fact—those first steps can, in fact, be painful. And then things can, in fact, get better. And there is, in fact, no guarantee that you'll end up higher than before. Even if in principle the path must go further, there is no guarantee that any given person will get that far.
If you don't prefer truth to happiness with false beliefs...
Well... and if you are not doing anything especially precarious or confusing... and if you are not buying lottery tickets... and if you're already signed up for cryonics, a sudden ultra-high-stakes confusing acid test of rationality that illustrates the Black Swan quality of trying to bet on ignorance in ignorance...
Then it's not guaranteed that taking all the incremental steps toward rationality that you can find, will leave you better off. But the vaguely plausible-sounding arguments against losing your illusions, generally do consider just one single step, without postulating any further steps, without suggesting any attempt to regain everything that was lost and go it one better. Even the surveys are comparing the average religious person to the average atheist, not the most advanced theologians to the most advanced rationalists.
But if you don't care about the truth—and you have nothing to protect—and you're not attracted to the thought of pushing your art as far as it can go—and your current life seems to be going fine—and you have a sense that your mental well-being depends on illusions you'd rather not think about—
Then you're probably not reading this. But if you are, then, I guess... well... (a) sign up for cryonics, and then (b) stop reading Less Wrong before your illusions collapse! RUN AWAY!