I've never really understood the "rationalists don't win" sentiment. The people I've met who have LW accounts have all seemed much more competent, fun, and agenty than all of my "normal" friends (most of whom are STEM students at a private university).
I should note that rationalists aren't really making new and innovative discoveries (the non-superstar ones anyway)
There have been plenty of Gwern-style research posts on LW, which is impressive given that writing research posts of that nature is quite time-consuming.
I went to an LW meetup once or twice. With one exception the people there seemed less competent and fun than my university friends, work colleagues, or extended family, though possibly more competent than my non-university friends.
Huh, lemme do it.
Schelling fence → bright-line rule
Semantic stopsign → thought-terminating cliché
Anti-inductiveness → reverse Tinkerbell effect
"0 and 1 are not probabilities" → Cromwell's rule
Tapping out → agreeing to disagree (which sometimes confuses LWers when they take the latter literally (see last paragraph of linked comment))
ETA (edited to add) → PS (post scriptum)
That's off the top of my head, but I think I've seen more.
Their approach reduces to an anti-epistemic affect-heuristic, using the ugh-field they self-generate in a reverse affective death spiral (loosely based on our memeplex) as a semantic stopsign, when in fact the Kolmogorov distance to bridge the terminological inferential gap is but an epsilon.
The big reason? Construal theory, or as I like to call it, action is not an abstraction. Abstract construal doesn't prime action; concrete construal does.
Second big reason: the affect (yes, I do mean affect) of being precise is very much negative. Focusing your attention on flaws and potential problems leads to pessimism, not optimism. But optimism is correlated with success; pessimism is not.
Sure, pessimism has some benefits in a technical career, in terms of being good at what you do. But it's in conflict with other things you need for a successf...
So, the truth value of "rationalists don't win" depends on your definition of "win".
Or the definition of rationalism. Maybe epistemic rationalism never had much to do with winning.
What is the observed and self-reported opinion on LW about "rationalists don't win"? Let's poll! Please consider the following statements (use your definition of 'win'):
I don't win: [pollid:1023]
Rationality (LW-style) doesn't help me win (by my definition of 'win'): [pollid:1024]
Rationality (LW-style) doesn't help people win (by my definition of 'win'): [pollid:1025]
I think rationalists on average don't win more than non-rationalists (by my definition of 'win'): [pollid:1026]
I think the public (as far as they are aware of the concept) thinks that ratio...
Side-stepping the issue of whether rationalists actually "win" or "do not win" in the real world, I think a priori there are some reasons to suspect that people who exhibit a high degree of rationality will not be among the most successful.
For example: people respond positively to confidence. When you make a sales pitch for your company/research project/whatever, people like to see that you really believe in the idea. Often, you will win brownie points if you believe in whatever you are trying to sell with nearly evangelical fervor...
Human beings are not very interested in truth in itself. They are mostly interested in it to the extent that it can accomplish other things.
Less Wrongers tend to be more interested in truth in itself, and to rationalize this as "useful" because being wrong about reality should lead you to fail to attain your goals.
But normal human beings are extremely good at compartmentalization. In other words they are extremely good at knowing when knowing the truth is going to be useful for their goals, and when it is not. This means that they are better than...
Let's say I wanted to solve my dating issues. I present the following approaches:
I endeavor to solve the general problem of human sexual attraction, plug myself into the parameters to figure out what I'd be most attracted to, determine the probabilities that individuals I'd be attracted to would also be attracted to me, then devise a strategy for finding someone with maximal compatibility.
I take an iterative approach: I devise a model this afternoon, test it this evening, then analyze the results tomorrow morning and make the necessary adjustments (see the sketch after this list).
W...
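Taking the second, iterative approach literally for a moment, here is a minimal sketch of that devise-test-adjust loop in Python. Everything in it (the parameter names, the scoring function, the numbers) is hypothetical; it only illustrates the shape of the iteration, not an actual method:

```python
import random

# Toy sketch of the iterative approach: devise a model, test it, adjust.
# All names, parameters, and the scoring function are made up; the point
# is the loop, not the model.

IDEAL = {"humor": 0.8, "directness": 0.5}  # hidden "truth" the tests probe

def run_evening_test(model):
    """Stand-in for testing the model in the field: higher is better."""
    error = sum((model[k] - IDEAL[k]) ** 2 for k in model)
    return 1.0 - error

def tweak(model, step=0.1):
    """Morning analysis: perturb one parameter and propose a new model."""
    new = dict(model)
    k = random.choice(list(new))
    new[k] += random.uniform(-step, step)
    return new

model = {"humor": 0.5, "directness": 0.5}  # this afternoon's first guess
score = run_evening_test(model)
for day in range(30):
    candidate = tweak(model)
    candidate_score = run_evening_test(candidate)
    if candidate_score > score:  # keep only adjustments that helped
        model, score = candidate, candidate_score

print(model, score)
```

The contrast with the first approach is the design choice: no attempt to solve the general problem up front, just cheap experiments and keeping whatever improves the observed result.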
"rationalists don't win"
Depends on what 'win' means. If (epistemic) rationality helps with a realistic view of the world, then it also means looking behind the socially constructed expectations of 'life success'. I think these are memes that our brains pattern-match to something more suitable in an ancestral environment. The hedonic treadmill and the Peter principle ensue. I think that a realistic view of the world has helped me evade these expectations and live an unusually fulfilling, interesting life. I'd call that a private success. Not a public one....
My life got worse after I found LessWrong, but I can't really attribute that to a causal relationship. I just don't belong in this world, I think.
I can imagine LW-style rationality being helpful if you're already far enough above baseline in enough areas that you would have been fairly close to winning regardless. (I am now imagining "baseline" as the surface of liquids in Sonic the Hedgehog 1-3. If I start having nightmares including the drowning music, ... I'll... ... have a more colorful way to describe despair to the internet, I guess.)
I agree. First off, I think it has a lot to do with a person's overall definition of 'win'. In the eyes of 'outside society' rationalists don't win. I believe that is because, as you said, if you look at things overall, you don't see an influx of success for the people who are rationalists. That isn't to say that they don't win, or that rationalism is pointless and doesn't offer up anything worthwhile. That would be a lie. I think that rationalists have a better grip on the workings of the world, and therefore know what to do and how to achieve success, ...
I am not so sure that rationalists don't win, but rather that "winning" (i.e. starting a company, being a celebrity, etc.) is rare and rationalists are rare, so even if each rationalist has a better chance of winning, very few of the people who win will be rationalists.
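To make that base-rate arithmetic concrete, here is a quick back-of-the-envelope calculation. All the numbers are made up for illustration; only the structure of the argument matters:

```python
# Suppose rationalists are 0.1% of the population but win 10x as often.
p_rationalist = 0.001           # assumed fraction of people who are rationalists
p_win_given_rationalist = 0.02  # assumed win rate for rationalists...
p_win_given_other = 0.002       # ...vs. everyone else (10x lower)

# Total probability of winning, across both groups.
p_win = (p_win_given_rationalist * p_rationalist
         + p_win_given_other * (1 - p_rationalist))

# Bayes' rule: what fraction of winners are rationalists?
p_rationalist_given_win = p_win_given_rationalist * p_rationalist / p_win
print(f"{p_rationalist_given_win:.1%} of winners are rationalists")
# ~1.0% -- so about 99% of the winners you hear about are not rationalists,
# even though each individual rationalist is 10x more likely to win.
```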
So you say altruism is something to "assume that we mostly agree on, and thus not really elaborate" and I know the sentiment is sometimes that it's like jazz and pornography, but fwiw I'd be curious about an elaboration. I don't think that particular prejudice is a big part of rationalist failures, but raising the possibility of it being a part is interesting to me.
What you less wrong folks call "rationality" is not what everyone else calls "rationality" - you can't say "I also think that rationality is doing a great job in helping people", that either doesn't make sense or is a tautology, depending on your interpretation. Please stop saying "rationality" and meaning your own in-group thing, it's ridiculously offputting.
Also, my experience has been that CFAR-trained folks do sit down and do hard things, and that people who are only familiar with LW just don't. It has also been ...
Here are my thoughts on the "Why don't rationalists win?" thing.
Epistemic
I think it's pretty clear that rationality helps people do a better job of being... less wrong :D
But seriously, I think that rationality does lead to very notable improvements in your ability to have correct beliefs about how the world works. And it helps you to calibrate your confidence. These abilities are useful. And I think rationality deserves credit for being useful in this area.
I'm not really elaborating here because I assume that this is something that we agree on.
However, I should note that rationalists aren't really making new and innovative discoveries (the non-superstar ones anyway), and that this may add to the "why don't rationalists win?" thing. I think that a big reason for this lack of progress is that a) we think about really, really, really difficult things! And b) we beat around the bush a lot. Big topics are often brought up, but I rarely see people say, "Ok, this is a huge topic, so in order to make progress we're going to have to sit down for many hours and be deliberate about this. But I think we could do it!". Instead, these conversations seem to be just people having fun, procrastinating, and never investing enough time to make real progress.
Altruistic
I also think that rationality is doing a great job in helping people to do a better job at being altruistic. Another thing that:
For people with altruistic goals, rationality is helping them to achieve their goals. And I think it's doing a really good job at this. But I also think that it doesn't quite feel like the gains being made here are so big. I think that a major reason for this is because the gains are so:
But we all know that (1), (2), and (3) don't actually make the gains smaller, it just makes them feel smaller. I get the impression that the fact that the gains feel smaller results in an unjustified increase in the "rationalists don't win" feeling.
Success
I get the impression that lack of success plays a big role in the "why don't rationalists win?" thing.
I guess an operational definition of success for this section could be "professional, financial, personal goals, being awesome...".
I don't know much about this, but I would think and hope that rationality helps people to be notably more successful than they otherwise would be. I don't think rationality is at the point yet where it could make everyone millionaires (metaphorically and/or literally). But I think that a) it could get there, and b) we shouldn't trivialize the fact that it does (I'm assuming) make people notably more successful than they otherwise would be.
But still, I think that there are a lot of other factors that determine success, and given the difficulty/rarity of those factors, even with rationality in your toolbox, you won't achieve that much success without them.
Happiness
I get the impression that lack of happiness plays a big role in the "why don't rationalists win?" thing.
Luke talked about the correlates of happiness in How to Be Happy:
One thing I want to note is that genetics seem to play a huge role, and that plus the HORRIBLE hedonic adaptation thing makes me think that we don't actually have that much control over our happiness.
Moving forward... and this is what motivated me to write this article... the big determinants of happiness seem like things that are sort of outside rationality's sphere of influence. I don't believe that, and it kills me to say it, but I thought it'd make more sense to say it first and then amend it (a writing technique I'm playing around with and am optimistic about). What I really believe is:
Social
Socially, LessWrong seems to be a rather large success to me. My understanding is that it started off with Eliezer and Robin just blogging... and now there are thousands of people having meet-ups across the globe. That amazes me. I can't think of any examples of something similar.
Furthermore, the social connections LW has helped create seem pretty valuable to me. There seem to be a lot of us who are incredibly unsatisfied with normal social interaction, or sometimes just plain old don't fit in. But LW has brought us together, and that seems incredible and very valuable to me. So it's not just "it helps you meet some cool people". It's "it's taken people who were previously empty, and has made them fulfilled".
Still though, I think there's a lot more that could be done. Rationalist dating website?* Rationalist pen pals (something that encourages the development of deeper 1-on-1 relationships)? A more general place that "encourages people to let their guard down and confide in each other"? Personal mentorship? This is venturing into a different area, but perhaps there could be some sort of professional networking?
*As someone who constantly thinks about startups, I'm liking the idea of "dating website for social group X that has a hard time relating to the rest of society". It could start off with a single X, and expand, and the parent business could run all of it.
Failure?
So, are we a failure? Is everything moot because "rationalists don't win"?
I don't think so. I think that rationality has had a lot of impressive successes so far. And I think that it has
A LOT
of potential (did I forget any other indicators of visual weight there? it wouldn't let me add color). But it certainly hasn't made us super humans. I get an impression that because rationality has so much promise, we hold it to a crazy high standard and sometimes lose sight of the great things it provides. And then there's also the fact that it's only, what, a few decades old?
(Sorry for the bits of straw manning throughout the post. I do think that it led to more effective communication at times, but I also don't think it was optimal by any means.)