Here are my thoughts on the "Why don't rationalists win?" thing.

Epistemic

I think it's pretty clear that rationality helps people do a better job of being... less wrong :D

But seriously, I think that rationality does lead to very notable improvements in your ability to have correct beliefs about how the world works. And it helps you to calibrate your confidence. These abilities are useful. And I think rationality deserves credit for being useful in this area.

I'm not really elaborating here because I assume that this is something that we agree on.

However, I should note that rationalists aren't really making new and innovative discoveries (the non-superstar ones anyway), and that this may feed the "why don't rationalists win?" sentiment. I think there are two big reasons for this lack of progress: a) we think about really, really, really difficult things, and b) we beat around the bush a lot. Big topics are often brought up, but I rarely see people say, "OK, this is a huge topic, so in order to make progress we're going to have to sit down for many hours and be deliberate about this. But I think we could do it!" Instead, these conversations seem to be just people having fun, procrastinating, and never investing enough time to make real progress.

Altruistic

I also think that rationality does a great job of helping people be more effectively altruistic. This is another thing that:

  • I'm going to assume that we mostly agree on, and thus not really elaborate.
  • I think deserves to be noted and given credit.
  • Is useful.

For people with altruistic goals, rationality is helping them achieve those goals, and I think it's doing a really good job of it. But I also think the gains being made here don't feel very big. A major reason for this is that the gains are:

  1. High level.
  2. Likely to be realized far in the future.
  3. The sort of thing you don't personally experience (think: buying a poor person lunch vs. donating money to people in Africa).

But we all know that (1), (2), and (3) don't actually make the gains smaller; they just make them feel smaller. I get the impression that this feeling results in an unjustified increase in the "rationalists don't win" sentiment.

Success

I get the impression that lack of success plays a big role in the "why don't rationalists win?" thing.

I guess an operational definition of success for this section could be "professional, financial, personal goals, being awesome...". 

I don't know much about this, but I would think and hope that rationality helps people to be notably more successful than they otherwise would be. I don't think rationality is at the point yet where it could make everyone millionaires (metaphorically and/or literally). But I think that a) it could get there, and b) we shouldn't trivialize the fact that it does (I'm assuming) make people notably more successful than they otherwise would be.

But still, I think that there are a lot of other factors that determine success, and given their difficulty/rarity, even with rationality in your toolbox, you won't achieve that much success without the following things.

  1. Plain old hard work. I'm a huge believer in working smart, but I also think that past a fairly low, sufficient level of smartness in your work, it's mostly a matter of how hard you work. You may ask, "Take someone who studies really hard but is lacking big time when it comes to rationality - wouldn't they fail to be successful?" I think an important (and sad) point is that at this point in history, you can be very successful with domain-specific knowledge but no rationality. And so people who work really hard but don't have an ounce of rationality often end up being very good at what they do, and very successful. I think we'll eventually reach a point where things progress enough that rationality does in fact become necessary (and the people with domain-specific knowledge but no rationality will fail).
  2. Aptitude/starting early. I'm not sure to what extent aptitude is actually a thing; I sense that a big part of it is simply how early you started, back when your brain was at that "sponge stage". Regardless, aptitude/starting early seems to be pretty important. Someone who works hard but started too late will certainly be at a disadvantage.
  3. Opportunity. In one sense, not much will help you if you have to work three jobs to survive (you won't have much time for self-improvement or other necessary investments of time). In another sense, there's the idea that "you are who you surround yourself with": people who are fortunate enough to grow up around other smart and hard-working people will have had the opportunity to be socially pressured into doing the same. I think this is very underrated, but also very surmountable. And in yet another sense, some people are extremely fortunate and are born into a situation with a lot of money and connections.
  4. Ambition/confidence. Example: imagine a web developer who has rationality + (1) + (2) + (3) but doesn't have (4). He'll probably end up being a good web developer, but he might not end up being a great one, because he might not have the ambition or confidence to think to pursue certain skills. He may think, "That stuff is for truly smart people; I'm just not one of those people." He may not have the confidence to pursue the more general and wide-ranging goal of being a great software engineer. He may not have the confidence to learn C and other lower-level material. Note that there's a difference between not having the confidence to try, and not having the confidence to even think to try. I think the latter is a lot more common, and it blends into ambition territory. On that note, this hypothetical person may never think to pursue innovative ideas, get into UX, or start a startup and do something bigger.

My point in this section is that rationality can help with success, but 1-4 are also extremely important, and they probably act as the limiting factor for most of us (I'd guess that most people here are rational enough that 1-4, rather than rationality, act as the barrier to their success, and marginal increases in rationality probably won't have too big a marginal impact).

(I also bet that 1-4 is insufficient and that there are important things I'm missing.)

Happiness

I get the impression that lack of happiness plays a big role in the "why don't rationalists win?" thing.

Luke talked about the correlates of happiness in How to Be Happy:

Factors that don't correlate much with happiness include: age, gender, parenthood, intelligence, physical attractiveness, and money (as long as you're above the poverty line). Factors that correlate moderately with happiness include: health, social activity, and religiosity. Factors that correlate strongly with happiness include: genetics, love and relationship satisfaction, and work satisfaction.

One thing I want to note is that genetics seem to play a huge role, and that plus the HORRIBLE hedonic adaptation thing makes me think that we don't actually have that much control over our happiness.

Moving forward... and this is what motivated me to write this article... the big determinants of happiness seem like things that are sort of outside rationality's sphere of influence. I don't actually believe that, and it kills me to say it, but I thought it'd make more sense to say it first and then amend it (a writing technique I'm playing around with and am optimistic about). What I really believe is:

  • Things like social and romantic relationships are tremendously important factors in one's happiness. So is work satisfaction (in brief: autonomy, mastery and purpose).
  • These are things that you can certainly get without rationality. Non-rationalists have set a somewhat high bar for us to beat.
  • Rationality certainly COULD do wonders in this area.
  • But the art hasn't progressed to that point yet, and getting there will be difficult. People have been trying to figure out the secrets of happiness for thousands of years, and though I think we've made some progress, we still have a long way to go.
  • Currently, I'm afraid that rationality might be acting as a memetic immune disorder. There's a huge focus on our flaws and how to mitigate them, which leads to a lot of mental energy being spent thinking about "bad" things. I think (though I don't know where the sources are) that a positive/optimistic outlook plays a huge role in happiness: "focusing on the good." Rationality seems to focus a lot on "the bad". It also seems to make people feel unproductive and wrong for not spending enough time focusing on and fixing this "bad", and I fear that this is overblown and leads to unnecessary unhappiness. At the same time, focusing on "the bad" is important: if you want to fix something, you have to spend a lot of time thinking about it. Personally, I struggle with this, and I'm not sure where the equilibrium point really is.

Social

Socially, LessWrong seems to be a rather large success to me. My understanding is that it started off with Eliezer and Robin just blogging... and now there are thousands of people having meet-ups across the globe. That amazes me. I can't think of any examples of something similar.

Furthermore, the social connections LW has helped create seem pretty valuable to me. There seem to be a lot of us who are incredibly unsatisfied with normal social interaction, or sometimes just plain old don't fit in. But LW has brought us together, and that seems incredible and very valuable to me. So it's not just "it helps you meet some cool people". It's "it's taken people who were previously empty, and has made them fulfilled".

Still though, I think there's a lot more that could be done. Rationalist dating website?* Rationalist pen pals (something that encourages the development of deeper 1-on-1 relationships)? A more general place that "encourages people to let their guard down and confide in each other"? Personal mentorship? This is venturing into a different area, but perhaps there could be some sort of professional networking?

*As someone who constantly thinks about startups, I'm liking the idea of "dating website for social group X that has a hard time relating to the rest of society". It could start off with X = 1, and expand, and the parent business could run all of it.

Failure?

So, are we a failure? Is everything moot because "rationalists don't win"?

I don't think so. I think that rationality has had a lot of impressive successes so far. And I think that it has

A LOT

of potential (did I forget any other indicators of visual weight there? it wouldn't let me add color). But it certainly hasn't made us superhumans. I get the impression that because rationality has so much promise, we hold it to a crazy high standard and sometimes lose sight of the great things it provides. And then there's also the fact that it's only, what, a few decades old?


(Sorry for the bits of straw-manning throughout the post. I do think it led to more effective communication at times, but I also don't think it was optimal by any means.)

Comments (118)

I've never really understood the "rationalists don't win" sentiment. The people I've met who have LW accounts have all seemed much more competent, fun, and agenty than all of my "normal" friends (most of whom are STEM students at a private university).

I should note that rationalists aren't really making new and innovative discoveries (the non-superstar ones anyway)

There have been plenty of Gwern-style research posts on LW, especially given that writing research posts of that nature is quite time-consuming.

I went to an LW meetup once or twice. With one exception the people there seemed less competent and fun than my university friends, work colleagues, or extended family, though possibly more competent than my non-university friends.

the gears to ascension (9y):
That was also true for me until I moved to the bay. I suspect it simply doesn't move the needle much, and it's just a question of who it attracts.
drethelin (9y):
I have the opposite experience! Most people at LW meetups I've been to have tended to be successful programmers, or people with (or working on) math PhDs. Generally more socially awkward, but that's not a great proxy for "competence" in this kind of crowd.
[anonymous] (9y):
Do you think this was caused by their rationality? It seems more likely to me that these people are drawn to rationality because it validates how they already think.
the gears to ascension (9y):
What you just said doesn't make sense. "Rationality", as formally defined by this community, refers to "doing well" (which I contest, but whatever); therefore, the question is not "was it caused by their rationality", but "was it caused by a lack of rationality", or perhaps "was their lack of rationality caused by using LW techniques?".
[anonymous] (9y):
Defining rationality as winning is useless in most discussions. Obviously what I was referring to is rationality as defined by the community, e.g. "extreme epistemic rationality".
TheAncientGeek (9y):
The community defines rationality as epistemic rationality AND as winning, and not noticing the difference between the two leads to the idea that rationalists ought to win at everything... that the winningness of instrumental rationality and the universality of ER can be combined.
TheAncientGeek (9y):
There are plenty of researchers who have never heard of LessWrong, but who manage to produce good work in fields LessWrong respects. So have they (A) overcome their lack of rationality, or (B) learnt rationality somewhere else? And, if the answer is (B), what is LW adding to rationality? The assumption that rationality will make you good at a range of things that aren't academic or technical?
[anonymous] (9y):
The answer is, of course, (B), but what LW adds is a common vocabulary and a massive compilation of material in one place. Most people who learn how to think from disparate sources have a hard time codifying what they understand or teaching it to others. Vocabulary and discourse help immensely with that. So, for instance, I can neatly tell people, "Reversed stupidity is not intelligence!", and thus save myself incredible amounts of explanation about how real life issues are searches through spaces for tiny sub-spaces encoding solutions to your problem, and thus "reversing" some particularly bad solution hasn't done any substantial work locating the sub-space I actually wanted.
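
As an illustrative aside, the search-space point can be made numerically. This is a toy sketch, not anything from the comment; the space size and sub-space size are arbitrary assumptions:

```python
# Toy model: a real-life problem as a search over a huge space of
# candidate solutions for a tiny sub-space of good ones.
space = 2 ** 40    # all candidate solutions (arbitrary assumption)
good = 1_000       # the tiny sub-space we actually want (assumption)

# "Reversing" one particularly bad solution rules out a single point.
remaining = space - 1

# The search ahead is effectively unchanged:
print(remaining / space)   # ~0.999999999999
print(good / remaining)    # ~9.1e-10, the same odds as before
```

Ruling out one bad point (or even millions of them) barely narrows the search; only evidence that concentrates probability on the target sub-space does substantial work.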
TheAncientGeek (9y):
It only creates a common vocabulary amongst a subculture. LW vocabulary relabels a lot of traditional rationality terms.
Vaniver (9y):
Has anyone put together a translation dictionary? Because it seems to me that most of the terms are the same, and yet it is common to claim that relabeling is common without any sort of quantitative comparison.

Huh, lemme do it.

Schelling fence → bright-line rule

Semantic stopsign → thought-terminating cliché

Anti-inductiveness → reverse Tinkerbell effect

"0 and 1 are not probabilities" → Cromwell's rule

Tapping out → agreeing to disagree (which sometimes confuses LWers when they take the latter literally (see last paragraph of linked comment))

ETA (edited to add) → PS (post scriptum)

That's off the top of my head, but I think I've seen more.

ScottL (9y):
Thanks for this. Let me know if you have any others and I will add them to this wiki page I created: Less Wrong Canon on Rationality. Here are some more that I already had:

  • Fallacy of gray → Continuum fallacy
  • Motivated skepticism → disconfirmation bias
  • Marginally zero-sum game → arms race
[anonymous] (9y):
Funging Against → Considering the alternative
Akrasia → Procrastination/Resistance
Belief in Belief → Self-Deception
Ugh Field → Aversion to (I had a better fit for this but I can't think of it now)
Vaniver (9y):
Thanks for the list! I am amused by this section of Anti-Inductiveness in this context, though:
TheAncientGeek (9y):
Instrumental/terminal → hypothetical/categorical
Rationalist taboo → unpacking
[anonymous] (8y):
Instrumental and terminal are pretty common terms. I've seen them in philosophy and business classes.
Viliam (9y):
It was many times debated on LW whether LW needlessly invents new words for already existing terms, or whether the new words label things that are not considered elsewhere. I don't remember the outcomes of those debates. It seems to me they usually went like this:

"LW invents new words for many things that already have standard names."
"Can you give me five examples?"
"What LW calls X is called Y everywhere else." (provides only one example)
"Actually X is not the same concept as Y."
"Yes it is."
"It is not."
...

So I guess at the end both sides believe they have won the debate.
Jiro (8y):
I just ran into this one because it was used in a reddit thread: in this post Eliezer uses the term "catgirl" to mean a non-sentient sexbot. While that isn't a traditional rationality term, I think it fits the spirit of the question (and predictably, many people in the Reddit thread responded using the normal meaning of "catgirl" rather than Eliezer's).
satt (9y):
Previously.
btrettel (9y):
RationalWiki discusses a few: In my view, RationalWiki cherry picks certain LessWrongers to bolster their case. You can't really conclude that these people represent LessWrong as a whole. You can find plenty of discussion of the terminology issue here, for example, and the way RationalWiki presents things makes it sound like LessWrongers are ignorant. I find this sort of misrepresentation to be common at RationalWiki, unfortunately.

Their approach reduces to an anti-epistemic affect-heuristic, using the ugh-field they self-generate in a reverse affective death spiral (loosely based on our memeplex) as a semantic stopsign, when in fact the Kolmogorov distance to bridge the terminological inferential gap is but an epsilon.

Good_Burning_Plastic (9y):
You know you've been reading Less Wrong too long when you only have to read that comment twice to understand it.
XFrequentist (9y):
I got waaay too far into this before I realized what you were doing... so well done!
Kawoomba (9y):
What are you talking about?
nyralech (9y):
I'm afraid I don't know what you mean by Kolmogorov distance.
[anonymous] (9y):
Well yes. And I fully support LW moving towards more ordinary terminology. But it's still good to have someone compiling it all together.
Adam Zerner (9y):
I feel rather confident in saying that it's (A). I think that domain specific knowledge without rationality can actually lead to a lot of success.
TheAncientGeek (9y):
You actually think someone irrational can do maths, science or engineering?
Adam Zerner (9y):
Yes, absolutely. Aren't there a bunch of examples of STEM PhDs having crazy beliefs like "evolution isn't real"? Admittedly that example is about a context outside the laboratory, but I also think that people can be irrational inside the laboratory and still be successful. E.g. they might do the right things for the wrong reasons ("this is just the way you're supposed to do it"). That approach might lead to success a lot of the time, but it isn't a true model of how the world works. (Ultimately, the point I'm making is essentially the same as that Outside The Laboratory post.)
TheAncientGeek (9y):
Not a very big bunch. Since (instrumental) rationality is winning, and these people are winning within their own domains, they are being instrumental rationalists within them. So the complaint that they are not rational enough amounts to the complaint that they are not epistemic rationalists, i.e. they don't care enough about truth outside their domains. But why should they? Epistemic rationality doesn't deliver the goodies, in terms of winning... that's the message of your OP, or it would be if you distinguished ER and IR.

Those who value truth for its own sake, the Lovers of Wisdom, will want to become epistemic rationalists, and may well acquire the skills without MIRI or CFAR's help, since epistemic rationality is not a new invention. The rest have the problem that they are not motivated, not the problem that they are not skilled (or, rather, that they lack the skills needed to do things they are not motivated to do).

I can see the attraction in "raising the rationality waterline", since people aren't completely pigeonholed into domains, and do make decisions about wider issues, particularly when they vote. MIRI conceives of raising the rationality waterline in terms of teaching skills, but if it amounts to supplementing IR with ER, and it seems that it does, then you are not going to do it without making it attractive. If you merge ER and IR, then it looks like raising the waterline could lead to enhanced winning, but that expectation just leads to the disappointment you and others have expressed. That cycle will continue until "rationalists" realise rationality is more than one thing.
[anonymous] (9y):
Or, you could just assume that it wouldn't make sense for Adam Zerner to define winning as failing, so he was referring to rationality as the set of skills that LW teaches.
TheAncientGeek (9y):
It's not like winning <--> losing is the only relevant axis.
[anonymous] (9y):
It's definitely relevant to this reasoning: "Since (instrumental) rationality is winning, and these people are winning within their own domains, they are being instrumental rationalists within them." Just assume Adam Zerner wasn't talking about instrumental rationality in this case.
[anonymous] (9y):
Heck, I'll bite the bullet and say that applied science, and perhaps engineering, owe more to 'irrational' people. (Not a smooth bullet, to be sure.) My grandfather worked with turbines. He viewed the world in a very holistic manner, one I can't reconcile with rationalism no matter how hard I try. He was an atheist who had no problem with his children being Christians (that I know of). His library was his pride, yet he had made no provisions for its fate after his death. He quit smoking after his first heart attack, but went on drinking. He thought electrons' orbits were circular, and he repaired circuitry often. He preferred to read a book during dinners rather than listen to us talk or teach us table manners. And from what I heard, he was not half bad at engineering.

The big reason? Construal theory, or as I like to call it, action is not an abstraction. Abstract construal doesn't prime action; concrete construal does.

Second big reason: the affect (yes, I do mean affect) of being precise is very much negative. Focusing your attention on flaws and potential problems leads to pessimism, not optimism. But optimism is correlated with success; pessimism is not.

Sure, pessimism has some benefits in a technical career, in terms of being good at what you do. But it's in conflict with other things you need for a successf... (read more)

So, the truth value of "rationalists don't win" depends on your definition of "win"

Or the definition of rationalism. Maybe epistemic rationalism never had much to do with winning.

ragintumbleweed (7y):
Epistemic rationality isn't about winning? Demonstrated, context-appropriate epistemic rationality is incredibly valuable and should lead to higher status and -- to the extent that I understand Less Wrong jargon -- "winning."

Think about markets: if you have accurate and non-consensus opinions about the values of assets or asset classes, you should be able to acquire great wealth. In that vein, there are plenty of rationalists who apply epistemic rationality to market opinions and do very well for themselves. Think Charlie Munger, Warren Buffett, Bill Gates, Peter Thiel, or Jeff Bezos. Winning!

If you know better than most who will win NBA games, you can make money betting on the games. E.g., Haralabos Voulgaris. Winning! Know what health trends, diet trends, and exercise trends improve your chances for a longer life? Winning! If you have an accurate and well-honed understanding of what pleases the crowd at Less Wrong, and you can articulate those points well, you'll get karma points and higher status in the community. Winning!

Economic markets, betting markets, health, and certain status competitions are all contexts where epistemic rationality is potentially valuable. Occasionally, however, epistemic rationality can be demonstrated in ways that are context-inappropriate -- and thus lead to lower status. Not winning! For example, if you correct someone's grammar the first time you meet him or her at a cocktail party. Not winning! Demonstrate that your boss is dead wrong in front of a group of peers in a way that embarrasses her? Not winning! Constantly argue about LW-type topics with people who don't like to argue? Not winning!

Epistemic rationality is a tool. It gives you power to do things you couldn't do otherwise. But status games require a deft understanding of when it is appropriate and when it is not appropriate to demonstrate the greater coherence of one's beliefs to reality to others (which itself strikes me as a form of epistemic rationality of soc
Lumifer (7y):
Well, technically speaking, it isn't. It is the propensity to select courses of action which will most likely lead to the outcomes you prefer. Correcting grammar on the first date is not a misapplication of epistemic rationality; it just is NOT epistemically rational (assuming reasonable context, e.g. you are not deliberately negging and you are less interested in grammar than in this particular boy/girl). Epistemic rationality doesn't save you from having bad goals. Or inconsistent ones.

ETA: Ah, sorry. I had a brain fart and was writing "epistemic rationality" while meaning "instrumental rationality". So, er, um, disregard.
jwoodward48 (7y):
(I recognize that you meant instrumental rationality rather than epistemic rationality, and have read the comment with that in mind.) Epistemic rationality is not equivalent to "being a Spockish asshole." It simply means that one values rationality as an end and not just a means. If you do not value correcting people's grammar for its own sake, then there is no reason to correct someone's grammar. But that is an instrumental statement, so I suppose I should step back... If you think that epistemic and instrumental rationality would disagree at certain points, try to reconsider their relationship. Any statement of "this ought to be done" is instrumental. Epistemic only covers "this is true/false."
Lumifer (7y):
Yes, of course. Notably, epistemic rationality only requires you to look for and to prefer truth. It does not require you to shove the truth you found into everyone else's face. One can find edge cases, but generally speaking if you treat epistemic rationality narrowly (see above) I would expect such a disagreement to arise very rarely. On the other hand there are, as usual, complications :-/ For example, you might not go find the truth because doing this requires resources (e.g. time) and you feel these resources would be better spent elsewhere. Or if you think you have difficulties controlling your mind (see the rider and the elephant metaphor) you might find useful some tricks which involve deliberate denial of some information to yourself.
TheAncientGeek (7y):
So how does it differ from instrumental rationality?
Lumifer (7y):
See ETA to the comment.
Elo (7y):
I think this is a bad example. The example seems like an instrumental example. Epistemic alone would have you correct the grammar because that's good epistemics. Instrumental would have you bend the rules for the other goals you have on the pathway to winning.
jwoodward48 (7y):
"See ETA to the comment." Lumifer meant instrumental rationality.
Elo (7y):
Comment was before his ETA. Ta.
jwoodward48 (7y):
Hmm? Ah, I see; you think that I am annoyed. No, I only quoted Lumifer because their words nearly sufficed. Rest assured that I do not blame you for lacking the ability to gather information from the future.
hairyfigment (7y):
How could correcting grammar be good epistemics? The only question of fact there is a practical one - how various people will react to the grammar coming out of your word-hole.
TheAncientGeek (7y):
Epistemic rationality isn't about winning? Valuable to whom? Value and status aren't universal constants. You are pretty much saying that the knowledge can sometimes be instrumentally useful. But that does not show epistemic rationality is about winning.

The standard way to show that instrumental and epistemic rationality are not the same is to put forward a society where almost everyone holds to some delusory belief, such as a belief in Offler the Crocodile God, and awards status in return for devotion. In that circumstance, the instrumental rationalist will profess the false belief, and the epistemic rationalist will stick to the truth.

In a society that rewards the pursuit of knowledge for its own sake (which ours does sometimes), the epistemic rationalist will get rewards, but won't be pursuing knowledge in order to get rewards. If they stop getting the rewards they will still pursue knowledge... it is a terminal goal for them. That is the sense in which ER is not "about" winning and IR is. ER is defined in terms of goals. The knowledge gained by it may be instrumentally useful, but that is not the central point.
ragintumbleweed (7y):
What I'm saying is that, all things being equal, individuals, firms, and governments with high ER will outperform those with lower ER. That strikes me as both important and central to why ER matters. You seem to be saying that high ER, having beliefs that correspond to reality, is valuable for its own sake. That Truth matters for its own sake. I agree, but that's not the only reason it's valuable.

In your society with Offler the Crocodile God, yes, irrational behavior will be rewarded. But the society where devotion to Offler is rewarded over engineering prowess will have dilapidated bridges or no bridges at all. Even in the Offler society, medicine based on science will save more lives than medicine based on Offler's teachings. The doctors might be killed by the high priests of Offler for practicing that way, but it's still a better way to practice medicine. Those irrational beliefs may be rewarded for some short term, but they will make everyone's life worse off as a result. (Perhaps in the land of Offler's high priests, clandestine ER is the wisest approach.)

If the neighboring society of Rational-landia builds better bridges, has better medical practices, and creates better weapons with sophisticated knowledge of projectile physics, it will probably overtake and conquer Offler's people. In North Korea today, the best way to survive might be to pledge complete loyalty to the supreme leader. But the total lack of ER in the public sphere has set it back centuries in human progress.

NASA wasn't just trying to figure out rocket science for its own sake in the 1960s. It was trying to get to the moon. If the terminal goal is to live the best possible life ("winning"), then pursuing ER will be incredibly beneficial in achieving that aim. But ER does not obligate those who seek it to make it their terminal goal.
TheAncientGeek (7y):
That is probably true, but not equivalent to your original point. I am not saying it is objectively valuable for its own sake. I am saying an epistemic rationalist is defined as someone who terminally, i.e. for its own sake, values knowledge, although that is ultimately a subjective evaluation. It's defined that way!!!!!
ragintumbleweed (7y):
Forgive me, as I am brand new to LW. Where is it defined that an epistemic rationalist can't seek epistemic rationality as a means of living a good life (or for some other reason) rather than as a terminal goal? Is there an Académie française of rationalists that takes away your card if you use ER as a means to an end?

I'm working off this quote from EY as my definition of ER. This definition seems silent on the means-end question. It is agnostic on motivations for seeking rationality. Epistemic rationality is just seeking truth. You can do this because you want to get rich or get laid or get status or go to the moon or establish a better government or business.

People's motivations for doing what they do are complex. Try as I might, I don't think I'll ever fully understand why my primate brain does what it does. And I don't think anyone's primate brain is seeking truth for its own sake and for no other reasons.

Also, arguing about definitions is the least useful form of philosophy, so if that's the direction we're going, I'm tapping out. But I will say that if the only people the Académie française of rationalists deems worthy of calling themselves epistemic rationalists are those with pure, untainted motivations of seeking truth for its own sake and for no other reasons, then I suspect that the class of epistemic rationalists is an empty set.

[And yes, I understand that instrumentality is about the actions you choose. But my point is about motivations, not actions.]
TheAncientGeek (7y):
From the wiki:
Elo (7y):
ER vs IR. I am not sure what your question is. I think of ER as sharpening the axe: not sure how many trees I will cut down or when, but with a sharp axe I will cut them down swiftly and with ease. I think of IR as actually getting down to swinging the axe. Both are needed. ER is a good terminal goal because it enables the other goals to happen more freely. Even if you don't know the other goals, having a sharper axe helps you be prepared to cut the tree when you find it.
satt (9y):
Upvoted, but I want to throw in the caveat that some baseline level of epistemic rationalism is very useful for winning. Schizophrenics tend to have a harder time of things than non-schizophrenics.
27chaos (9y):
That is a limitation of looking at this community specifically, but the general sense of the question can also be approached by looking at communities for specific activities that have strong norms of rationality.

I think most of the time rationality is not helpful for applied goals because doing something well usually requires domain-specific knowledge that's acquired through experience, and yet experience alone is almost always sufficient for success. In cases where the advice of rationality and experience conflict, oftentimes experience wins even if it should not, because the surrounding social context is built by and for the irrational majority. If you make the same mistake everyone else makes you are in little danger, but if you make a unique mistake you are in trouble.

Rationality is most useful when you're trying to find truths that no one else has found before. Unfortunately, this is extremely difficult to do even with ideal reasoning processes. Rationality does offer some marginal advantage in truth seeking, but because useful novel truths are so rare, most often the costs outweigh the benefits. Once a good idea is discovered, oftentimes irrational people are simply able to copy whoever invented the idea, without having to bear all the risk involved with the process of the idea's creation.

And then, when you consider that perfect rationality is beyond mortal reach, the situation begins to look even worse. You need a strategy that lets you make better use of truth than other people can, in addition to the ability to find truth more easily, if you want to have a decent chance to translate skill in rationality into life victories.
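
As an illustrative aside, the cost-benefit claim can be made concrete with a toy expected-value sketch. All numbers below are arbitrary assumptions, not measurements:

```python
# Toy expected-value comparison: seeking novel truths vs. copying
# already-proven ideas.
p_novel = 0.001   # assumed chance one attempt yields a useful novel truth
v_novel = 1_000   # assumed payoff if it does
c_search = 5      # assumed cost per attempt (time, risk, reputation)

ev_search = p_novel * v_novel - c_search        # 1.0 - 5 = -4.0
# Even doubling the hit rate (a marginal rationality advantage):
ev_better = 2 * p_novel * v_novel - c_search    # 2.0 - 5 = -3.0

# Copying a proven idea: low cost, reliable payoff (assumed values).
ev_copy = 50 - 1                                # 49

print(ev_search, ev_better, ev_copy)
```

Under these made-up numbers, a real marginal advantage in truth-seeking doesn't flip the sign of the expected value, while copying stays safely positive, which is the comment's point about copying whoever invented the idea.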
[anonymous] (9y):
What is "rationality" even supposed to be if not codified and generalized experience?
entirelyuseless (9y):
Yes. This is much like I said in my comment: people from Less Wrong are simply much more interested in truth in itself, and as you say here, there is little reason to expect this to make them more effective in attaining other goals.

What is the observed and self-reported opinion on LW about "rationalists don't win"? Let's poll! Please consider the following statements (use your definition of 'win'):

I don't win: [pollid:1023]

Rationality (LW-style) doesn't help me win (by my definition of 'win'): [pollid:1024]

Rationality (LW-style) doesn't help people win (by my definition of 'win'): [pollid:1025]

I think rationalists on average don't win more than non-rationalists on average (by my definition of 'win'): [pollid:1026]

I think the public (as far as they are aware of the concept) thinks that ratio... (read more)

Good_Burning_Plastic (9y):
I wish more surveys had such clarifications. I can never tell whether "Strongly disagree" with "X should be more than Y" means "I strongly believe X should be less than Y", "I strongly believe X should be about the same as Y", or "I strongly believe it doesn't matter whether X is more or less than Y" (and, as a result, what I should pick if my opinion is one of the latter two).
EngineerofScience (9y):
I'm not sure how effective this is, considering most people who would see this are rationalists, and people like to think well of themselves.
[anonymous] (9y):
You know, it's hard for me to simultaneously think of someone as winning and not a rationalist, not to mention always correcting the result by 'my definition'. I could say that confirmation bias is at fault, but really... shouldn't we just dissolve the questions? :) I mean, suppose I do know a person who has trouble letting sunk costs go, has probably firmly forgotten about Bayes' theorem, and uses arguments as soldiers on occasion... But she is far more active than me; she keeps trying out new things, seeking out jobs even beyond her experience, etc. Should I consider her rational? I don't know. Brave, yes. Rather smart, yes. Winning, often. But rational?
Viliam (9y):
Winning a lottery? (Generalize it to include genetic lottery etc.)
lmm (9y):
Many stories I've seen of lottery winners have them losing the money quickly through bad investments and/or developing major life issues (divorce, drug addiction).
WalterL (8y):
I think there's an element of rubbernecking there. The general feeling of the mob is that lottery = tax on stupidity. We are smart to not play the lottery. Story of a winner challenges general feeling, mob feels dumb for not buying winning ticket. Unmet need exists for story to make mob happy again. General form of story is that lottery money is evil money. Lottery winners, far from being better than you, dear reader, are actually worse! They get divorced, they squander the money! Lawsuits!! No one wants to read about the guy who retires and pays off his credit cards. No story there. But there are a lot of lotteries, so there will be an idiot somewhere you can use to reassure your viewers that they are double smart for not being rich.
Jiro (8y):
The entire world is a tax on stupidity.
jwoodward48 (7y):
Sounds meaninglessly deep to me.
Jiro (7y):
It isn't. It's meant to point out that calling something a 'tax on stupidity" is itself meaninglessly deep-sounding. Intelligence is used for pretty much everything; calling something a tax on stupidity says nothing more about it than "it's part of the world".
the gears to ascension (9y):
In my conversations with LW and CFAR community folks, they seem to consider "rationality" to be strictly equal to "winning" - unless I ask them directly if that's true. I think they really could benefit from clearer and simpler words, rather than naming fucking everything after their favorite words.
TheAncientGeek (9y):
Does "rational" have to have meaning? Is that not a way of dissolving the question?
[anonymous] (9y):
Winning = rational, rational = winning. If you define rational as something other than "the intellectual means of winning", there's no point other than a religious fetish for a theorem that's difficult to compute with.
[anonymous] (9y):
Then how does one understand 'rationalists don't win'? As 'rationalists expect to win and fail, just like, for example, XYZ-ists do, only rationalists have trained themselves to recognize failure and in this way can still salvage more and so don't lose as completely (though we have no actual measure, because XYZ-ists will still think they have won)'? :)
[anonymous] (9y):
No, I'd understand it as more like, "Calling oneself a 'rationalist' or 'aspiring rationalist' isn't correlated with object-level winning".
the gears to ascension (9y):
The point of the "rationalists win" thing was to define rationality as winning. Which, among other things, makes it very unclear why the word "intelligence" is different. Everyone seems to insist it is in fact different when I ask, but nobody can explain why, and the inductive examples they give me collapse under scrutiny. what?
hamnox (9y):
Pretty sure inductive examples of intelligence fail because we really are pointing at different things when we say it.

Some mean "shows a statistically higher base rate for acquiring mental constructs (ideas, knowledge, skills)" when they say it. This usage tends to show up in people who think that model-building and explicit reasoning are the key to winning. They may try to tack this consideration onto their definition of intelligence in some way.

Some try to point at the specific differences in mental architecture they think cause people to use more or fewer mental constructs, like working memory or ability to abstract. This usage tends to show up in people who are trying to effect useful changes in how they or others think. They may notice that there's a lot of variation in which kind of mental constructs are used, and try to single out the combination that is most important to winning.

There's also the social stereotype of who has a preference for "doing" and experiencing vs. who is drawn to "thinking" and planning. People who think "doing" or having a well-integrated System 1 is the key to winning may favor this definition, since it neatly sidesteps away from the stupid argument over definitions the thinkers are having. I like to use it in conversations because it's loose enough to kinda encapsulate the other definitions: which role you think you fit is going to correlate with which you use more, which itself correlates with what your natural abilities lend themselves to. I'm less likely to talk past people that way.

But it's also because of this last interpretation that I point-blank refuse to use intelligence as a synonym for rationality. The word 'rational' comes with just as many shades of denying emotion and trusting models over intuition, but they're at least framed as ignoring extraneous factors in the course of doing what you must.
lmm (9y):
I want to talk about the group (well, cluster of people) that calls itself "rationalists". What should I call it if not that?
the gears to ascension (9y):
CFAR community, or LW community, depending on which kind of person you mean.

Side-stepping the issue of whether rationalists actually "win" or "do not win" in the real world, I think a priori there are some reasons to suspect that people who exhibit a high degree of rationality will not be among the most successful.

For example: people respond positively to confidence. When you make a sales pitch for your company/research project/whatever, people like to see you that you really believe in the idea. Often, you will win brownie points if you believe in whatever you are trying to sell with nearly evangelical fervor... (read more)

ChristianKl (9y):
CFAR's Valentine manages to have very high charisma. He also goes out of his way to tell people not to believe him too much, and to say explicitly that he's not certain. In http://lesswrong.com/lw/mp3/proper_posture_for_mental_arts/ he suggests: Having this strong sense of something worth protecting seems to be more important than believing that individual ideas are necessarily correct. You don't develop a strong sense of something worth protecting by doing debiasing techniques, but at the same time it's a major part of rationality!CFAR and rationality!HPMOR. At the same time there are people in this community plagued by akrasia who don't have that strong sense on an emotional level.
Adam Zerner (9y):
I agree with your point about the value of appearing confident, and that it's difficult to fake.* I think it's worth bringing up, but I don't think it's a particularly large component of success. It depends on the field, but I still don't think there are many fields where it's a notably large component of success (maybe sales?).

*I've encountered it. I'm an inexperienced web developer, and people sometimes tell me that I should be more confident. At first this very slightly hurt me. Almost negligibly slight. Recently, I've been extremely fortunate to get to work with a developer who also reads LW and understands confidence. I actually talked to him about this today, and he mirrored my thoughts that with most people, appearing more confident might benefit me, but that with him it makes sense to be honest about my confidence levels (like I have been).
ZoltanBerrigomo (9y):
Not sure...I think confidence, sales skills, and ability to believe and get passionate about BS can be very helpful in much of the business world.

Let's say I wanted to solve my dating issues. I present the following approaches:

  1. I endeavor to solve the general problem of human sexual attraction, plug myself into the parameters to figure out what I'd be most attracted to, determine the probabilities that individuals I'd be attracted to would also be attracted to me, then devise a strategy for finding someone with maximal compatibility.

  2. I take an iterative approach: I devise a model this afternoon, test it this evening, then analyze the results tomorrow morning and make the necessary adjustments.

W... (read more)

Richard_Kennaway (9y):
How would you do (1) without making hypotheses and testing them, i.e. (2)?
Viliam (9y):
Reading a book... debating with other smart people in a web forum... reading another book... trying to solve the problems in your map alone before you even touch the territory... Seems to me this is what people often do when they try to do (1).

Human beings are not very interested in truth in itself. They are mostly interested in it to the extent that it can accomplish other things.

Less Wrongers tend to be more interested in truth in itself, and to rationalize this as "useful" because being wrong about reality should lead you to fail to attain your goals.

But normal human beings are extremely good at compartmentalization. In other words they are extremely good at knowing when knowing the truth is going to be useful for their goals, and when it is not. This means that they are better than... (read more)

[anonymous] (9y):
If you really believe this, I'd love to see a post on a computational theory of compartmentalization, so you can explain for us all how the brain performs this magical trick.
entirelyuseless (9y):
I'm not sure what you mean by "magical trick." For example, it's pretty easy to know that it doesn't matter (for the brain's purposes) whether or not my politics is objectively correct or not; for those purposes it mainly matters whether I agree with my associates.
[anonymous] (9y):
Bolded the part I consider controversial. If you haven't characterized what sort of inference problem the brain is actually solving, then you don't know the purposes behind its functionality. You only know what things feel like from the inside, and that's unreliable. Hell, if normative theories of rationality were more computational and less focused on sounding intellectual, I'd believe in those a lot more thoroughly, too.
TheAncientGeek (9y):
If you have some sort of distributed database with multiple updates from multiple sources, it's likely to get into an inconsistent state unless you take measures to prevent that. So the way to achieve the magic of compartmentalised "beliefs" is to build a system like that, but don't bother to add a consistency layer.
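
As an illustrative aside, here is a minimal sketch of that picture, assuming compartments are modeled as per-context key-value stores (a toy construction, not an implementation of any real system):

```python
# Sketch of a "database without a consistency layer": each context
# keeps its own beliefs, and writes never reconcile across contexts,
# so contradictions simply persist.
class CompartmentalizedBeliefs:
    def __init__(self):
        self.contexts = {}  # context name -> {proposition: truth value}

    def update(self, context, proposition, value):
        # Local write only; a "consistency layer" would add a
        # cross-context check here.
        self.contexts.setdefault(context, {})[proposition] = value

    def query(self, context, proposition):
        # Local read: each context sees only its own answer.
        return self.contexts.get(context, {}).get(proposition)

beliefs = CompartmentalizedBeliefs()
beliefs.update("lab", "evolution happened", True)
beliefs.update("church", "evolution happened", False)
print(beliefs.query("lab", "evolution happened"))     # True
print(beliefs.query("church", "evolution happened"))  # False
```

The missing consistency layer would be a reconciliation step on every write; leaving it out is exactly what lets both answers coexist indefinitely.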
PhilGoetz (9y):
Perhaps he will, if you agree to also post your computational theory of how the brain works. If you don't have one, then it's unreasonable to demand one.
[anonymous] (9y):
That was several months ago.
PhilGoetz (9y):
Nice! I'll bookmark that.
TheAncientGeek (9y):
You think there is no evidence that it does?
[anonymous] (9y):
You only really understand something when you understand how it's implemented.
TheAncientGeek (9y):
Whatever. The statement "But normal human beings are extremely good at compartmentalization" has little to do with understanding or implementation, so you would seem to be changing the subject.
[anonymous] (9y):
Well no. I'm saying that folk-psychology has been extremely wrong before, so we shouldn't trust it. You invoke folk-psychology to say that the mind uses compartmentalization to lie to itself in useful ways. I say that this folk-psychological judgement lacks explanatory power (though it certainly possesses status-attribution power: low status to those measly humans over there!) in the absence of a larger, well-supported theory behind it.
TheAncientGeek (9y):
Is it better to assume non-compartmentalisation?
[anonymous] (9y):
No, it's better to assume that folk-psychology doesn't accurately map the mind. "Reversed stupidity is not intelligence." Your statement is equivalent to saying, "We've seen a beautiful sunset. Clearly, it must be a sign of God's happiness, since it couldn't be a sign of God's anger." In actual fact, it's all a matter of the atmosphere refracting light from a giant nuclear-fusion reaction, and made-up deities have nothing to do with it. Just because a map seems to let you classify things, doesn't mean it provides accurate causal explanations.
TheAncientGeek (9y):
If we don't know enough about how the mind works to say it is good at compartmentalisation, we also don't know enough to say it is bad at compartmentalisation. Your position requires you to be noncommittal about a lot of things. Maybe you are managing that. The sunset case isn't analogous, because there we have the science as an alternative.
Jiro (9y):
I wouldn't be able to tell if someone is a good mathematician, but I'd know that if they add 2 and 2 the normal way and get 5, they're a bad one. It's often a lot easier to detect incompetence, or at least some kinds of incompetence, than excellence.
TheAncientGeek (9y):
Is compartmentalisation supposed to be a competence or an incompetence, or neither?
[anonymous] (9y):
Personally, I don't think "compartmentalization" actually cuts reality at the joints. Surely the brain must solve a classification problem at some point, but it could easily "fall out" that your algorithms simply perform better if they classify things or situations between contextualized models - that is, if they "compartmentalize" - than if they try to build one humongous super-model for all possible things and situations.
TheAncientGeek (9y):
But you don't have proof of that theory, do you?
[anonymous] (9y):
Your original thesis would support that theory, actually.
TheAncientGeek (9y):
I haven't made any object-level claims about psychology.

"Local rationalist learns to beat akrasia using this one weird trick!"

"rationalists don't win"

Depends on what 'win' means. If (epistemic) rationality helps with a realistic view of the world, then it also means looking behind the socially constructed expectations of 'life success'. I think these are memes that our brains pattern-match to something more suitable in an ancestral environment. Hedonic treadmill and Peter principle ensue. I think that a realistic view of the world has helped me evade these expectations and live an unusually fulfilled, interesting life. I'd call that a private success. Not a public one.... (read more)

My life got worse after I found LessWrong, but I can't really attribute that to a causal relationship. I just don't belong in this world, I think.

I can imagine LW-style rationality being helpful if you're already far enough above baseline in enough areas that you would have been fairly close to winning regardless. (I am now imagining "baseline" as the surface of liquids in Sonic the Hedgehog 1-3. If I start having nightmares including the drowning music, ... I'll... ... have a more colorful way to describe despair to the internet, I guess.)

I agree. First off, I think it has a lot to do with a person's overall definition of 'win'. In the eyes of 'outside society', rationalists don't win. I believe that is because, as you said, if you look at things overall, you don't see an influx of success for the people who are rationalists. That isn't to say that they don't win, or that rationalism is pointless and doesn't offer up anything worthwhile. That would be a lie. I think that rationalists have a better grip on the workings of the world, and therefore know what to do and how to achieve success, ... (read more)

We are the people who knew too much.....

I am not so sure that rationalists don't win, but rather that "winning" (i.e. starting a company, being a celebrity, etc.) is rare enough, and rationalists are rare enough, that very few of the people who win are rationalists, even if each rationalist has a better chance of winning.

So you say altruism is something to "assume that we mostly agree on, and thus not really elaborate" and I know the sentiment is sometimes that it's like jazz and pornography, but fwiw I'd be curious about an elaboration. I don't think that particular prejudice is a big part of rationalist failures, but raising the possibility of it being a part is interesting to me.

Adam Zerner (9y):
I just meant that rationalists overwhelmingly seem to have altruistic goals. I'm not sure what you meant with "jazz and pornography".
ChristianKl (9y):
A popular definition of pornography is "I know it when I see it." There seems to be a similar sentiment with jazz.

What you LessWrong folks call "rationality" is not what everyone else calls "rationality" - you can't say "I also think that rationality is doing a great job in helping people"; that either doesn't make sense or is a tautology, depending on your interpretation. Please stop saying "rationality" and meaning your own in-group thing; it's ridiculously off-putting.

Also, my experience has been that CFAR-trained folks do sit down and do hard things, and that people who are only familiar with LW just don't. It has also been ... (read more)

PhilGoetz (9y):
He may have meant that he thinks rationality is effective for altruists.