I've noticed that here on Less Wrong people often identify themselves as rationalists (and this community as a rationalist one -- searching for 'rationalist' on the site returns exactly 1000 hits).  I'm a bit concerned that this label may not work in our favour.

Paul Graham recently wrote a nice essay, Keep Your Identity Small, in which he argued that identifying yourself with a label tends to work against reasonable -- rational, you might say -- discussions about topics related to it.  The essay is quite short, and if you haven't read it I highly recommend doing so.

If his argument is correct, then identifying with a label like Rationalist may impede your ability to be rational.

My thinking is that once you identify yourself as an X, you tend to evaluate ideas and courses of action in terms of how closely they match your prototypical notion of that label -- as a shortcut for genuinely thinking about them and evaluating them on their own merits.

Aside from the effect such a label may have on our own thinking, the term 'rationalist' may be bad PR.  In the wider world 'rational' tends to be a bit of a dirty word.  It has a lot of negative connotations.   

Outside communities like this one, presenting yourself as a rationalist is likely to get other people off on the wrong foot.  In many people's minds, it'd strike you out before you'd even said anything.  It's a great way for them to pigeonhole you.

And we should be interested in embracing the wider world and communicating our views to others.

If I were to describe what we're about, I'd probably say that we're interested in knowing the truth, and want to avoid deluding ourselves about anything, as much as either of those things is possible.  So we're studying how to be less wrong.  I'm not sure I'd use any particular label in my description.

Interestingly, those goals I described us in terms of -- wanting truth, wanting to avoid deluding ourselves -- are not really what separates "us" from "them".  I think the actual difference is that we are simply more aware of the fact that there are many ways our thinking can be wrong and lead us astray.  

Many people really are -- or at least start out -- interested in the truth, but get led astray by flawed thinking because they're not aware that it is flawed.  Because flawed thinking begets flawed beliefs, the process can lead people onto systematic paths away from truth seeking.  But I don't think even those people set out in the first place to get away from the truth.

The knowledge our community has, of ways that thinking can lead us astray, is an important thing we have to offer, and something that we should try to communicate to others.  And I actually think a lot of people would be receptive to it, presented in the right way. 

 

34 comments:

I've never described myself as "a rationalist" to anyone outside the LW community. For local purposes, I think "rationalists" is a short, simple, and suitably evocative term for the lengthier "people who care about avoiding common pitfalls of thought that lead to false belief". For outside purposes, if I want to inform people of my LW-specific affiliation, I e-mail them a link to the website or an individual article. If I just want to tell them that I'm interested in believing true things, I signal about my commitment to honesty, my curiosity, or my intelligence.

Heavily paraphrasing:

For local purposes [“rationalists” seems suitable]. For outside purposes [I use a description, not a label].

I think it’s pretty much impossible for us to have any sort of private label for ourselves. Even if we used a label only within this site and never outside it, that use would still project the label to the wider world.

Anyone from outside the community who looks at the site is going to see whatever label(s) we employ. And even if we employ a label just on this site, it’s still likely to be part of the site’s “reputation” in outside circles -- i.e. the label is still likely to reach people who've never seen the site.

A lot of the content on Less Wrong describes various types of mental mistakes (biases and whatnot). In this respect, Less Wrong is a kind of Wikipedia for mental mistakes.

As with Wikipedia, it’s something that could be linked to from elsewhere – like if you wanted to use it to help explain a type of mistake to someone. There’s a lot of potential for using the site in this way, considering that the internet consists in large part of discussions, and discussions always involve some component of reasoning.

Seen in this way, the site is not just a community (who could have their own private terminology) but also an internet-wide resource. So we should think of any label as global, and I think that's more of a reason to consider having no label at all.

Here's an example of such external referencing of Less Wrong posts:

http://www.37signals.com/svn/posts/1750-the-planning-fallacy

Jack:

Yeah, this is basically the only option right now, isn't it? The question of how to identify ourselves to people who don't read the site is totally irrelevant until the site gains serious prominence. I can think of maybe three internet communities that can successfully be identified by the names they gave themselves, and for the most part they aren't flattering comparisons. I don't think lesswrong dot com is on the verge of leading a global movement that would require a pithy label.

All that matters, then, is whether or not it is convenient and whether or not using "rationalist" is deterring new readers from sticking around. Rationalist is definitely a convenient word to use, but I don't think it is indispensable. It also looks like we have medium-to-strong anecdotal evidence that using it might be deterring potential new users (on the other hand, it could be attracting other users, and we have no particular reason to expect such people to point to the use of the word rationalist as a reason for them sticking around).

I think the answer is probably that we have already spent too much time thinking about this question and that there is no way to legislate what we call ourselves. If people want to keep "rationalist" from becoming an entrenched label, they should just keep using terms other than "rationalist" so that no word ever becomes the official label.

Gris:

I really liked this piece. I’ve been a lurker for a long time, and it’s inspired me to toss up a comment. I think the larger issue here, rather than the specific label we choose to use, is labeling ourselves at all. 'Rationalist', despite its connotations, does an adequate job of summing up our purposes. I agree with Alicorn - we know what we mean when we say it around these parts, and employing the word as a label outside of our community in most circumstances isn’t something I’d ever do without proper context.

If our mission objective is to be less wrong, overcome biases, and find rational solutions, then it is a given that we should be ever cautious of groupthink. In many cases, I’d argue we are very successful at that by constantly challenging our assumptions and beliefs to get at the core of problems. At the same time, any label such as “rationalists” automatically, whether we intend it or not, creates an us-and-them false dichotomy. We wouldn’t be in this community unless we believed our approach to life’s complex situations was superior, but that’s only more reason to be extremely cautious of developing a bias.

I work a lot with academics. If we take one specific school of thought, or even one department in a university, we will likely have a group of individuals dedicated to very high level thought. Yet, as time progresses regardless of the goal, biases are internalized and groupthink occurs. Knowledge is passed down and scrutinized until “schools” of thought splinter off into more and more specialized disciplines. Meanwhile, the original assumptions are forgotten or taken as fact. I find a place like Less Wrong liberating compared to the academic discussions I’m tied into. Here we have individuals from all walks of life and with different expertise contributing to a greater understanding.

So, without boring you folks too much longer, as an example relevant to our context, I would point to violence’s place in LW’s rationality. I’m not a particularly violent person, but in several posts I’ve read over the past months, violence as a rational strategy is hardly ever considered. Even when it is mentioned in the comments, it’ll be downvoted and disregarded almost on principle. It reminds me of the Hawks and Doves scenario. If we as a group (and I realize I’m already over-generalizing by speaking for the group) internalize non-violent solutions into our identity as ‘rationalists’, believing that higher learning will always conquer brute force, not only do we run the risk of being out-strategized in life, we run the risk of undermining everything we stood for in the first place.

I have been looking at this site every day for some time. I have to say that the label 'rationalist' has made me suspicious and on guard. I have so far found nothing to confirm my suspicions, and many interesting postings. Perhaps I will soon lose the feeling of suspicion. But every single time the word 'rationalist' is used, I see the ghost of the hated Ayn Rand in the background. If you don't want to be suspected of a Rand sort of philosophy, then I think you would be wise to coin another term.

I changed the title of my post from "Mate selection for the male rationalist" to "Mate selection for the men here".

Re violence, do see "Bayesians vs. Barbarians."

I hope to post on that post shortly, after giving it some thought.

Gris, just as bias against violence may be the reason it's hardly ever considered, alternatively, it may not only be a rational position, but a strategically sensible one. Please consider looking at the literature concerning strategic nonviolence. The substantial literature at the Albert Einstein Institute is good for understanding nonviolent strategy and tactics against regimes, and the insights provided translate into courses of action in other conflicts, as well.

I agree that identifying yourself with the label rationality in such a manner that it interferes with actual clear thinking is a grievous failure mode that people should be warned against. But it still seems useful to have some sort of terminology to talk about clear thinking, and I can't think of a better candidate term than rationality. I must say that I can't help but find it odd that you link to "Keep Your Identity Small" in discussing this problem. Did you read the footnotes? Graham lists that which we would call rationality as one of the few things you should keep in your identity:

[2] There may be some things it's a net win to include in your identity. For example, being a scientist. But arguably that is more of a placeholder than an actual label—like putting NMI on a form that asks for your middle initial—because it doesn't commit you to believing anything in particular. A scientist isn't committed to believing in natural selection in the same way a biblical literalist is committed to rejecting it. All he's committed to is following the evidence wherever it leads.

Considering yourself a scientist is equivalent to putting a sign in a cupboard saying "this cupboard must be kept empty." Yes, strictly speaking, you're putting something in the cupboard, but not in the ordinary sense.

Graham says scientist where I would say aspiring rationalist, but it's essentially the same idea: believe true things, and use this true knowledge to do the best things, whatever they may be. Eliezer's "Twelve Virtues" contain the same warning as yours and Graham's:

How can you improve your conception of rationality? Not by saying to yourself, “It is my duty to be rational.” By this you only enshrine your mistaken conception. Perhaps your conception of rationality is that it is rational to believe the words of the Great Teacher, and the Great Teacher says, “The sky is green,” and you look up at the sky and see blue. If you think: “It may look like the sky is blue, but rationality is to believe the words of the Great Teacher,” you lose a chance to discover your mistake.

Do not ask whether it is “the Way” to do this or that. Ask whether the sky is blue or green. If you speak overmuch of the Way you will not attain it.

You may try to name the highest principle with names such as “the map that reflects the territory” or “experience of success and failure” or “Bayesian decision theory”. But perhaps you describe incorrectly the nameless virtue. How will you discover your mistake? Not by comparing your description to itself, but by comparing it to that which you did not name.

I do find it useful to self-identify as an aspiring rationalist, rather than simply as a rationalist, to disclaim any implications of having achieved some grand end state of rationality. I know that I'm still wrong---just hopefully less so than previously.

I agree that identifying yourself with the label rationality … But it still seems useful to have some sort of terminology to talk about clear thinking, and I can't think of a better candidate term than rationality.

‘Rationality’ is a perfectly fine term to talk about clear thinking, but that is quite a different matter to using 'rationalist' or any other term as a label to identify with.

I must say that I can't help but find it odd that you link to "Keep Your Identity Small" in discussing this problem. Did you read the footnotes? Graham lists that which we would call rationality as one of the few things you should keep in your identity:

He doesn’t quite say it’s a label you should keep in your identity; he lists it as an example of something that might be good to keep in your personal identity. I think the argument he outlines in the essay applies to what’s in that footnote: it’d be better to just want to “[follow] evidence wherever it leads” than to identify too strongly as a scientist.

I think the danger here is far smaller than people are making it out to be. There is a major difference between the label "rationalist" and most other identities as Paul Graham refers to them. The difference is that "rationalist" is a procedural label; most identities are at least partially substantive, using procedural/substantive in the sense that the legal system does.

"Rationalist," which I agree is an inevitable shorthand that emerges when the topic of overcoming bias is discussed frequently, is exclusively a procedural label: such a person is expected to make decisions and seek truth using a certain process. This process includes Bayesian updating of priors based on evidence, etc. However, such a label doesn't commit the rationalist to any particular conclusion ex ante: the rationalist doesn't have to be atheist or theist, or accept any other fact as true and virtually unassailable. He's merely committed to the process of arriving at conclusions.

Other identities are largely substantive. They commit the bearer to certain conclusions about the state of the world. A Christian believes in a god with certain attributes and a certain history of the world. A Communist believes that a certain government system is better than all others. These identities are dangerous: once they commit you to a conclusion, you're unlikely to challenge it with evidence to ensure it is in fact the best one. That's the kind of identity Paul Graham is warning against.

Of course, these labels have procedural components: a Christian would solve a moral dilemma using the Bible; a Communist would solve an economic problem using communist theory. Similarly, rationalism substantively means you've arrived at the conclusion that you're biased and you can't trust your gut or your brain like most people do, but that's the extent of your substantive assumptions.

Since rationalism is a procedural identity rather than a substantive one, I see few of the dangers of using the term "rationalist" freely here.

Jack:

Why should we think of beliefs about proper procedure as less prone to reifying identity formation than beliefs about things other than procedures? How are beliefs about the best procedure for reasoning or predicting not beliefs about the state of the world? Specifically, are such beliefs not beliefs about the human brain and how it functions? Aren't we all pretty committed to the view that updating priors is a better way of getting things right than praying for the answer? I don't see why beliefs about procedure aren't just as liable to pass as unchallenged assumptions as beliefs about political systems are.

Besides, we'd be kidding ourselves if we said that the Less Wrong community has no shared beliefs other than about procedure. Yeah, a rationalist doesn't have to be an atheist... but there aren't a lot of outspoken evangelicals around these parts. It remains very possible that some or most of us could come to associate other beliefs with the rationalist label, even if the label doesn't explicitly include them right now.

There are lots of reasons to call ourselves rationalists -- but let's try not to dupe ourselves into thinking we're so special that none of the problems with labeling will apply to us.

I'm inclined to agree on your latter point: looking at the results of the survey, it seems like it would be easy to go from 'rationalist' as a procedural label to 'rationalist' as shorthand for 'atheist male computer programmer using Bayesian rules.' Of course, that's a common bias, and I think this community is as ready as any to fight it.

As for the former, I tried to address that by pointing out that rationalism means that we've already decided that updating priors is more effective than prayer. That said, I have a perhaps idealistic view of rationality, in that I think it's flexible enough to destroy itself, if necessary. I'd like to think that if we learned that our way of reasoning is inferior, we'd readily abandon it. A little too idealistic, perhaps.

That said, I will say that I find purely procedural labels less dangerous than substantive ones. You've alluded to the danger of conflating it with substantive labels like atheism, but that's a separate danger worth looking out for.

Jack:

So it might be the case that Bayesian updating has some quirky memetic mutation that could lead it to destroy itself if it stopped working. Maybe so-called 'rationalism' is especially bad at absorbing internal contradictions. But this would be a feature of the belief itself-- not a feature of it being a belief about procedure. Many beliefs about procedure are exactly the opposite-- take believing that truth can be taken from the Bible. That procedure is self-justifying and there is no way to dispute it from within the assumptions of the procedure.

Mostly, I just don't think the distinction you are trying to make between "procedural" and "substantive" beliefs holds water. Beliefs about political theory and economics, for example, are almost all procedural beliefs (i.e. the right procedure for making a law or stimulating the economy). What about them would make them immune to labeling problems?

"Many beliefs about procedure are exactly the opposite-- take believing that truth can be taken from the Bible. That procedure is self-justifying and there is no way to dispute it from within the assumptions of the procedure."

That's my point about rationality - the way I think about it, it would catch its own contradictions. In essence, a rationalist would recognize that rationalists don't "win." So as a result, committing yourself to rationality doesn't actually commit you to an outcome, as perhaps following a scripture would.

The bigger problem, I believe, is that most professed commitment to a procedure is superficial, and that instead most people simply bend the procedure to a preferred outcome. "The Devil may cite scripture for his purpose." The key, of course, is following the procedure accurately, and this is the community that'll keep you in line if you try to bend procedure to your preferred conclusion.

"So as a result, committing yourself to rationality doesn't actually commit you to an outcome, as perhaps following a scripture would."

Doesn't committing yourself to rationality commit you to the outcome that so and so "will be rational"? I'm not saying that this is the same exact thing as what evangelical Christians do, where they actually twist their lines of reasoning to reach their preferred conclusion. But it's like Jack said: don't dupe yourself into thinking none of the problems with labeling will apply to you. That's where you get into a tricky place, because you are ignoring a piece of information that does not jibe with your preferred view of yourself.

Some people here have argued that 'rationalist' refers to someone who wants to act rationally, as opposed to someone who actually puts the necessary techniques into practice.

I think the danger is far greater than you suspect.

For a discussion of other terms to use instead of rational, see What's in a name?.

I still hesitate to apply the term "rationalist" to myself and others -- after all the time I spent on LW, the term still feels monolithic to me for some reason. When I use it, I almost always find myself mentally expanding it into something like "one who seeks to attain an accurate map of reality" and/or "one who is able to efficiently steer the future into desired regions".

Interestingly, those goals I described us in terms of -- wanting truth, wanting to avoid deluding ourselves -- are not really what separates "us" from "them".

I'm not sure if that's true. Everyone says they want the truth, but people often reveal through their actions that it's pretty low on the priority list. Perhaps we should say that we want truth more than most people. Or that we don't believe we can get away with deceiving ourselves without paying a terrible price.

I disagree with this. Of course, not everyone places seeking truth as their highest priority. (A certain kind of mindless hedonist, perhaps.) But when you say "everyone says they want the truth, [...], but it's pretty low on the priority list", you are confusing wanting "truth" with wanting the beliefs you consider to be true. In other words, your version of the truth is low on their priority list. You don't have to be relativistic about what truth is, but I think it is a false belief to think that people don't believe their beliefs are true.

I would also like to add that confusion about beliefs seems to be a common human state, at least transiently. It is too negative to call the state in which a person has conflicting, inconsistent beliefs "delusional". Sometimes life teaches a person that this state is impossible to get out of -- I hope this is a false belief -- and they become complacent about having some subset of beliefs that they know are false. This complacence (really, a form of despair) is the closest example I can think of for a person not wanting truth.

you are confusing wanting "truth" with wanting the beliefs you consider to be true.

What a presumptuous, useless thing to say. Why don't you explain how you've deduced my confusion from that one sentence.

Apparently you think I've got a particular truth in mind and I'm accusing those who disagree with me of deprioritizing truth. Even if I was, why does that indicate confusion on my part? If I wanted to accuse them of being wrong because they were stupid, or of being wrong because they lacked the evidence, I would have said so. I'm accusing them of being wrong because it's more fun and convenient than being right. Seeing as how you don't know any specifics of what the argument is about, on what basis have you determined my confusion?

But actually I didn't have a particular controversy in mind. I'm claiming people deprioritize truth about smaller questions than "is there a god" or "does socialism work". I'm guessing they deprioritize truth even on things that are much closer to home, like "am I competent?", "do people like me?", or "is my company on the path to success?"

Come to think of it, that sounds quite testable. I wonder if anyone's done an experiment....

OK. I thought I was arguing with another version of "if you're not rational, then you don't value truth". That was presumptuous. And you're right, there is this other category of being rather indifferent or careless with respect to the truth, especially if the truth may be unpleasant or require work. I observe I have a knee jerk reaction to defend the "them" group whenever there is any kind of anti-"other-people" argument... and it is not my intention to be an indiscriminate bleeding-heart defender, so I need to consider this.

"Interested in rationality"

The problem is language. If you use a concept frequently, you pretty much need a shorthand way of referring to it. "Mate selection for the male who values the use of a properly weighted Bayesian model in the evaluation of the probability of phenomena" would not make a very effective post title. Moreover, it wouldn't communicate as effectively. "Mate selection for the male rationalist" tells you, immediately, that it is directed at a specific type of person with a fairly specific mode of thinking, and that it (probably) addresses him in this mode of thinking (since "rationalist" is a reasonably well understood term around these parts). The longer one doesn't communicate all of that.

The real challenge, rather than disparaging "rationalist," which, I agree, has some definite connotative problems, is to come up with another term. I personally have no suggestions, but I do have some meta suggestions.

- It should be short and as non-esoteric as possible. One word is ideal; two short words is probably the maximum.

- It should avoid negative connotations and strongly positive ones (calling oneself, e.g., "bright" is rather off-putting by its implications for outsiders).

- It need not map directly to rationality or any such related concept. The Republicans and Democrats are not fundamentally about republicanism or democracy, and they manage just fine.

- That's about all I can think of.

This is actually a PR issue worthy of thought. The term "rationalist" may be rather off-putting for someone new to the site, and, given how society works, if this system of thought develops a sufficient following, it's going to want a label.

- It should be short and as non-esoteric as possible. One word is ideal; two short words is probably the maximum.

How about "Truth seeker"?

I think this implies a number of useful characteristics:

  • Willingness to listen

  • Lack of attachment to some particular truth or way as absolute

  • Willingness to be wrong/change one's mind

The main problem with "rationalist" is that instead of declaring one's goal, one seems to be claiming to have achieved a goal. To most people it seems arrogant. So I like "truth seeker" because it so clearly avoids that problem. Of course "truth seeker" also has the problem that it implicitly accuses everyone else of not seeking truth. Believe me, most people understand this slight and are not happy with it. So which insults other folks less - saying we love truth when they don't, or saying we are better at finding truth than they are?

Actually, a broader view of this seems useful. Any time you say "I am an X", the person you're talking to is likely to take it as you implying that they're not an X, unless they already identify as an X as well. So any good-sounding X will come off as insulting. A bad-sounding but interesting X might be useful, but that seems prone to backfiring, and neutral values of X are both difficult to construct and not very stable.

Stating it as 'I do X' or even 'I do X well' seems more likely to be taken well - there's less of an intrinsic implication about whether the other person does X.

Unfortunately, the term is already taken -- and, I would claim, by people with too much confirmation bias.

Reminiscent of 'philosopher'.

The problem is language. If you use a concept frequently, you pretty much need a shorthand way of referring to it.

But I would ask, do you need that concept – a concept for labeling this type of person – in the first place?

"Mate selection for the male who values the use of a properly weighted Bayesian model in the evaluation of the probability of phenomena" would not make a very effective post title. [as] "Mate selection for the male rationalist".

I don’t think that’s the only other option. Maybe it could’ve been called “Mate selection for the rational male” or “Mate selection for males interested in rationality”.

I don’t see why it has to make any mention of rationality at all. Presumably anything posted on Less Wrong is going to be targeted at those with an interest in rationality. Perhaps it could have been “Finding a mate with a similar outlook” or “Looking for a relationship?”

I’m not suggesting that any of these alternatives are great titles, I'm just using them to suggest that there are alternatives.

"undeceiver"?

OK, not ideal, and slightly esoteric, but:

  • rolls off the tongue
  • parodies "unbeliever"
  • just odd enough to get attention without being impenetrable
  • has nice-ish connotations insofar as it implies that you won't deceive others, and thereby avoids exclusive focus on the self.
  • fits with the "less wrong" tradition of understatement: trying to be less (self-)deceived, rather than claiming truth

In thinking up a new term to replace an existing term with negative connotations, one should give some attention to how to avoid the euphemism treadmill.