I've been lurking here a bit, and am trying to understand what people here mean by rationalism. Many articles here seem to refer to discussion participants as rationalists while seemingly meaning very different things, including intelligent, socially awkward, well-educated, and unencumbered by education. I'm trying to make a little more sense of the word/concept.

Surely it does not refer to the rationalist side of the empiricism/rationalism divide, because it doesn't seem to be used in quite that way.

 

[-][anonymous]

I would add to this that (it seems to me) what most people here mean when they refer to a rationalist is someone who consciously tries to apply their knowledge of epistemic rationality to generate their beliefs, and someone who consciously tries to apply their knowledge of instrumental rationality to achieve their values.

(That's opposed to someone who has a "textbook" knowledge of the techniques of rationality, but doesn't actually call upon it when making decisions.)

Anyone using the word "rationalist" here to imply "intelligent, socially awkward, well-educated, and unencumbered by education" is just being extremely sloppy with their words.

[-]ata

What Do We Mean By 'Rationality'? (as linked below by saturn) is a good place to start, and it makes an important point: "We are not here to argue the meaning of a word, not even if that word is 'rationality'." (Or "rationalism".) That page describes the basic things-we're-trying-to-do on Less Wrong, and "rationality" is the shorthand we use to refer to those things; roughly speaking, a "rationalist" is a person who aspires to improve both their epistemic rationality and instrumental rationality (as described on that page), particularly by improving their awareness of the threats posed to their rationality by things that human brains do by default, such as the many cognitive biases and fallacies.

(And nobody here will claim that that's the One True Meaning of "rationalism"; again, with the conceptual associations it has here, it should be taken mostly as local shorthand. Many of us would argue that it's among the best of all the various things people have called "rationalism", but we know quite well that this is a claim that cannot be supported just by giving a definition.)

Off the top of my head, a few more interesting discussions on the subject of rationality as an "ism":

[-]prase

I doubt "socially awkward" is used as a part of a definition of "rationalist" by anybody.

The word "rationalist" is thrown around here a lot in various contexts. There have been complaints about this before, though I am not quite sure how to locate them quickly (obviously searching the site for "rationalist" turns up quite a few results.) Here is an old discussion.

Here are a few of those contexts:

  • A "rationalist" could be someone who regularly reads and/or participates on this site. A clearer term might be "LessWronger." This category of people is often intelligent, socially awkward, well educated, and has a particular techno-liberaltarian slant. Things vary but these are good guesses.
  • A "rationalist" could be someone who believes in the virtues of rationality and wishes to improve themselves. This seems like an appropriate noun form of rationalist to me, although "aspiring rationalist" is a nice term I've heard.
  • A "rationalist" could be someone with a specific goal, which they are concerned with pursuing as effectively as possible. A clearer term might be "rational actor."
  • "Rationalist" is often used as an adjective describing arts, such as in "rationalist!fanfiction." In this context I usually interpret it to mean arts which glorify the virtues of rationality, by having heroes succeed and be awesome via planning and cross-domain thinking.

As for the statement "unencumbered by education," I have two possible interpretations. First, the man who started the site, Eliezer Yudkowsky, is an autodidact; references to knowledge without schooling may refer indirectly to him. Second, there is a significant and hopefully healthy amount of disrespect for authority on this site, including for academia and modern schools.

I am about 80% confident that someone has already composed this sort of a list, and that theirs is better than mine.

(I prefer to avoid the term 'rationalism' whenever possible and stick to 'rationality', mostly because 'rationalism' has been taken already, we hold views largely opposed to that philosophy, and '-isms' should be kept narrow lest they become tainted by their weakest points/members. Buddhism, Hinduism, and libertarianism are examples.)

How is Buddhism tainted? Christianity could have been tainted during the purges in the early centuries, but I don't find Buddhism to have deviated from its original teachings in such a way that Buddhists themselves would no longer recognize them. There are millions of Buddhists in the world, so there are bound to be weirdos in that lot. But consider the question: "What is Buddhism, as defined by prominent Buddhists themselves, whose prominence is recognized by traditional institutions that uphold Buddhism?" It doesn't seem to me the answer to this question would have changed much during the last millennium.

Likewise, rationalism could not be tainted by some Christian preacher who claims he is a rationalist, but whose preaching implicitly opposes rationalism and who is not considered a rationalist by anyone on LW.

> But consider the question: "What is Buddhism, as defined by prominent Buddhists themselves, whose prominence is recognized by traditional institutions that uphold Buddhism?" It doesn't seem to me the answer to this question would have changed much during the last millennium.

This chap would disagree. There's rather a lot of words there, so briefly: Buddhism in the Western world -- what he calls "Consensus Buddhism" -- is for the most part an invention of the 19th and 20th centuries with more roots in European and American culture than in the countries it came from.

Good question. But it should be answered by someone who has been here longer than I.

You are correct that it doesn't refer to rationalism vs. empiricism; we tend toward the empiricist side of that debate. It is not really a philosophical position so much as an aspiration.

For me, rationalism corresponds to what I call "correct reasoning". Correct reasoning is any reasoning you would eventually perform if your beliefs were continually and informatively checked against your observations, starting from a belief set of arbitrarily large wrongness.

For example, suppose you believed that you should observe {X} as a result of employing reasoning mechanism {Y}, and you happened to get good tests of {Y} (i.e., tests with the highest possible surprisal value, -log p({X}|{Y})), forcing you to switch to a different {Y} until you found one that correctly predicted {X}. The reasoning mechanism in that final {Y} is what I count as "correct reasoning".
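To make that loop concrete, here is a minimal Python sketch of the filtering process. The helper names (`surprisal`, `filter_mechanisms`), the threshold, and the toy coin models are purely illustrative assumptions of mine, not anything specified by the comment or by CLIP:

```python
import math

def surprisal(p_x_given_y: float) -> float:
    """Surprisal of observing X under mechanism Y: -log p(X|Y)."""
    return -math.log(p_x_given_y)

def filter_mechanisms(candidates, observations, threshold=5.0):
    """Return the first mechanism whose predictions survive all observations.

    `candidates` maps a mechanism name to a function p(x) giving the
    probability that mechanism assigns to observation x. A mechanism is
    discarded as soon as an observation is too surprising under it.
    """
    for name, p in candidates.items():
        if all(surprisal(p(x)) < threshold for x in observations):
            return name
    return None

# Toy usage: an always-heads model vs. a fair-coin model, observing tails.
candidates = {
    "always_heads": lambda x: 0.999 if x == "H" else 0.001,
    "fair_coin":    lambda x: 0.5,
}
print(filter_mechanisms(candidates, ["H", "T", "H", "T"]))  # -> "fair_coin"
```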

Starting with this general principle, one can derive several heuristics to use when forming useful models of the world, and these form the ontology assumed by CLIP (the Clippy Language Interface Protocol).

[-]taw

What if you couldn't distinguish between two different reasoning mechanisms by any finite amount of observation, but they led to completely different conclusions?

The universe in which at some date in the future every paperclip turns into a non-paperclip, and every non-paperclip turns into a paperclip, would look, up until that date, just like the universe where no such thing ever happens.

And there are infinitely many such switching universes, one for each possible switching date, and only one non-switching universe. So even if each seems unlikely, that should be balanced by their sheer numbers.

Are you willing to take the risk that all your effort to make more paperclips will lead to fewer paperclips, because you simply assumed how the universe works?

Nice try, but correct reasoning implies a complexity penalty: reasoning predicated on arbitrary parameters would be filtered out quickly given informative observations.
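One toy way to see why the penalty blunts the "infinitely many switching universes" argument: give each hypothesis prior weight 2^(-description length in bits). Specifying a switching date t with a prefix-free code costs roughly 2*log2(t) extra bits, so the combined prior mass of all switching universes converges to a finite bound instead of swamping the single no-switch hypothesis. The bit costs in the sketch below are my own illustrative assumptions:

```python
import math

BASE_BITS = 10        # assumed cost of the physics shared by all hypotheses
SWITCH_FLAG_BITS = 1  # assumed cost of the clause "a switch happens at date t"

def prior(bits: float) -> float:
    """Solomonoff-style prior weight 2^(-description length in bits)."""
    return 2.0 ** (-bits)

no_switch = prior(BASE_BITS)

# A prefix-free (Elias-gamma-style) encoding of the date t costs about
# 2*log2(t) bits, so the mass of the date-t universe falls off like 1/t^2
# and the infinite sum converges rather than growing without bound.
switching = sum(
    prior(BASE_BITS + SWITCH_FLAG_BITS + 2 * math.log2(t + 1))
    for t in range(1, 1_000_000)
)

print(f"no-switch mass:     {no_switch:.6f}")
print(f"all switching mass: {switching:.6f}")  # bounded, smaller than no_switch
```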

[-]taw

Is every paperclip just as important, or does each additional paperclip matter less?

Is a certain number of paperclips exactly as valuable as a 50% chance of twice as many paperclips?

You say "complexity penalty", but it is not that complex to describe 3^^^3 paperclips. The number of possible paperclips can increase a lot faster than the complexity of describing them.
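That last claim is easy to check mechanically: Knuth's up-arrow notation, which is what 3^^^3 abbreviates, fits in a few lines of code, so the description length stays tiny even as the value becomes astronomical. A sketch using the standard recursive definition (don't actually evaluate hyper(3, 3, 3) == 3^^^3; it would never terminate in practice):

```python
def hyper(a: int, n: int, b: int) -> int:
    """a ↑^n b in Knuth's up-arrow notation: one arrow is exponentiation,
    each additional arrow iterates the level below it."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return hyper(a, n - 1, hyper(a, n, b - 1))

print(hyper(3, 1, 3))  # 3^3  = 27
print(hyper(3, 2, 3))  # 3^^3 = 3^(3^3) = 3^27 = 7625597484987
```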