You see, I've seen the word "rationalism" used to mean all five of these things at different times:

  • The belief that we should come to know the world through reason and experimentation, shunning intuition.
  • The belief that we should come to know the world through reason and intuition, shunning experimentation.
  • The belief that we should come to know the world through knowledge of (and correction for) cognitive biases, and knowledge of (and correct use of) probability theory.
  • Being effective at believing things that are true and not things that are false.
  • Being effective at doing things that are good and not things that are bad.
In most of the mainstream philosophy I've read, the word "rationalism" has been used, without qualification, to mean the second of these, even though that type of rationalism strongly contradicts the stuff we call rationalism! One of my friends has freely used the word "rationalism" in conversation, referring to "our" rationalism, completely unaware that, to most people, the word means something completely different. Another of my friends said that he "hates rationalism with a passion"—and I have no idea which of these five things is the one he hates!
Given that "rationalism" to most people (or, at least, most philosophers) means something utterly unlike what it means to us, perhaps calling our philosophy "rationalism" is about as wise as developing a political philosophy, based on socialism but with nationalist influences, and calling it "national socialism".
I suggest that we use the word "sensibilism" instead, since nobody else is using it, it seems unobjectionable, and I think it captures what we're all about.
Edited to remove a proposed solution.

Edited to reinstate that proposed solution, since this discussion is presumably finished.

And all this time I thought we were talking about "able to be expressed as a ratio between two integers"!

This leads me to imagine promoting blatant woo by saying "transcendentalism covers so much more than your mere 'rationality'!" (I promise not to start selling woo with such a tagline.)

Where's the glossary? I don't think I was active at the time we invented the word "woo". :)

Skeptics' jargon. "The term comes from 'woo-woo', an epithet used in the 1990s by science and skeptical writers to ridicule people who believe or promote such things. This is in turn believed to have come from the use of 'woooooo!' as a reaction to dimmed lights or magic tricks, and generally implies a lack of either intelligence or sincerity on the part of the person or concepts so described." - supposedly.

The Less Wrong consensus already seems to be using "rationality" instead of "rationalism." See Note on Terminology: "Rationality", not "Rationalism".

But people usually don't point out usage of "rationalism" as an error. We don't have a normative consensus yet.

My point is stronger than Vlad's: not only does "rationalism" have undesirable connotations, but it means the wrong thing.

Would the belief that rationality is good hence be "rationality-ism"?

Another of my friends said that he "hates rationalism with a passion"—and I have no idea which of these five things is the one he hates!

Hehehe... this reminds me of responses I get from people I link to LW. On Twitter I recently got this one:

... what is the bloody point? Philosophizing to death?

A comment on one of my blogs:

Well, I'd figured out most of the above just fine already without any help from LessWrong. Don't get me wrong, I'm sure it's a great resource, but it's still just one collection of people's ideas on the internet, not the single bright shining hope for the redemption of humanity's collective intellect.

I've seen & flagged some interesting stuff in the couple of hours I've already killed on there, but there's also a whole bunch of crummy anecdotes, academic grandstanding & incredibly boring logic arguments to wade through. I think it was folkert who had that Bertrand Russel quote up on his site about wishing thinking would come back in style, if that's ever going to happen then places like LessWrong need to figure out how to distill their subject matter and make it more palatable for the average person who really doesn't care about being part of the Bayesian in-crowd.

If that's something they're already discussing but I just haven't come across yet then by all means, link me up. It's far more interesting (imo) than reading about how someone flexed their superior rationality in order to one-up a god botherer at a dinner party. ;)

It is most often along the lines of "interesting but useless and nothing new". Using 'rationalism' will appall people even more; I have already had a few conversations where people told me that they mainly perceive LW to be a place for geeks to collectively fill themselves with confidence.

interesting but useless and nothing new

I occasionally ponder what LW's objective place in the scheme of things might be. Will it ever matter as much as, say, the Vienna Circle? Or even just as much as the Futurians? - who didn't matter very much, but whose story should interest the NYC group. The Futurians were communists, but that was actually a common outlook for "rationalists" at the time, and the Futurians were definitely future-oriented.

Will LW just become a tiresome and insignificant rationalist cult? The more that people want to conduct missionary activity, "raising the sanity waterline" and so forth, the more that this threatens to occur. Rationalist evangelism from LW might take two forms, boring and familiar, or eccentric and cultish. The boring and familiar form of rationalist evangelism could encompass opposition to religion, psych 101 lectures about cognitive bias, and tips on how optimism and clear thinking can lead to success in mating and moneymaking. An eccentric and cultish form of rationalist evangelism could be achieved by combining cryonics boosterism, Bayes-worship, insistence that the many-worlds interpretation is the only rational interpretation of quantum mechanics, and the supreme importance of finding the one true AI utility function.

It could be that the dominant intellectual and personality tendencies here - critical and analytical - will prevent serious evangelism of either type from ever getting underway. So let's return for a moment to the example of the Vienna Circle, which was not much of a missionary outfit. It produced a philosophy, logical positivism, which was influential for a while, and it was a forum in which minds like Gödel and Wittgenstein (and others who are much lesser known now, like Otto Neurath) got to trade views with other people who were smart and on their wavelength, though of course they had their differences.

Frankly I think it is unlikely that LW will reach that level. The Vienna Circle was a talking shop, an intellectual salon, but it was perhaps one in ten thousand in terms of its lucidity and significance. Recorded and unrecorded history, and the Internet today, are full of occasions where people met, were intellectually simpatico, and managed to elaborate their worldview in a way they found satisfactory; and quite often, the participants in this process felt they were doing something more than just personally exciting - they thought they were finding the truth, getting it right where almost everyone else got it wrong.

I appreciate that quite a few LW contributors will be thinking, I'm not in this out of a belief that we're making history; it's paying dividends for me and my peers, and that's good enough. But you can't deny that there is a current here, a persistent thread of opinion, which believes that LW is extremely important or potentially so, that it is a unique source of insights, a workshop for genuine discovery, an oasis of truth in a blind or ignorant world, etc.

Some of that perception I believe is definitely illusory, and comes from autodidacts thinking they are polymaths. That is, people who have developed a simple working framework for many fields or many questions of interest, and who then mistake that for genuine knowledge or expertise. When this illusion becomes a collective one, that is when you get true intellectual cultism, e.g. the followers of Lyndon Larouche. Larouche has an opinion on everything, and so to those who believe him on everything, he is the greatest genius of the age.

Then, there are some intellectual tendencies here which, if not entirely unique to LW, seem to be expressed with greater strength, diversity, and elaboration than elsewhere. I'm especially thinking of all the strange new views, expressed almost daily, about identity, morality, reality, arising from extreme multiverse thinking, computational platonism, the expectation of uploads... That is an area where I think LW would unquestionably be of interest to a historian of technological subcultural belief. And I think it's very possible that some form of these ideas will give rise to mass belief systems later in this century - people who don't worry about death because they believe in quantum immortality, popular ethical movements based on some of the more extreme or bizarre conclusions being deduced from radical utilitarianism, Singularity debates becoming an element of political life. I'm not saying LW would be the source of all this, just that it might be a bellwether of an emerging zeitgeist in which the ambient technical and cultural environment naturally gives rise to such thinking.

But is there anything happening here which will contribute to intellectual progress? - that's my main question right now. I see two ways that the answer might be yes. First, the ideas produced here might actually be intellectual progress; second, this might be a formative early experience for someone who went on to make genuine contributions. I think it's likely that the second option will be true of someone - that at least one, and maybe several people, who are contributing to this site or just reading it, will, years from now, be making discoveries, in psychology or in some field that doesn't yet exist, and it will be because this site warped their sensibility (or straightened it). But for now, my question is the first one: is there any intellectual progress directly occurring here, of a sort that would show up in a later history of ideas? Or is this all fundamentally, at best, just a learning experience for the participants, of purely private and local significance?

LW is nearly perfect but does lack self-criticism. I love self-criticism, and I perceive too much agreement to be boring. One of the reasons why there is so much agreement here is not that there is nothing wrong but that people who strongly disagree either don't bother or are deterred by the reputation system. How do I know that? The more I read, the more I learn that a lot of the basic principles here are not as well-grounded as the commitment of the community would suggest. Recently I wrote to various experts in an effort to approach some kind of 'peer-review' of LW. I got replies from people as diverse as Douglas Hofstadter, Greg Egan, Ben Goertzel, David Pearce, various economists, experts, and influencers. The overall opinion so far is not so much in favor of this community. Regarding the reputation system? People told me that it is one of the reasons why they don't bother to voice their opinions and just lurk, but you could just read the infamous RationalWiki entry to get an idea of the general perception (although it improved since my comment here, which they pasted into the talk page). I tried a few times to question the reputation system here myself, or to ask if there are posts or studies showing that such systems subdue trolling without sacrificing truth and honesty, and that reputation systems do not cause unjustified conformity. Sadly the response is often downvotes mixed with angry replies. Another problem is the obvious arrogance here, which is getting more distinct all the time. There is an LW-versus-rest-of-the-world attitude. There is LW, and then there are the irrational, ordinary people. That's just sad, and I'm personally appalled by it.

Here is how some people described LW when I asked them about it:

...a lot of impressive-sounding jargon and slogans, and not everything they say is false and foolish, but in my view they've just sprinkled enough mathematics and logic over their fantasies to give them a veneer of respectability.

or

...they are naïve as far as the nature of human intelligence goes. I think they are mostly very bright and starry-eyed adults who never quite grew out of their science-fiction addiction as adolescents. None of them seems to have a realistic picture about the nature of thinking...

Even though I am basically the only person here who is often openly derogatory about this community, people seem to perceive it as too much already. I am apparently just talking about the same old problems over and over. Yet I've only been posting here since August 2010, and the problems have not been fixed. There are problems like the increasing and unjustified arrogance, the lack of criticism (let alone peer-review), and a general public relations problem (Scientology also gets donations ;-). But those problems don't matter. What is wrong, and what will probably never change, is that mere ideas are sold as 'laws' which are taken seriously to a dangerous degree by some individuals here. This place is basically breeding the first group of rationalists committed to doing everything in the name of expected utility. I think that is not only incredibly scary but also causes distress in people who are susceptible to such thinking.

... this might be a formative early experience for someone who went on to make genuine contributions.

LW is certainly of great value and importance, and I loved reading a lot of what has been written so far. I would never suggest that LW is junk, but if it has the slightest problem with someone coming here and proclaiming that you are all wrong, then something is indeed wrong.

Is there as much of a problem with the karma system as you make it out to be? I've posted comments critical of cryonics, comments critical of the idea of hard takeoff being likely, comments critical of Eliezer's writing style, and comments critical of the general LW understanding of the history of science. Almost every such comment has been voted up (and I can point to individual comments in all those categories that have been voted up).

I suspect that the quality threshold for critical comments to be voted up is higher than the threshold for non-critical comments, and that low-quality critical comments are similarly more likely to be voted down. But that's a common problem, and in any event, high-quality comments aren't often voted down. So I fail to see how anyone would be substantially discouraged from posting critical comments unless they just weren't very familiar with the system here.

Yeah, this is my experience. I've posted lots of comments and even whole posts critical of Eliezer on this point or that point and have been upvoted heavily because I made my point and defended it well.

So I'm not sure the karma system makes it so you can't voice contrarian opinions. The karma system seems to enforce the idea that you defend what you say competently.

Case in point: Mitchell's heavily upvoted comment to which we are now responding.

It seems to me that the karma system needn't foster any actual intolerance for dissent among voters for it to have a chilling effect on dissenting newcomers. If a skeptical newcomer encounters the site, reads a few dozen posts, and notices that posts concordant with community norms tend to get upvoted, while dissonant ones tend to get downvoted, then from that observer's perspective the evidence indicates that voicing their skepticism would be taken poorly -- even if in actuality the voting effects are caused by high-visibility concordant posts belonging to bright and well-spoken community members and high-visibility dissonant posts belonging to trolls or random crackpots (who in turn have incentives to ignore those same chilling effects).

Without getting rid of the karma system entirely, one possible defense against this sort of effect might be to encourage a community norm of devil's advocacy. I see some possible coordination problems with that, though.

If the community norms are ones we don't endorse, then sure, let's overthrow those norms and replace them with norms we do endorse, in a targeted way. Which norms are we talking about, and what ought we replace them with?

Conversely, if we're talking about all norms... that is, if we're suggesting either that we endorse no norms at all, or that we somehow endorse a norm while at the same time avoiding discouraging contributions that violate that norm... I'm not sure that even makes sense. How is the result of that, even if we were successful, different from any other web forum?

I was trying to remain agnostic with regard to any specific norms. I'm not worried about particular values so much as the possibility of differentially discouraging sincere, well-informed dissent in newcomers relative to various forms of insincere or naive dissent: over time I'd expect that effect to isolate group opinion in ways which aren't necessarily good for our collective sanity. This seems related to Eliezer's evaporative cooling idea, except that it's happening on recruitment -- perhaps a semipermeable membrane would be a good analogy.

I tried a few times to question the reputation system here myself, or to ask if there are posts or studies showing that such systems subdue trolling without sacrificing truth and honesty, and that reputation systems do not cause unjustified conformity. Sadly the response is often downvotes mixed with angry replies.

It would be nice if there were more studies about reputation systems. I think the anti-spam capability is pretty obvious, though.

We will be seeing more reputation systems in the future - it is pretty good that this site is trying one out, IMHO.

Is the groupthink here worse than it was on OB or SL4? Not obviously. IMO, the groupthink (which, incidentally, I would agree is a systematic problem) is mostly down to the groupies, not the karma system.

I would never suggest that LW is junk, but if it has the slightest problem with someone coming here and proclaiming that you are all wrong, then something is indeed wrong.

Not really - the internet is full of nonsense - and sometimes it just needs ignoring.

I got replies from people as diverse as Douglas Hofstadter, Greg Egan, Ben Goertzel, David Pearce, various economists, experts, and influencers.

Cool! You posted some material from Ben - but it would be interesting to hear more.

Ben made some critical comments recently. Douglas Hofstadter has long been a naysayer of the whole area:

If you read Ray Kurzweil's books and Hans Moravec's, what I find is that it's a very bizarre mixture of ideas that are solid and good with ideas that are crazy. It's as if you took a lot of very good food and some dog excrement and blended it all up so that you can't possibly figure out what's good or bad. It's an intimate mixture of rubbish and good ideas, and it's very hard to disentangle the two, because these are smart people; they're not stupid.

Greg Egan wrote a book recently parodying the SIAI. David Pearce has some very different, but also pretty strange, ideas of his own. So maybe you are picking on dissenters here.

What is wrong, and what will probably never change, is that mere ideas are sold as 'laws' which are taken seriously to a dangerous degree by some individuals here. This place is basically breeding the first group of rationalists committed to doing everything in the name of expected utility.

That is not obviously wrong. That's just down to the model of a rational agent which is widely accepted around here. If you have objections in this area, I think they need references - or some more spelling out.

LW is nearly perfect

Seriously?

The only way this statement could be true is if the question you asked is so specifically qualified that it loses all meaning.

Also, even asking that question is epistemically dangerous. Ask a specific, meaningful question like "Does the LW community improve the career of college students in STEM majors?" Pose a query you can hug.

The cults are always wrong, and not just a little wrong but systematically wrong, with whole anti-epistemologies developing to bear the load. LW rationality activism can be eccentric, in the sense of paying attention to things that most folk don't see as particularly interesting or important, but it's interesting to see where a social movement would go whose primary directive is to actually protect itself from error (instead of just keeping up appearances) and to seek better methods for doing so. This is almost unexplored territory, if you don't count the global scientific community as a preceding example (a single data point, from which it is hard to draw generalizations).

I believe we are significantly below a critical mass where the community becomes unlikely to fizzle out into obscurity, but given the non-arbitrary focus of the activism, it's possible that the movement will persist, or get reincarnated elsewhere.

Rationalist evangelism from LW might take two forms, boring and familiar, or eccentric and cultish. The boring and familiar form of rationalist evangelism could encompass opposition to religion, psych 101 lectures about cognitive bias, and tips on how optimism and clear thinking can lead to success in mating and moneymaking. An eccentric and cultish form of rationalist evangelism could be achieved by combining cryonics boosterism, Bayes-worship, insistence that the many-worlds interpretation is the only rational interpretation of quantum mechanics, and the supreme importance of finding the one true AI utility function.

I disagree with the connotation that the latter constitute a "dark side" of Less Wrong and that if you take them seriously and try to persuade people of them you're being cultish.

Nevertheless, it's very easy to be perceived as such. I try hard not to be cultish, and I think I succeed. But I fail hard at not sounding cultish.

It doesn't even take cryonics. Even talking about atheism with my (agnostic) Mom was enough. Stating "I don't believe in God" was acceptable but "God doesn't exist, and those who believe it does are mistaken" was cultish. Such a level of confidence simply isn't allowed. And when I tried to talk about immortality and uploading, she was convinced I had a problem.

People will see the "dark side" in Less Wrong even though there is none.

I think it's likely that the second option will be true of someone - that at least one, and maybe several people, who are contributing to this site or just reading it, will, years from now, be making discoveries, in psychology or in some field that doesn't yet exist, and it will be because this site warped their sensibility (or straightened it).

But it is also likely that there is someone out there who will be affected negatively by this site. Your statement is only slightly relevant to the question of whether LessWrong is overall a positive influence. In other words, it's rhetorical dark arts.

But it is also likely that there is someone out there who will be affected negatively by this site.

You mean, someone who would have made a positive contribution, but they were intellectually (or otherwise) sidetracked by what they read here? That hadn't occurred to me. (Which is why my statement wasn't "rhetorical dark arts" - I genuinely didn't think of that possibility; I only thought in terms of "LW leads somewhere" or "LW leads nowhere".)

Which is why my statement wasn't "rhetorical dark arts"

I apologize, I did not mean to give the connotation of malice on your part, merely danger for readers.

I enjoyed reading this post, in no small part due to the narcissistic pleasure of discussing a small community I am (to some degree) a part of. If there were some option to split a comment into a thread, this seems like an ideal place to use it.

At the very least, lesswrong provides a fairly high quality forum for discussion of topics appealing to nerdy types. Similar communities include stackexchange and math/science blogs, which are 'harder' than lesswrong; sites like reddit and xkcd forums tend to be on the 'softer' side of the spectrum. Lesswrong, so far, is the best open forum for the discussion of 'futurist' issues.

What lesswrong lacks in comparison to 'harder' sites is a broad base of subject specialists. The scope of discussion on lesswrong is quite broad; however, the base of expertise on lesswrong is very narrow, as it consists mainly of SIAI members. It would be hard to argue that lesswrong would not benefit from the active participation of more experts from domains relevant to its interests: economics, psychology, computer science, mathematics, statistics. However, up to now, LW has attracted few such users, due perhaps to its low profile and the fact that the core members of the community do not seem to prioritize subject expertise. Yet until LW has that kind of userbase, it seems unlikely any high-impact developments will arise from activity on LW (excluding efforts from SIAI members themselves). In contrast, mathoverflow.net seems like precisely the recipe for combining expertise through the internet for the advancement of the sciences.

Perhaps what is more important is the emergence of the lesswrong "rationalist" subculture itself. Future historians might lump this subculture with the larger "atheism" subculture, which has much in common with the LW community in terms of demographic composition. What would be much more interesting is if the LW community grew to incorporate a much more demographically diverse userbase.

I would say lesswrong has had a moderate impact on my intellectual development since I started reading it as a college student. It was satisfying to see that others (such as Yudkowsky) were able to notice "what was wrong with philosophy" and in fact, this allowed me to divert my attention to preparing for statistics; on the whole, I probably would have spent more time arguing about various issues on the internet if Yudkowsky had not already argued those points (and probably much better than I could have.) Lesswrong/OB did not alert me to concerns about artificial intelligence (I was already thinking about them before encountering this site) and so far it has not succeeded in dissuading me from intending to do research that may contribute to the eventual development of artificial general intelligence.

Where do you link them to? The Sequences are still the most stellar content here. The day to day content, being whatever its posters happened to feel like posting that day, cannot be expected to form a whole that is either coherent or of generally high quality. Like any discussion forum, it is primarily of interest to people who are already interested in its subject.

Suggestion for solution: interested parties taboo "rationality" and privately write a (roughly) three sentence summary of just what it is we think we're about. At a later time, interested parties post their summaries, and then try to pick out common themes and keywords to identify a unique label that doesn't have the name collision problem identified in the OP.

And then, about three minutes later, we'd have to taboo said common keywords, write a three sentence summary of each, and present it to each other in order to pick out common themes and keywords, repeating this process over and over... and never actually getting to talk about what we were supposed to talk about.

Can't we just argue over what is being done in the name of rationality, rather than arguing over the word "rationality" itself?

Well, you won't have to worry about it unless the grandparent hits 5 karma points, that being the threshold I had privately decided on for doing a discussion post following up on the idea. But if it does come to that, care to state your personal odds for the occurrence of your predicted regress?

Throwing up numbers would just be pointless guessing for the sake of pointless guessing, but because of the tight-knit nature of the blog in question (meaning that people here tend to share the same ideas and values concerning rationality), I'd say that the chances of regression for the LessWrong community would be significantly lower than the chances of regression for two random individuals.

Throwing up numbers would just be pointless guessing for the sake of pointless guessing

If you do it systematically, picking odds and tracking them can help with your calibration. Doing it in public helps because it puts a little bit of status on the line, which makes one care about getting it right.
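
For concreteness, here is a minimal sketch of what systematic tracking might look like; the bucketing scheme and the function name are illustrative assumptions on my part, not an established LW tool:

```python
# Hypothetical sketch of calibration tracking: record a stated probability for
# each prediction along with whether it came true, then compare the average
# stated confidence within each bucket to the observed hit rate.
from collections import defaultdict

def calibration_report(predictions, bucket_width=0.1):
    """predictions: list of (stated_probability, came_true) pairs."""
    buckets = defaultdict(list)
    for prob, came_true in predictions:
        # Group predictions into buckets of width bucket_width; clamp 1.0 into the top bucket.
        index = min(int(prob / bucket_width), int(1 / bucket_width) - 1)
        buckets[index].append((prob, came_true))
    for index in sorted(buckets):
        entries = buckets[index]
        stated = sum(p for p, _ in entries) / len(entries)
        observed = sum(1 for _, hit in entries if hit) / len(entries)
        print(f"stated ~{stated:.0%}: {observed:.0%} correct over {len(entries)} prediction(s)")

# Example: predictions made at 80% confidence should come true about 80% of the time.
calibration_report([(0.8, True), (0.8, True), (0.8, False), (0.6, True)])
```

If the observed hit rate in a bucket consistently falls below the stated confidence, that is exactly the overconfidence this kind of tracking is meant to surface.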

We've had some discussions on the term "rationality" before. I agree that having a bunch of empiricists going around calling themselves "rationalists" is very unfortunate given the prevalence of the rationalism vs. empiricism debate, and am open to the idea of a replacement term.

However, rather than discussing the merits of any specific alternatives in this regard, I believe it is best that we hold off on proposing solutions and discuss the problem first. In particular, we should try to come up with some specifications for any new term to satisfy (such as being free of confusing double meanings), and consider not switching names at all if the benefits do not outweigh the transition costs (not the least of which would be convincing the entire community to switch).

EDIT: Removed mention of proposed solution.

However, rather than discussing the merits of (EDIT: Removed) in this regard, I believe it is best that we hold off on proposing solutions and discuss the problem first.

Too late.

It's not too late for me to edit my post!

Yes, good idea. Here are some things I can think of:

  • It should be free of confusing double meanings.
  • It should not be silly.
  • It should be suggestive of what we're about.

Any other thoughts?

From a marketing perspective, when introducing something novel it is frequently a good idea to name things descriptively, so that if someone lacks the right keywords but has a felt need for "your thing" they find you anyway because you already use those keywords in your name or associated description. This is potentially a project in itself because you have to do market research and figure out what inferentially distant potential "customers" think, when they think about your stuff.

Looking at the tag cloud for the discussion area I would say that what we actually do in the discussion area is "talk and share links about psychologically rational meta-philosophy concerned with artificial intelligence and the singularity". Based on the tag cloud, the front page is for "meetup scheduling and discussion of a bayesian moral meta-philosophy inspired by science and artificial intelligence, with specific concern for the psychology of self deception and akrasia".

Another possible angle: Sometimes it's useful to identify and reclaim the name applied to you by your critics, who are likely to have a good handle on what about you is distinctive enough to object to. (I've heard that Christian sects like the Puritans did this sort of epithet reclamation back in the 1600s.) If that's a useful strategy we might impatiently adopt Jaron Lanier's term "cybernetic totalism" or if we have more patience then wait until someone else comes up with a more piercing criticism :-)

If we follow Lanier's term maybe we could be said to be a community of cybernetic totalists dedicated to refining the human art of "cybernetic efficacy"?

Sometimes I actively appreciate the fuzziness of the status quo. Sometimes if you have a stable name for something, it causes people to go crazy about it in specific ways and imagine it has an "essence" that grows out of its dictionary definition, and then you have this whole stupid conversation about words instead of talking about regularities within the world and expectations about them. The name of the site is "less wrong", and that phrase is pretty hard to go wrong with. The Stoics were named after the fact that they hung out around a specific building in Athens called the "Stoa Poecile". Maybe instead we could just talk about "that which is praised and cultivated by LessWrong commenters", and let the substantive claims dominate the interpretative processes, rather than have the pre-existing dictionary definition of a term appropriated by us for discussion purposes?

I agree with the first and third bullet points, but could you please elaborate on what you mean by the second one?

Also, I would like to add that we usually divide rationality into epistemic rationality and instrumental rationality, as described in What Do We Mean By "Rationality"? So whichever name we pick, it has to describe both branches, which is tricky. Maybe we should ask ourselves what the connection is between the two, and focus on that?

I agree with the first and third bullet points, but could you please elaborate on what you mean by the second one?

Just that we shouldn't call it "uber-better-than-everyone-else-ism" or the "FindTheRightAnswer™ method" or, you know, something silly like that. (Though what I actually had in mind when I wrote that bullet point was Francis Bacon's "idols of the marketplace" and the like.)

I'm reminded of Peirce's invention of the term pragmaticism, as distinct from "pragmatism". For that matter, there are significant points of similarity between LW-type rationality and pragmaticism, such as the notion of belief being evidence-based anticipation of future experience.

Oh my..... this is the first time I've had the words stolen out from under my fingers before I could post them. Hmm, and beaten to the punch on bringing up Peirce to boot.

There's also this term "Bayesian" that's been floating around. Apparently it means "I try to change my mind when I am wrong" or something like that.

"Bayesian" has an analogous disadvantage in that it already has a technical meaning. I don't think it would be much of an improvement.

For my own part, I endorse being more precise and eschewing ambiguous labels altogether. I don't usually find it too difficult to talk about being explicit about the evidence and reference classes supporting my conclusions, or accepting counterarguments without becoming emotionally defensive, or whatever else I happen to be talking about.

But others' mileage clearly varies.

Good posting and good discussion. But I think it misses one important thing about the nature of the LessWrong experience and community. We are not just 'into' rational belief and rational action. We are most importantly into rational discussion and rational communication.

I would add to your list of the five meanings of 'rational' above:

  • Being effective at discussing things in ways that are fruitful and productive, and not in ways that waste time and lead nowhere.

While one or more items from your list of five might be taken as the thing that we hope characterizes us as rational individuals, I think it is my sixth item that characterizes us as a rational community.

I'm very leery about using terms like "LessWrongism" or "Yudkowskian" anything. If the problem with "rationalism" is that people won't know what it means, the above two are worse. People won't know what it means and it will make us sound like a cult.

The problem with "rationalism" is not that people don't know what it means; it's that it means something different to most people than it does to us. With terms like "LessWrongism" or "Yudkowskian", at least people will realize that they don't know what they mean.

The rationalism-empiricism philosophical debate is somewhat dead. I see no problem in using "rationalism" to mean LW rationalism. "Rationality" (1989) by Rescher defines rationality in the way LW uses the word, but doesn't use "rationalism", ostensibly because of the risk of confusion with the rationalism-empiricism debate. Neither LW nor average people are subject to the same limitations as the academic Rescher, so I think it is prudent to overwrite the meaning of the word "rationalism" now.

Maybe "rationalism" used to mean "rationalism in the rationalism-empiricsm debate", but the concept of "rationality" has become very important during the past century, and that "rationality" means the LW type rationality. Yet, "rationality" is only a method. What LW clearly advocates is that this method is somehow the best, the only right method, the only method, a superior method, or a method that ought to be used. Hence, LW is somewhat founded on a prescriptive belief that "rationality" is a good method. It is very reasonable to call such a belief "rationalism", as someone without belief in the superiority of rationality could still use rationality without being a rationalist.

I assume that last one comes from REBT?

I don't know how we should accommodate (or respond to) the popular and REBT uses of the term. The difference between our usage and the philosophical version seems less important. If we advocate the Bayesian view of probability as an extension of logic, then we could easily speak of Bayesian rationalism.

I mean, we'd have to explain why the term excludes circular 'Bayesian proofs' of Christianity, but in effect we have to do that anyway.

I've never heard of REBT, so I don't think anything in my post came from it.

Let's consider the possibility that labeling makes irrational disagreement easier. If you and I disagree about something, maybe that's because at most one of us is right. But once we learn that you're a Green and I'm a Blue, the discussion is over--we're just different stripes, that's all.

That said, if we're going for a term that has minimum transition costs associated with switching to it, why not technical rationality or quantitative rationality? They're just different enough from plain ol' rationality that they might prevent confusion.

"LessWrongism" is more likely to stick.

This community doesn't seem too vulnerable to claiming 'ours is what rationality REALLY means', which is the most harmful part of all this. So I wouldn't worry. If you keep forgetting others might use the term differently, it's a good sign that you might want to read other sources on connected issues.

This community doesn't seem too vulnerable to claiming 'ours is what rationality REALLY means'

YMMV. I see quite a bit of that sort of thing. (e.g., where it seems to me to be assumed that "rationality", let alone "rationalism", is what we do here and not what we don't do here.) It's a natural tendency.

Fair enough. I'm speaking relatively, and thinking in particular of various atheist forums where people would tell newcomers that they "didn't know what 'atheism' meant" and similar, in incredibly aggressive and/or patronising tones. This meant that if people turned up with a different understanding of how the vocabulary was used, they were immediately alienated.

Rationality's a bit tougher because I think the primary sense is a fairly broad/informal one about truth or outcome maximisation. And so part of the argument is that this site is attempting the same thing as others but is doing better at achieving that thing.

I haven't actually seen someone turn up and talk about philosophical rationalism only to get a reply of the 'that's not what rationalism means, LOL' kind.

Ah yeah, speaking relative to other forums, I can see your point. It appals me sometimes how thick some people self-labeling as "atheist" can be.

Rationality and rationalism suffer the "what do you mean by that?" problem, yes. I think LW could do with more understanding that the local usage is quite explicitly LW's (or EY's) understanding of the word. Not that I consider this a big problem - and I expect the world will label the local variety in some unambiguous manner shortly after EY's rationality book comes out. (I think "Yudkowskian rationalism" is a hot favourite for what will become the philosophical term, clunky and guru-identifying as it sounds.)

I think if someone showed up hot on discussing the various philosophies that have been labeled "rationalism", they would get a gentle and polite "that's not how we use the term here" and a pointer to a relevant EY post or two.

(I can picture the stupid bad reviews of EY's book confusing multiple historical understandings of "rationalism" and assuming that's what it's for already. Like the reviews of The God Delusion only worse.)

Well, they are atheist in almost any of the likely or useful senses. And to be fair, a lot of the ones I found irritating were highly intelligent. The problem was a classic one for an awful lot of bright people. They'd been surrounded by people who were fairly blatantly irrational, or at least appeared so to a certain kind of mind that takes the religious claims literally. They'd seen these ideas as wrong, and not understood how anyone could hold them. Then they'd found a group of people who'd rejected the same things and so naturally made very strong emotional links to that as an in-group, and over-estimated its general rationality.

The thing that makes LW exceptional is that it could so easily be very like this (it has so many of the ingredients), but there is a drive to focus in on the most difficult questions and to constantly challenge people parroting in-group views. After a few weeks on here, I thought of some questions that would undermine the leading views here and that would usually be laughed off as not worth thinking about (for instance, how utilitarianism works when part of the question is how many people exist), but they've been dealt with here more thoroughly than anywhere else I've seen.

That philosophers have a specific meaning for the term is the least of its problems.

Yes, people say "rationality" rather than "rationalism," but they say "rationalist." I think this evokes "rationalism" and in particular evokes "-ism" (contra Vladimir). But the "rational" part is pretty bad, too. Probably worse.

national socialism

Boo.

It's worth noting that this has come up before.

Finally someone said it.

In most of the mainstream philosophy I've read, the word "rationalism" has been used, without qualification, to mean the second of these

Yes, this is my experience too. To use the words "rationalism" or "rationalist" in any other sense is 'doing it wrong', and a recipe for confusion.

The belief that we should come to know the world through knowledge of (and correction for) cognitive biases, and knowledge of (and correct use of) probability theory.

That seems to be close to what LW is about. Yet the more I learn, the more I am leaning towards experimentation and intuition. Probability is obviously correct when dealing with problems like the Monty Hall problem, and has endless other uses, yet it is almost unusable in daily life and breaks down given certain thought experiments. So depending on whether you approach some lower or upper bound, you either go with intuition to decide or with actual experimentation to gather other kinds of evidence, and use probability to verify the results.
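
As an aside, the Monty Hall case is easy to check empirically. Here is a minimal simulation sketch (illustrative only, not from the original comment) of why the probabilistic answer is "obviously correct" even when intuition protests: switching wins about two thirds of the time.

```python
# Hypothetical sketch: simulate the Monty Hall game many times and compare
# the win rates for staying versus switching.
import random

def monty_hall_trial(switch):
    doors = [0, 1, 2]
    prize = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither the player's pick nor the prize.
    opened = random.choice([d for d in doors if d != pick and d != prize])
    if switch:
        # The player switches to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

trials = 100_000
for switch in (False, True):
    wins = sum(monty_hall_trial(switch) for _ in range(trials))
    print(f"switch={switch}: won {wins / trials:.3f} of {trials} trials")
```

Running this prints roughly 0.333 for staying and 0.667 for switching, matching the textbook analysis.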

Yes, I was trying to summarize what LW is about. I had intuition in mind as one of the two types of thinking, but I failed to make that clear.