The following two paragraphs got me thinking some rather uncomfortable thoughts about our community's insularity:

We engineers are frequently accused of speaking an alien language, of wrapping what we do in jargon and obscurity in order to preserve the technological priesthood. There is, I think, a grain of truth in this accusation. Defenders frequently counter with arguments about how what we do really is technical and really does require precise language in order to talk about it clearly. There is, I think, a substantial bit of truth in this as well, though it is hard to use these grounds to defend the use of the term "grep" to describe digging through a backpack to find a lost item, as a friend of mine sometimes does. However, I think it's human nature for members of any group to use the ideas they have in common as metaphors for everything else in life, so I'm willing to forgive him.

The really telling factor that neither side of the debate seems to cotton to, however, is this: technical people like me work in a commercial environment. Every day I have to explain what I do to people who are different from me -- marketing people, technical writers, my boss, my investors, my customers -- none of whom belong to my profession or share my technical background or knowledge. As a consequence, I'm constantly forced to describe what I know in terms that other people can at least begin to understand. My success in my job depends to a large degree on my success in so communicating. At the very least, in order to remain employed I have to convince somebody else that what I'm doing is worth having them pay for it.

 - Chip Morningstar, "How to Deconstruct Almost Anything: My Postmodern Adventure"

The LW/MIRI/CFAR memeplex shares some important features with postmodernism, namely the strong tendency to go meta, a large amount of jargon that is often impenetrable to outsiders, and the lack of an immediate need to justify itself to them.  This combination takes away the selective pressure that stops most groups from going totally crazy.  As far as I can tell, we have not fallen into this trap, but since people tend to fail to notice when their in-group has gone crazy, this is at best weak evidence that we haven't; furthermore, even assuming that we are in fact perfectly sane now, it will still take effort to maintain that state.

Based on the paragraphs quoted above, having to use our ideas to produce something that outsiders would value, or at least to explain them in ways that intelligent outsiders can understand well enough to criticize, would create this sort of pressure.  Has anyone here tried to do either of these to a significant degree?  If so, how, and how successfully?

What other approaches can we take to check (and defend) our collective sanity?


I'm unlucky enough to know a few postmodernists, and what I find most striking about them is that they try very hard to stay out of conflict with each other.

That makes sense: because they lack a clear method for assessing who (if anybody) is in the right, their arguments are unproductive, frustrating, and can get quite nasty.

So I don't think we're too similar to them. That said, the obvious way to check our sanity would be to have outsiders look at us. In order to do that, we'd probably have to convince outsiders to give a fuck about us.

Algernoq
As an outsider, here are some criticisms. I've read all of HPMOR and some of the sequences, attended a couple of meetups, and am signed up for cryonics. But I have little interest in reading more of the sequences and no interest in more in-person meetings.

* Rationality doesn't guarantee correctness. Given some data, rational thinking can get to the facts accurately, i.e. say what "is". But deciding what to do in the real world requires non-rational value judgments to make any "should" statements. (Or, you could not believe in free will. But most LWers don't live like that.) Additionally, huge errors are possible when reasoning beyond limited data. Many LWers seem to assume that being as rational as possible will solve all their life problems. It usually won't; instead, a better choice is to find more real-world data about outcomes for different life paths, pick a path (quickly, given the time cost of reflecting), and get on with getting things done. When making a trip by car, it's not worth spending 25% of your time planning to shave off 5% of your driving time. In other words, LW tends to conflate rationality and intelligence.

* In particular, AI risk is overstated. There are a bunch of existential threats (asteroids, nukes, pollution, unknown unknowns, etc.), and it's not at all clear that general AI is a significant one. It's also highly doubtful that the best way to address this threat is writing speculative research papers: I have found in my work as an engineer that untested theories are usually wrong for unexpected reasons, and it's necessary to build and test prototypes in the real world. My strong suspicion is that the best way to reduce existential risk is to build (non-nanotech) self-replicating robots using existing technology and online ordering of materials, and use the surplus income generated to brute-force research problems, but I don't know enough about manufacturing automation to be sure.

* LW has a cult-like social structure.
AlexanderRM
If I may focus on just one of your critiques, the thing about the cult-like structure: I'm not sure whether that actually produces the cult effect on LW or not, but the general idea both intrigues and terrifies me. Especially the "contempt for less-rational Normals" thing. I haven't noticed that in myself, but the possibility* of it happening by itself is... interesting, given what I know of LW. I have almost never seen anyone on LW condemn anyone specific as "irrational", except maybe a couple of celebrities, or do anything that amounts to actively urging others to sever ties; but I have this image that individuals on LW could often end up severing ties with people they see as less rational, without anybody actually intending it or even realizing it.

*Or at least, my views of people whom I perceive as less rational are pretty much unchanged from before LW, which is the important part. Especially in the case of social interaction, rather than discussion of serious issues. It's possible I'm unusual among nerds on this; I tend not to care much whether the people I interact with are especially smart, or whether our interactions are anything but vapid nonsense, as long as I enjoy interacting with them.
9eB1

Scott Alexander recently posted a link to this article, which was very interesting. After reading it, the difference between postmodernism and LW rationality seems very large. It doesn't directly address your point, but you may find it interesting.

Separately, I think that you are exaggerating the tendencies LW shares with postmodernism. While LessWrongers love going meta (and they seem to love it even more in person than on the site), what you actually see in discussions here and on rationality blogs is requests to move in either the meta or the object-level direction as required by the interlocutor. CFAR specifically has lessons on moving toward the object level. Comparing the jargon of postmodernism with that of LessWrong isn't an equal comparison either. Postmodernism is oftentimes intentionally obscure, and sometimes redefines words to very surprising meanings (see the above linked article), while on LessWrong people seem to go to some pains to coin new language only when old language is insufficient, and explicitly consider what appropriate names would be (the major exception to this is perhaps language coined during the time of the sequences that is still widely used). LW doesn't have a ...

LessWrong people seem to go to some pains to coin new language only when old language is insufficient

The pains don't always stretch to learning philosophy, which EY hasn't done, and advises against, with the result that LW jargon in fact often does reinvent philosophical jargon.

9eB1
Of course, that's why I said "some pains" and not "great pains." People are aware of the issue and generally avoid it when it's easy to do so, or there will be comments pointing out that something is just a different name for an older term. Also, I excluded Eliezer's sequences and the resulting jargon for a reason.
Emile
... but does so in a way that is probably more accessible to the average 21st-century geek than the original philosophical jargon was, so it's not a great loss: there are more geeks who don't understand philosophical jargon than philosophers who don't get geek references.

It is a great loss, because the original terms are nowhere to be seen. So if someone wants to read, say, non-amateur writing on the idea and its history, they're out of luck.

Emile
I sorta agree - I guess it depends on how valuable it is to be able to read Philosophy; some (Lukeprog, Eliezer) seem to consider it mostly a waste of time, others don't, and I'm not really qualified to tell.
David_Gerard
We're talking here specifically about the amateur philosophy, presented with neologisms as if it's original thought, when it simply isn't. You seem to be saying that it's valuable if EY writes about it but not if professional philosophers do - surely that's not what you mean?
TheAncientGeek
It's a great loss because it prevents constructive dialogue between the two communities. There is quite a lot that US broken in the sequences... not so much in terms of being wrong as in terms of being unclear, addressing the wrong question, etc. ...and it looks likely to stay that way.
Emile
That was supposed to be "IS", right?
Jayson_Virissimo
Yes, this is why I recommend that LWers read Robert Nozick.
TheAncientGeek
Well, I like Nozick, but I like a lot of other people as well.
David_Gerard
Are you sure? One of the biggest problems with LW is inventing jargon for philosophical ideas that have had names for a couple of thousand years. This is problematic if the interested reader wants to learn more.
Nornagest
Example? I believe you, but every time I've personally gone looking for a term in the philosophy literature I've found it.
David_Gerard
e.g. "fallacy of grey" is an entirely local neologism.
Kaj_Sotala
What's the standard term?
[anonymous]
It's a form of the continuum fallacy.
David_Gerard
gwern holds that it's actually false balance. Might be a mix. But one or both should have been named IMO.
[anonymous]
That's interesting. False balance doesn't seem to replace anything with a continuum. In particular I'm having trouble rephrasing their examples as fallacy of grey examples. But, eh, I trust gwern.
TheAncientGeek
Organisation A can be like organisation B in every way except their doctrine. It has been remarked, not least by rationalwiki, that LW is like Ayn Rand's Objectivism, although doctrinally they are poles apart. It is perfectly possible for an organisation to pay lip service to outreach without making the changes and sacrifices needed for real engagement.
9eB1
With respect to the point that two organizations CAN be similar except in doctrine, I agree, but I don't think that's true for Less Wrong and postmodernism, hence my comment. I was directly addressing the points of comparison the poster argued for.

If you are speaking of Objectivism the organization led by Ayn Rand rather than Objectivism the collective philosophy of Ayn Rand, the differences are pretty massive. Objectivism was a bona fide cult of personality, while the vast majority of people on Less Wrong have never met Eliezer and he no longer even engages with the site. Watch the first part of this interview and compare it with Less Wrong. Perhaps this could be argued specifically of the rationalists living in the Bay Area, but I don't know enough to say.

The article on rationalwiki has been updated and now seems substantially fairer than it was when I last saw it a few years ago. It doesn't draw any direct comparison to Objectivism, and now says that the "appearance of a cult has faded." That said, I don't put much stock in their opinions on such things.

It doesn't seem to me that people on Less Wrong merely pay lip service to outreach (although once again we are certainly in agreement that such a thing is possible!). There seem to be a lot of posts on meetups here, advice on how to get new attendees, etc. Making the "changes and sacrifices needed for real engagement" isn't straightforward in practice (and engagement isn't an unqualified good). You have to draw new members without betraying your core principles and without the site becoming a place the existing members don't want to participate in.
TheAncientGeek
Objectivism did and does have plenty of adherents who never met Rand. Personal contact isn't a prerequisite for a personality cult.
9eB1
It seems you are correct. I had a definition in mind for a cult of personality which was actually much narrower than what it actually means, upon looking it up. Nonetheless, so far you've implied a lot more than you've actually stated, and your arguments about "what is possible" are less interesting to me than arguments about "what is." Frankly, I find argumentation by implication annoying, so I'm tapping out.
Luke_A_Somers
Quick question: how much do these doctrinal differences matter?
TheAncientGeek
Matter to whom? If you join that kind of organisation, you are probably looking for answers. If not, maybe not.

What other approaches can we take to check (and defend) our collective sanity?

Do rationalists win when confounding factors of intelligence, conscientiousness, and anything else we can find are corrected for?

Do they make more money? Have greater life satisfaction? Fewer avoidable tragedies? Reliably bootstrap themselves out of mental and physical problems?

I'm not sure what the answer is.
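One way to make "corrected for" concrete is a regression that includes the confounders as controls. Below is a minimal sketch, assuming the statsmodels library; the variable names and the simulated data are entirely made up for illustration:

```python
# Hypothetical sketch (not from this thread): regress an outcome on
# rationalist-identification plus measured confounders, then inspect the
# coefficient that remains after the controls. All names and numbers
# below are invented for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000

# Simulated population: income depends only on IQ and conscientiousness,
# so the confound-corrected "rationalist effect" should come out near zero.
iq = rng.normal(100, 15, n)
conscientiousness = rng.normal(0, 1, n)
is_rationalist = rng.binomial(1, 0.1, n)
income = 400 * iq + 5000 * conscientiousness + rng.normal(0, 10000, n)

# OLS with the confounders as controls; add_constant prepends an intercept.
X = sm.add_constant(np.column_stack([is_rationalist, iq, conscientiousness]))
result = sm.OLS(income, X).fit()
print(result.params)  # order: const, is_rationalist, iq, conscientiousness
```

On data simulated this way, the coefficient on is_rationalist estimates whatever association remains after controlling for IQ and conscientiousness; in real life the hard part is measuring the confounders at all.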

Will_Newsome
I suspect the answer is "no". But I don't know why you would correct for intelligence &c. in your analysis. Attracting a group of intelligent people is kinda hard to pull off and of course many, many tradeoffs will be made to make it possible.
[anonymous]
People who are doing well enough already won't be drawn to something with self-improvement as one of its de facto major selling points. If rationalists produce valuable memes, those memes are likely to enter popular culture and lose their association with rationalists. Who credits sociology for inventing the term "role model"?
skeptical_lurker
This is probably true in general, but LW overlaps with H+ memes, and H+ is radical self-improvement, meaning that LW might attract people who are doing well but aspire to be doing even better. Besides, I think the people who look for self-improvement because they are not doing well would be more interested in, e.g., tackling depression, which is a small minority of LW content.

This reminds me of this SMBC. There are fields (modern physics comes to mind too) where no one outside the field can understand what its practitioners are doing anymore, yet which appear to have remained sane. There are more safeguards against the postmodernists' failure mode than this one. In fact, I think there is a lot more wrong with postmodernism than that its practitioners don't have to justify themselves to outsiders. Math and physics have mechanisms determining which ideas within them get accepted, and these imbue them with their sanity. In math, there are proofs. In physics, there are ex...

The LW/MIRI/CFAR memeplex shares some important features with postmodernism, namely the strong tendency to go meta, a large amount of jargon that is often impenetrable to outsiders and the lack of an immediate need to justify itself to them.

Mathematics also has all of these. So I don't think this is a good argument that LW/MIRI/CFAR is doing something wrong.

Based on the paragraphs quoted above, having to use our ideas to produce something that outsiders would value, or at least to explain them in ways that intelligent outsiders can understand well enough to criticize, would create this sort of pressure.

...

Here is something that could be useful to have, but would require a lot of work and talent. As a side effect, it would solve the problem mentioned in the article:

Rewrite parts of the Sequences for a wider audience.

For example, the Bayesian math. Rewrite the explanation in a way that is easy for a high school student to read, without any LW lingo. A lot of pictures. Sample problems. Then discuss the more complex topics, such as how you can never get 0 or 1 as a result of Bayesian updating, conservation of expected evidence, etc. Then distribute the book as p...
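For what it's worth, both of those facts fit in a few lines of standard probability notation; here is a minimal sketch of the textbook math (my own summary, not a quote from the Sequences):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Why Bayesian updating never yields exactly 0 or 1:
\noindent Bayes' theorem:
\[
  P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)} .
\]
If $0 < P(H) < 1$ and both likelihoods are strictly positive, every term in the
denominator is positive, so $0 < P(H \mid E) < 1$: no observation with
non-extreme likelihoods can move a non-extreme prior to certainty.

% Conservation of expected evidence:
By the law of total probability,
\[
  P(H) = P(H \mid E)\,P(E) + P(H \mid \neg E)\,P(\neg E) ,
\]
i.e.\ the prior equals the expected posterior, so if seeing $E$ would raise
$P(H)$, then not seeing $E$ must lower it.

\end{document}
```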

Insularity is always dangerous, and too much internal jargon can scare off outsiders. However, postmodernists are quite unlike the LW-community. For instance, postmodernists tend to be anti-scientific and deliberately obscurantist, as Alan Sokal showed by publishing a fake article in a postmodernist journal. Hence I don't think the analogy works very well.


As far as I can tell, we have not fallen into this trap, but since people tend to fail to notice when their in-group has gone crazy

Given the number of contrarians on LW who open discussions on whether or not LW is a cult, I don't really think we have a problem with a lack of self-criticism.

Based on the paragraphs quoted above, having to use our ideas to produce something that outsiders would value, or at least to explain them in ways that intelligent outsiders can understand well enough to criticize, would create this sort of pressure. Has anyone here tried to do either of these to a significant degree?

...
TheAncientGeek
How much of EY's material has been retracted or amended under critique? AFAICT, the answer is none.
bramflakes
Complexity bound and speed limit for evolution
ChristianKl
EY's April Fools post would be an example of something retracted from LW because of criticism. I still consider retractions to be a good metric for criticism. Not everyone thinks that honest mistakes should be retracted.
Luke_A_Somers
He did solicit amendments for republication. Going back and changing old blog posts is considerably more... revisionist.
Adele_L
IIRC, he retracted one of his earlier articles on gender because he doesn't agree with it anymore.
AlexanderRM
On the subject of people opening discussions about whether LW is a cult, I'd like to suggest that while such discussions are useful to notice, that metric alone is not enough to determine whether LW has become a cult: we could easily wind up constantly opening discussions about whether LW is a cult, patting ourselves on the back for having opened the discussion at all, and then ending the discussion.

Incidentally, on a somewhat unrelated note about cultishness: I don't know how other LWers feel about it, but when I personally think about the subject I feel a really, really strong pull towards concluding outright that LW is not a cult and calling it settled, both because that feels less scary and because it takes less work than constantly guarding against cultishness (reading some of EY's writing on how cultishness is something that needs to be constantly guarded against terrified me). I doubt I'm the only one to feel that way, so it's something I thought would be good to mention.

CFAR seems to be trying to use (some of) our common beliefs to produce something useful to outsiders. And they get good ratings from workshop attendees.

Stefan_Schubert
True. CFAR is anything but insular. Their (excellent) workshops are based on outside research and they do very well at reaching out to outsiders. They have Slovic and Stanovich as advisors, Kahneman has visited them, etc.
AlexanderRM
A couple of questions: what portion of the workshop attendees self-selected from among people who were already interested in rationality, compared to the portion that randomly stumbled upon it for some reason? And even if it were outsiders... I suppose that guards against the specific postmodernist failure mode. I think the checking that comes from having to explain to outsiders isn't the most important thing that checks engineering, though: the most important one is having to engineer things that actually work. So rationality producing people who are better at accomplishing their goals would be the ideal measure.
Mestroyer
Don't know, sorry.

or at least explain them in ways that intelligent outsiders can understand well enough to criticize

Based on feedback, I think I achieved that through my "Smarter than Us" booklet or through the AI risk executive summary: http://lesswrong.com/lw/k37/ai_risk_new_executive_summary/

Kaj_Sotala
What's the outsider feedback on those been like?
Stuart_Armstrong
Quite positive, but scarce.

Insularity in this case is simply a case of long inferential distances. It seems like senseless noise to the outside because that's what compressed information looks like to anyone who doesn't have a decoder.
Every group that specializes in something falls into that, and it's healthy that it does so. But we should want a PR office only if we wanted to sell our worldview to others, not in order to check our own sanity.

My understanding is that postmodernists face career incentives to keep the bullshit flowing. (To change my mind on this, find me an online community of enthusiastic amateur postmodernists who aren't trying to make it in academia or anything.)

David_Gerard
Critics. Art, literary, music. Postmodernism is largely art criticism purporting to take everything as a text.
RomeoStevens
That's the most succinct explanation of postmodernism I've seen.
David_Gerard
This is why anyone who knows anything about postmodernism looks at science fans' straw postmodernism and goes "wtf". It turns out a set of paintbrushes doesn't make a good hammer, well gosh.
AlexanderRM
...could you clarify what you mean by "science fans' straw postmodernism"? I think "straw postmodernism" would generally imply that the science fans in question had invented the idea specifically to make fun of postmodernism (as a strawman). From the context, however, I get the impression that the science fans in question are themselves postmodernists and that you used the term "straw" to mean something like "not what postmodernism was intended to be". (Also, regarding the earlier post, come to think of it: are there online communities of enthusiastic amateur art critics who aren't trying to make it in any career? I honestly don't know myself; there could easily be.)
[anonymous]
It could be argued that the neoreactionaries are an example. (Moldbug especially.)
ChristianKl
You can criticise neoreactionaries on many fronts, but they aren't postmodernists.
TheAncientGeek
In style or substance...and which is more important...to them?
ChristianKl
Postmodernism is a certain philosophy developed in the second part of the 20th century. I don't see how neoreactionaries subscribe to that philosophy either in style or substance.
TheAncientGeek
Style = obscurantism.
ChristianKl
If I put obscurantism into Google, it indicates that it has a history that's a lot older than postmodernism.
TheAncientGeek
So?
ChristianKl
It's not something specific to postmodernism, so it's not useful for deciding whether neoreaction has something to do with postmodernism.
TheAncientGeek
I can criticise neoreactionaries for being as obscurantist as the postmodernists.
[anonymous]
No you can't -- unless you think postmodernists' obscurantism is a deliberate piece of institutional design.
TheAncientGeek
Accidental obscurantism is excusable?
[anonymous]
https://twitter.com/karmakaiser/status/427233616993599488
https://twitter.com/karmakaiser/status/427233789014597632

He's right: cladistics is genealogy. One of the most important conceptual tools of neoreaction is basically that thing Foucault did.
ChristianKl
I have to admit that I don't have a good grasp of Foucault, but is cladistics/genealogy that much different from what Marx did earlier when he wanted to analyse history?
[anonymous]
Yes. edit: more on the contrast
AlexanderRM
I honestly don't understand postmodernism well enough to know if this is it (and I'm not sure it's even understandable enough for that), but I've encountered ideas that sound similar to what I've heard of postmodernism from undergraduate students in my college's philosophy club. Specifically, there are several people with a tendency to say things along the lines of "but how do we really know what's real or what's not", "can we really trust our senses", etc. with regard to every single discussion that comes up, making it essentially impossible to come to any actual conclusions in any discussion. Although one of them did actually accept the idea of discussing what the world would be like if our senses were reasonably accurate, but not without pointing out what a huge assumption that was. (Now, actually, I think it makes a lot of sense to talk occasionally about what facts and truth are, but being able to just say "X is true" when you have 99.9999% confidence in it is a fairly useful shorthand.)

(Another thing which I'm not sure is the same or not: one of the people in the club said something about someone believing "things that are true for him", although I didn't discuss that enough to get any real understanding of what they meant by it. Nor do I actually remember the question that led to that or the discussion following it; I think the topic diverged. In fact, I think it diverged into me asking whether their attitude was postmodernism and them not having any better an understanding of postmodernism than I did.)

Is that similar to postmodernist ideas? Because I honestly have no idea if it is or not, and would be interested in any insights from someone who knows what postmodernism is.

LW is the opposite of postmodernism. Plato's condemnation of sophists ("the art of contradiction making, descended from an insincere kind of conceited mimicry, of the semblance-making breed, derived from image making, distinguished as portion, not divine but human, of production, that presents, a shadow play of words") applies perfectly to postmodernists, who are just the umpteenth incarnation of the sophist virus.

TheAncientGeek
So what's analytical philosophy?
polymathwannabe
Analytical philosophy is the serious one.