I have several questions related to this:
- Did anyone reading this get the impression that Less Wrong was cultish when they first discovered it?
- If so, can you suggest any easy steps we could take to counteract that impression?
- Is it possible that there are aspects of the atmosphere here that are driving away intelligent, rationally inclined people who might otherwise be interested in Less Wrong?
- Do you know anyone who might fall into this category, i.e. someone who was exposed to Less Wrong but failed to become an enthusiast, potentially due to atmosphere issues?
- Is it possible that our culture might be different if these folks were hanging around and contributing? Presumably they are disproportionately represented among certain personality types.
If you visit any Less Wrong page for the first time in a cookie-free browsing mode, you'll see this message for new users:
Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.
Here are the worst offenders I see on that About page:
Some people consider the Sequences the most important work they have ever read.
Generally, if your comment or post is on-topic, thoughtful, and shows that you're familiar with the Sequences, your comment or post will be upvoted.
Many of us believe in the importance of developing qualities described in Twelve Virtues of Rationality: [insert mystical sounding description of how to be rational here]
And on the sequences page:
If you don't read the sequences on Mysterious Answers to Mysterious Questions and Reductionism, little else on Less Wrong will make much sense.
This seems obviously false to me.
These may not seem like cultish statements to you, but keep in mind that you are one of the ones who decided to stick around. The typical mind fallacy may be at work. Clearly there is some population that thinks Less Wrong seems cultish, as evidenced by Google's autocomplete, and these look like good candidates for the things that make them think so.
We can fix this stuff easily, since they're both wiki pages, but I thought they were examples worth discussing.
In general, I think we could stand to put more community effort into improving our About page, which you can do now here. It's not that visible to veteran users, but it is very visible to newcomers. Note that it looks as though you'll have to click the little "Force reload from wiki" button on the About page itself for your changes to be published.
I think the biggest reason Less Wrong seems like a cult is because there's very little self-skepticism; people seem remarkably confident that their idiosyncratic views must be correct (if the rest of the world disagrees, that's just because they're all dumb). There's very little attempt to provide any "outside" evidence that this confidence is correctly placed (e.g. by subjecting these idiosyncratic views to serious falsification tests).
Instead, when someone points this out, Eliezer fumes "do you know what pluralistic ignorance is, and Asch's conformity experiment? ... your smart friends and favorite SF writers are not remotely close to the rationality standards of Less Wrong".
What's especially amusing is that EY is able to keep this stuff up by systematically ignoring every bit of his own advice: telling people to take the outside view and then taking the inside one, telling people to look into the dark while he studiously avoids it, emphasizing the importance of AI safety while he embarks on an extremely dangerous way of building AI. You can do this with pretty much every entry in the sequences.
These are the sorts of things that make me think LessWrong is most interesting as a study in psychoceramics.
Your examples seem ... how do I put this ... unreliable. The first two are less examples than insults, since you do not provide any actual instances of these tendencies; the last one would be more serious if he hadn't written extensively on why he believes this to be the safest way (the only way that isn't suicidal), or if you had provided some evidence that his FAI proposals are "extremely dangerous". And, of course, airily proclaiming that this is true...