I have several questions related to this:
- Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?
- If so, can you suggest any easy steps we could take to fix this impression?
- Is it possible that there are aspects of the atmosphere here that are driving away intelligent, rationally inclined people who might otherwise be interested in Less Wrong?
- Do you know anyone who might fall into this category, i.e. someone who was exposed to Less Wrong but failed to become an enthusiast, potentially due to atmosphere issues?
- Is it possible that our culture might be different if these folks were hanging around and contributing? Presumably they are disproportionately represented among certain personality types.
If you visit any Less Wrong page for the first time in a cookies-free browsing mode, you'll see this message for new users:
Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.
Here are the worst violators I see on that about page:
Some people consider the Sequences the most important work they have ever read.
Generally, if your comment or post is on-topic, thoughtful, and shows that you're familiar with the Sequences, your comment or post will be upvoted.
Many of us believe in the importance of developing qualities described in Twelve Virtues of Rationality: [insert mystical sounding description of how to be rational here]
And on the sequences page:
If you don't read the sequences on Mysterious Answers to Mysterious Questions and Reductionism, little else on Less Wrong will make much sense.
This seems obviously false to me.
These may not seem like cultish statements to you, but keep in mind that you are one of the ones who decided to stick around. The typical mind fallacy may be at work. Clearly there is some population that thinks Less Wrong seems cultish, as evidenced by Google's autocomplete, and these look like good candidates for things that make them think this.
We can fix this stuff easily, since they're both wiki pages, but I thought they were examples worth discussing.
In general, I think we could stand more community effort being put into improving our about page, which you can do now here. It's not that visible to veteran users, but it is very visible to newcomers. Note that it looks as though you'll have to click the little "Force reload from wiki" button on the about page itself for your changes to be published.
I only discovered LW about a week ago, and I got the "cult" impression strongly at first, but decided to stick around anyway because I am having fun talking to you guys, and am learning a lot. The cult impression faded once I carefully read articles and threads here and realized that they really are rational, well-argued concepts rather than blindly followed dogma. However, it takes time and effort to realize this, and I suspect that the initial appearance of a cult would turn many people off from putting in that time and effort.
For a newcomer expecting discussions about practical ways to overcome bias and think rationally, the focus on things like transhumanism and singularity development seems very weird: those appear to be pseudo-religious ideas with no obvious connection to rationality or daily life.
AI and transhumanism are very interesting, but are distinct concepts from rationality. I suggest moving singularity and AI specific articles to a different site, and removing the singularity institute and FHI links from the navigation bar.
There's also the problem of having a clearly defined leader with strong, controversial opinions which are treated like gospel. I would expect a community which discusses rationality to be more of an open debate/discussion between peers, without any philosophical leaders that everybody agrees with. I don't see any easy solution here, because Eliezer Yudkowsky's reputation here is well earned: he actually is exceptionally brilliant and rational.
I would also like to see more articles on how to avoid bias and apply Bayesian methods to immediate, present-day problems and decision making. How can we avoid bias and correctly interpret data from scientific experiments, and then apply this knowledge to make good choices about things such as improving our own health?
Random nitpick: a substantial portion of LW disagrees with Eliezer on various issues. If you find yourself actually agreeing with everything he has ever said, then something is probably wrong.
Slightly less healthy for overall debate is that many people automatically attribute a toxic/weird meme to Eliezer whenever it is encountered on LW, even in instances where he has explicitly argued against it (such as utility maximization in the face of very small probabilities).