I have several questions related to this:
- Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?
- If so, can you suggest any easy steps we could take to fix that?
- Is it possible that there are aspects of the atmosphere here that are driving away intelligent, rationally inclined people who might otherwise be interested in Less Wrong?
- Do you know anyone who might fall into this category, i.e. someone who was exposed to Less Wrong but failed to become an enthusiast, potentially due to atmosphere issues?
- Is it possible that our culture might be different if these folks were hanging around and contributing? Presumably they are disproportionately represented among certain personality types.
If you visit any Less Wrong page for the first time in a cookies-free browsing mode, you'll see this message for new users:
Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.
Here are the worst violators I see on that about page:
Some people consider the Sequences the most important work they have ever read.
Generally, if your comment or post is on-topic, thoughtful, and shows that you're familiar with the Sequences, your comment or post will be upvoted.
Many of us believe in the importance of developing qualities described in Twelve Virtues of Rationality: [insert mystical sounding description of how to be rational here]
And on the sequences page:
If you don't read the sequences on Mysterious Answers to Mysterious Questions and Reductionism, little else on Less Wrong will make much sense.
This seems obviously false to me.
These may not seem like cultish statements to you, but keep in mind that you are one of the people who decided to stick around. The typical mind fallacy may be at work. Clearly there is some population that thinks Less Wrong seems cultish, as evidenced by Google's autocomplete, and these look like good candidates for things that make them think this.
We can fix this stuff easily, since both are wiki pages, but I thought they were examples worth discussing.
In general, I think we could stand more community effort being put into improving our about page, which you can do now here. It's not that visible to veteran users, but it is very visible to newcomers. Note that it looks as though you'll have to click the little "Force reload from wiki" button on the about page itself for your changes to be published.
Yes, the reasoning in the linked posts implies that deep inside, humans should be as altruistic as you say. But why should I believe that reasoning? I'd feel a lot more confident if we had an art of rationality that made people demonstrably more successful in mundane affairs and also, as a side effect, made some of them support FAI. If we only get the side effect but not the main benefit, something must be wrong with the reasoning.
This is not what the posts are about, even if this works as one of their conclusions. The idea that urges and goals should be distinguished, for example, doesn't say what your urges or goals should be; it stands on its own. There are many such results, and ideas such as altruism or the importance of FAI are only a few among them. Do these ideas demonstrate a comparatively more visible, measurable effect than the other ideas?