Ironically, I suspect the "cultlike" problem is that LessWrong/SI's key claims lack falsifiability.
Friendly AI? In the far future.
Self-improvement? Any program of mental self-improvement is suspected of being a cult, unless it trains a skill outsiders are confident they can measure.
If I have a program for teaching people math, outsiders feel they know how to check my claims: either my graduates are good at math or they aren't.
But if I have a program for "putting you in touch with your inner goddess", how are people going to check my claims? For all outsiders know I'm making people feel good, or feel good about me, without actually making them meaningfully better.
Unfortunately, the external falsifiability of LW/SI's merits is more like the second case than the first. Especially, I suspect, for people who aren't already big fans of mathematics, information theory, probability, and potential AI.
Organization claims to improve a skill anyone can easily check = school. Organization claims to improve a quality that outsiders don't even know how to measure = cult.
If and when LW/SI can headline more easily falsifiable claims, it will be less cultlike.
I don't know if this is an immediately solvable problem, outside of developing other aspects of LW/SI that are more obviously useful/impressive to outsiders, and/or developing a generation of LW/SI fans who are indeed "winners" as rationalists ideally would be.
PredictionBook might help with measuring improvement, in a limited way. You can use it to track how often your predictions are correct, and whether you are getting better over time. And you could, in principle, ask LW-ers and non-LW-ers to make the same predictions on PredictionBook, then compare their accuracy to see whether Less Wrong helped. Making well-calibrated probability estimates is a real skill that could certainly be very useful – though how useful depends on what you're predicting.
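The comparison above could be scored with a standard rule like the Brier score (mean squared error between stated probabilities and binary outcomes). A minimal sketch, with made-up illustrative data — PredictionBook itself serves predictions through its website, not this data structure:

```python
def brier_score(predictions):
    """Mean squared error between stated probabilities and outcomes (0 or 1).

    Lower is better; always guessing 50% scores exactly 0.25.
    """
    return sum((p - outcome) ** 2 for p, outcome in predictions) / len(predictions)

# Hypothetical data: each pair is (stated probability, actual outcome).
lw_readers = [(0.9, 1), (0.7, 1), (0.2, 0), (0.6, 0)]
control    = [(0.9, 0), (0.8, 1), (0.5, 1), (0.4, 0)]

print(brier_score(lw_readers))  # 0.125
print(brier_score(control))     # 0.315
```

A lower average score for the LW group across many predictions would be some evidence of the claimed skill, though in practice you'd also want to control for question difficulty and selection effects.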
I have several questions related to this:
If you visit any Less Wrong page for the first time in a cookies-free browsing mode, you'll see this message for new users:
Here are the worst violators I see on that about page:
And on the sequences page:
This seems obviously false to me.
These may not seem like cultish statements to you, but keep in mind that you are one of the people who decided to stick around. The typical mind fallacy may be at work. Clearly there is some population that thinks Less Wrong seems cultish, as evidenced by Google's autocomplete, and these look like good candidates for things that make them think so.
We can fix this stuff easily, since they're both wiki pages, but I thought they were examples worth discussing.
In general, I think the about page could use more community effort, which you can contribute now here. It's not very visible to veteran users, but it is very visible to newcomers. Note that it looks as though you'll have to click the little "Force reload from wiki" button on the about page itself for your changes to be published.