Daniel_Starr comments on Cult impressions of Less Wrong/Singularity Institute - Less Wrong

29 Post author: John_Maxwell_IV 15 March 2012 12:41AM




Comment author: Daniel_Starr 22 March 2012 01:30:04PM * 3 points

Ironically, I suspect the "cultlike" problem is that LessWrong/SI's key claims lack falsifiability.

Friendly AI? In the far future.

Self-improvement? Any program of mental self-improvement is suspected of being a cult unless it trains a skill outsiders are confident they can measure.

If I have a program for teaching people math, outsiders feel they know how to check my claims: either my graduates are good at math or they aren't.

But if I have a program for "putting you in touch with your inner goddess", how are people going to check my claims? For all outsiders know I'm making people feel good, or feel good about me, without actually making them meaningfully better.

Unfortunately, the external falsifiability of LW/SI's merits looks more like the second case than the first, especially, I suspect, to people who aren't already big fans of mathematics, information theory, probability, and potential AI.

Organization claims to improve a skill anyone can easily check = school. Organization claims to improve a quality that outsiders don't even know how to measure = cult.

If and when LW/SI can headline more easily falsifiable claims, it will be less cultlike.

I don't know whether this is an immediately solvable problem, short of developing other aspects of LW/SI that are more obviously useful or impressive to outsiders, and/or cultivating a generation of LW/SI fans who are indeed the "winners" that rationalists ideally would be.

Comment author: roryokane 24 May 2014 06:18:57AM 1 point

PredictionBook might help with measuring improvement, in a limited way. You can use it to measure how often your predictions are correct, and whether you are getting better over time. And you could theoretically ask LW-ers and non-LW-ers to make some predictions on PredictionBook, then compare their accuracy to see if Less Wrong helped. Making accurate predictions of likelihood is a real skill that could certainly be very useful, though how useful depends on what you're predicting.
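The comparison roryokane describes can be sketched concretely. One standard way to score probability forecasts is the Brier score (mean squared error between the stated probability and the 0/1 outcome); PredictionBook records exactly the kind of (probability, outcome) pairs it needs. The sample data below is entirely hypothetical and only illustrates the comparison, not anyone's actual results:

```python
def brier_score(predictions):
    """Mean squared error between stated probability and the 0/1 outcome.

    Lower is better; always guessing 50% scores exactly 0.25.
    """
    return sum((p - outcome) ** 2 for p, outcome in predictions) / len(predictions)


# Hypothetical samples of (stated probability, actual outcome) pairs,
# one for each group being compared.
lw_sample = [(0.9, 1), (0.7, 1), (0.3, 0), (0.8, 1), (0.2, 0)]
control_sample = [(0.9, 0), (0.6, 1), (0.5, 0), (0.8, 1), (0.4, 1)]

print(brier_score(lw_sample))       # lower score = better-calibrated group
print(brier_score(control_sample))
```

In practice one would also want reasonably large samples and comparable questions for both groups, since the score depends heavily on how hard the predictions were.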