fubarobfusco comments on An Outside View on Less Wrong's Advice - Less Wrong
To come up with a theory on the fly, maybe there are two modes of expansion for a group: by providing some service, and by sheer memetic virulence. One memetic virulence strategy operates by making outlandish promises that subscribing to it will make you smarter, richer, more successful, more attractive to the opposite sex, and just plain superior to other people - and then doing it in a way that can't obviously be proven wrong. This strategy usually involves people with loads of positive affect going around telling people how great their group is and how they need to join.
As a memetic defense strategy, people learn to identify this kind of spread and to shun groups that display its features. From the inside, this strategy manifests as a creepy feeling.
LW members have lots of positive affect around LW and express it all the time; the group seems to be growing without providing any services (e.g., no one worries when a drama club grows, because they go and put on dramas, but it's not clear what our group is doing besides talking about how great we are); and we make outrageously bold claims about getting smarter and richer and sexier, of exactly the sort that virulent memes trying to propagate would have to make.
I don't think this creepiness detector operates on the conscious level, any more than the chance-of-catching-a-disease detector that tells us people with open sores all over their body are disgusting operates on a conscious level. We don't stop considering the open sores disgusting if we learn they're not really contagious, and we don't stop considering overblown self-improvement claims from an actively recruiting community to be a particularly virulent memetic vector even if we don't consciously believe that's what's going on.
(I'm still agnostic on that point. I'm sure no one intended this to be a meme optimized for self-propagation through outlandish promises, but it's hard to tell if it's started mutating that way.)
I'd like to know where all this LW-boasting is going on. I don't think I hear it at the meetups in Mountain View, but maybe I've been missing something.
Darnit, I don't like being vague and I also don't like pointing to specific people and saying "YOU! YOU SOUND CULTISH!" so I'm going to have a hard time answering this question in a satisfying way, but...
Lots of people are looking into things like nootropics/intelligence amplification, entrepreneurship, and pick-up artistry. And this is great. What gives me the creepy vibe is when they say (more on the site than at meetups) "And of course, we'll succeed at these much faster than other people have, even though they are professionals in this field, because we're Rationalists and they aren't." The same goes for anything involving the words "winning" or "awesomeness", or gratuitous overuse of community identification terms like "primate" or "utility".
Trying to look for examples, I notice this is a smaller proportion of posts than I originally thought, and that I'm probably biased toward overcounting them - which makes sense, since in order to justify my belonging to a slightly creepy group I need to exaggerate my opposition to the group's creepiness.
Nonetheless, perhaps we need to adopt a new anti-cultishness norm: against boasting about the predicted success of rationalists, or against ascribing personal victories to one's rationality without having actually done the math to demonstrate a correlation between success and rationality. The cult attractor is pretty damn bad, after all, and ending up in it could easily destroy one hell of a lot of value.
This is a great idea!