cousin_it comments on Cult impressions of Less Wrong/Singularity Institute - Less Wrong
AAAAARRRGH! I am sick to death of this damned topic. It has been done to death.
I have become fully convinced that even bringing it up is actively harmful. It reminds me of a discussion on IRC about how painstakingly and meticulously Eliezer idiot-proofed the sequences, and how it didn't work because people still manage to be idiots about it. It's because of the Death Spirals and the Cult Attractor sequence that people bring up the stupid "LW is a cult hur hur" meme, which would be great dramatic irony if you were reading a fictional version of the history of Less Wrong, since that meme is exactly what Eliezer was trying to combat by writing it. Does anyone else see this? Is anyone else bothered by:
Really, am I the only one seeing the problem with this?
People thinking about this topic just seem to instantaneously fail basic sanity checks. I find it hard to believe that people even know what they're saying when they parrot out "LW looks kinda culty to me" or whatever. It's like people only want to convey pure connotation. Remember sneaking in connotations, and how you're not supposed to do that? How about, instead of saying "LW is a cult", "LW is bad for its members"? This is an actual message, one that speaks negatively of LW but contains more information than negative affective valence. Speaking of which, one of the primary indicators of culthood is being unresponsive to or dismissive of criticism. People regularly accuse LW of this, which is outright batshit. XiXiDu regularly posts SIAI criticism, and it always gets upvoted, no matter how wrong. Not to mention all the other posts (more) disagreeing with claims in what are usually called the Sequences, all highly upvoted by Less Wrong members.
The more people at Less Wrong naively speculate about how the community appears from the outside, throwing around vague negative-affective-valence words and phrases like "cult" and "telling people exactly how they should be", the worse this community will be perceived, and the worse this community will be. I reiterate: I am sick to death of people playing color politics on "whether LW is a cult" without making the discussion precise and explicit rather than vague and implicit, without taking into account that dissent is not only tolerated but encouraged here, without remembering that their brains instantly associate "cult" with wherever the word is seen, and without any of a million other factors. The "million other factors" is, I admit, a poor excuse, but I am out of breath and emotionally exhausted; forgive the laziness.
Everything that should have needed to be said about this has been said in the Cult Attractor sequence, and, from the Less Wrong wiki FAQ:
Talking about this all the time makes it worse, and worse every time someone talks about it.
What the bleeding fuck.
LW doesn't do as much as I'd like to discourage people from falling into happy death spirals about LW-style rationality, like this. There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative. If I were Eliezer, I would run screaming in the other direction the moment I saw the first such person, but he seems to be okay with that. That's the main reason why I feel LW is becoming more cultish.
You mean when he saw himself in the mirror? :)
Seriously, do you think sacrificing one's life to help build FAI is wrong (or not necessarily wrong but not an ethical imperative either), or is it just bad PR for LW/SI to be visibly associated with such people?
I think it's not an ethical imperative unless you're unusually altruistic.
Also I feel the whole FAI thing is a little questionable from a client relations point of view. Rationality education should be about helping people achieve their own goals. When we meet someone who is confused about their goals, or just young and impressionable, the right thing for us is not to take the opportunity and rewrite their goals while we're educating them.
It's hard not to rewrite someone's goals while educating them, because one of our inborn drives is to gain the respect and approval of people around us, and if that means overwriting some of our goals, well, that's a small price to pay as far as that part of our brain is concerned. For example, I stayed for about a week at the SIAI house a few years ago when attending the decision theory workshop, and my values shifted in obvious ways just from being surrounded by more altruistic people and talking with them. (The effect largely dissipated after I left, but not completely.)
Presumably the people they selected for the rationality mini-camp were already more altruistic than average, and the camp itself pushed some of them to the "unusually altruistic" level. Why should SIAI people have qualms about this (other than possible bad PR)?
Pointing out that religious/cultic value rewriting is hard to avoid hardly refutes the idea that LW is a cult.
I don't think "unusually altruistic" is a good characterization of "doesn't value personal preferences about some life choices more than the future of humanity"...
It doesn't sound like you know all that many humans, then. In most times and places, the "future of humanity" is a signal that someone shouldn't be taken seriously, not an actual goal.
I was talking about the future of humanity, not the "future of humanity" (a label that can be grossly misinterpreted).
Do you believe most people are already quite altruistic in that sense? Why? It seems to me that many people give lip service to altruism, but their actions (e.g. reluctance to donate to highly efficient charities) speak otherwise. I think rationality education should help people achieve the goals they're already trying to achieve, not the goals that the teacher wants them to achieve.
False dichotomy. Humans are not automatically strategic, we often act on urges, not goals, and even our explicitly conceptualized goals can be divorced from reality, perhaps more so than the urges. There are general purpose skills that have an impact on behavior (and explicit goals) by correcting errors in reasoning, not specifically aimed at aligning students' explicit goals with those of their teachers.
Rationality is hard to measure. If LW doesn't make many people more successful in mundane pursuits but makes many people subscribe to the goal of FAI, that's reason to suspect that LW is not really teaching rationality, but rather something else.
(My opinions on this issue seem to become more radical as I write them down. I wonder where I will end up!)
I didn't say anything about "rationality". Whether the lessons help is a separate question from whether they're aimed at correcting errors of reasoning or at shifting one's goals in a specific direction. The posts I linked also respond to the objection about people "giving lip service to altruism" but doing little in practice.
Yes, the reasoning in the linked posts implies that deep inside, humans should be as altruistic as you say. But why should I believe that reasoning? I'd feel a lot more confident if we had an art of rationality that made people demonstrably more successful in mundane affairs and also, as a side effect, made some of them support FAI. If we only get the side effect but not the main benefit, something must be wrong with the reasoning.
If prediction markets were legal, we could much more easily measure whether LW helps rationality. Just ask people to make n bets or predictions per month and see 1) if they did better than the population average and 2) if they improved over time.
In fact, trying to get Intrade legalized in the US might be a very worthwhile project for just this reason (beyond all the general social reasons to like prediction markets).
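A minimal sketch of the kind of scoring such a test would need, assuming each prediction is logged as a stated probability plus a yes/no outcome; the monthly data and the baseline number here are made up purely for illustration:

```python
# Sketch only: the monthly data and the baseline Brier score are hypothetical,
# not taken from any real tracking site.
from statistics import mean

def brier_score(predictions):
    """Mean squared error between stated probabilities and outcomes (0 or 1).
    Lower is better; always answering 0.5 scores 0.25."""
    return mean((p - outcome) ** 2 for p, outcome in predictions)

# Hypothetical data: one user's predictions, grouped by month.
months = {
    "2011-09": [(0.8, 1), (0.6, 0), (0.9, 1)],
    "2011-10": [(0.7, 1), (0.95, 1), (0.3, 0)],
}

population_average = 0.22  # assumed baseline Brier score for comparison

for month, preds in sorted(months.items()):
    score = brier_score(preds)
    verdict = "better" if score < population_average else "worse"
    print(f"{month}: Brier {score:.3f} ({verdict} than the assumed baseline)")
```

Tracking the score month over month answers the "did they improve over time" half; comparing against a pooled score for all participants answers the "better than the population average" half.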
There is no need to wish or strive for regulatory changes that may never happen: I've pointed out in the past that non-money prediction markets generally are pretty accurate and competitive with money prediction markets; so money does not seem to be a crucial factor. Just systematic tracking and judgment.
(Being able to profit may attract some people, like me, but the fear of loss may also serve as a potent deterrent to users.)
I have written at length about how I believe prediction markets helped me, but I have been helped even more by the free, active, you-can-sign-up-right-now-and-start-using-it-really-right-now http://www.PredictionBook.com
I routinely use LW-related ideas and strategies in predicting, and I believe my calibration graph reflects genuine success at predicting.
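For what it's worth, a calibration graph of that kind can be approximated from the same logged (probability, outcome) pairs; a rough sketch, with the decile bins and sample data chosen arbitrarily:

```python
# Again a sketch: the decile binning and the sample data are illustrative.
from collections import defaultdict

def calibration_table(predictions):
    """Bucket (stated probability, outcome) pairs into decile bins and print
    the observed frequency per bin; a well-calibrated predictor's observed
    frequencies track the midpoints of the bins."""
    bins = defaultdict(list)
    for p, outcome in predictions:
        bins[min(int(p * 10), 9)].append(outcome)
    for b in sorted(bins):
        outcomes = bins[b]
        print(f"{b * 10}%-{(b + 1) * 10}%: observed "
              f"{sum(outcomes) / len(outcomes):.0%} (n={len(outcomes)})")

calibration_table([(0.9, 1), (0.85, 1), (0.9, 0), (0.6, 1), (0.55, 0), (0.3, 0)])
```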
Very nice idea, thanks! After some googling I found someone already made this suggestion in 2009.
... or you estimate the risk to be significant and you want to live past the next N years.
I don't think this calculation works out, actually. If you're purely selfish (don't care about others at all), and the question is whether to devote your whole life to developing FAI, then it's not enough to believe that the risk is high (say, 10%). You also need to believe that you can make a large impact. Most people probably wouldn't agree to surrender all their welfare just to reduce the risk to themselves from 10% to 9.99%, and realistically their sacrifice won't have much more impact than that, because it's hard to influence the whole world.
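A back-of-the-envelope version of that calculation; every number besides the 10% and 9.99% figures from the comment above is assumed purely for illustration:

```python
# All numbers besides the 10% -> 9.99% figures are assumptions for illustration.
risk_without_effort = 0.10    # chance the risk kills you if you do nothing
risk_with_effort = 0.0999     # chance if you devote your whole life to reducing it
value_of_normal_life = 1.0    # welfare of an ordinary life, normalized to 1
welfare_given_up = 0.9        # assumed cost of surrendering most of that welfare

selfish_gain = (risk_without_effort - risk_with_effort) * value_of_normal_life
print(f"expected selfish gain: {selfish_gain:.4f}, welfare given up: {welfare_given_up}")
# ~0.0001 of expected gain against ~0.9 of cost: the purely selfish case only
# works if you expect to move the risk far more than 0.01 percentage points.
```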
How do you distinguish a happy death spiral from a happy life spiral? Wasting one's life on a wild goose chase from spending one's life on a noble cause?
"I take my beliefs seriously, you are falling into a happy death spiral, they are a cult."
I guess you meant to ask, "how do you distinguish ideas that lead to death spirals from ideas that lead to good things?" My answer is that you can't tell by looking only at the idea. Almost any idea can become a subject for a death spiral if you approach it the wrong way (the way Will_Newsome wants you to), or a nice research topic if you approach it right.
I've recanted; maybe I should say so somewhere. I think my post on the subject was sheer typical mind fallacy. People like Roko and XiXiDu are clearly damaged by the "take things seriously" meme, and what it means in my head is not what it means in the heads of various people who endorse the meme.
I have always been extremely curious about this. Do people really sacrifice their lives, or is it largely just empty talk?
It seems like nobody is doing anything they wouldn't be doing anyway. I mean, I don't think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks. They are just the kind of people who enjoy doing what they do.
Are there people who'd rather play games all day but sacrifice their lives to solve friendly AI?
If developing AGI were an unequivocally good thing, as Eliezer used to think, then I guess he'd be happily developing AGI instead of trying to raise the rationality waterline. I don't know what Luke would do if there were no existential risks, but I don't think his current administrative work is very exciting for him. Here's a list of people who want to save the world and are already changing their lives accordingly. Also there have been many LW posts by people who want to choose careers that maximize the probability of saving the world. Judge the proportion of empty talk however you want, but I think there are quite a few fanatics.
Indeed, Eliezer once told me that he was a lot more gung-ho about saving the world when he thought it just meant building AGI as quickly as possible.
I think at one point Eliezer said that, if not for AGI/FAI/singularity stuff, he would probably be a sci-fi writer. Luke explicitly said that when he found out about x-risks he realized that he had to change his life completely.
I sacrificed some very important relationships and the life that could have gone along with them so I could move to California, and the only reason I really care about humans in the first place is because of those relationships, so...
— Nick Tarleton's twist on T.S. Eliot
I'm too irreparably lazy to actually change my life but my charitable donations are definitely affected by believing in FAI.
Sacrificing or devoting? Those are different things. If FAI succeeds, they will have a lot more life to party with than they would have otherwise, so devoting your life to FAI development might be a good bet even from a purely selfish standpoint.
Pascal? Izzat you?
That comment doesn't actually argue for contributing to FAI development. So I guess I'm not Pascal (damn).
You probably don't wanna be Pascal anyway. I'm given to understand he's been a metabolic no-show for about 350 years.
I agree entirely. That post made me go "AAAH", and its rapid karma increase at first made me go "AAAAHH".