
Grognor comments on Cult impressions of Less Wrong/Singularity Institute - Less Wrong Discussion

29 Post author: John_Maxwell_IV 15 March 2012 12:41AM


Comment author: Grognor 15 March 2012 02:22:38AM *  46 points [-]

AAAAARRRGH! I am sick to death of this damned topic. It has been done to death.

I have become fully convinced that even bringing it up is actively harmful. It reminds me of a discussion on IRC, about how painstakingly and meticulously Eliezer idiot-proofed the sequences, and it didn't work because people still manage to be idiots about it. It's because of the Death Spirals and the Cult Attractor sequence that people bring up the stupid "LW is a cult hur hur" meme, which would be great dramatic irony if you were reading a fictional version of the history of Less Wrong, since it's exactly what Eliezer was trying to combat by writing it. Does anyone else see this? Is anyone else bothered by:

Eliezer: Please, learn what turns good ideas into cults, and avoid it!
Barely-aware public: Huh, wah? Cults? Cults! Less Wrong is a cult!

&

Eliezer: Do not worship a hero! Do not trust!
Rationalwiki et al: LW is a personality cult around Eliezer because of so-and-so.

Really, am I the only one seeing the problem with this?

People thinking about this topic just seem to instantaneously fail basic sanity checks. I find it hard to believe that people even know what they're saying when they parrot out "LW looks kinda culty to me" or whatever. It's like people only want to convey pure connotation. Remember sneaking in connotations, and how you're not supposed to do that? How about, instead of saying "LW is a cult", "LW is bad for its members"? This is an actual message, one that speaks negatively of LW but contains more information than negative affective valence. Speaking of which, one of the primary indicators of culthood is being unresponsive to or dismissive of criticism. People regularly accuse LW of this, which is outright batshit. XiXiDu regularly posts SIAI criticism, and it always gets upvoted, no matter how wrong. Not to mention all the other posts (more) disagreeing with claims in what are usually called the Sequences, all highly upvoted by Less Wrong members.

The more people at Less Wrong naively speculate on how the community appears from the outside, throwing around vague negative-affective-valence words and phrases like "cult" and "telling people exactly how they should be", the worse this community will be perceived, and the worse this community will be. I reiterate: I am sick to death of people playing color politics on "whether LW is a cult" without doing any of the following: making the discussion precise and explicit rather than vague and implicit; taking into account that dissent is not only tolerated but encouraged here; remembering that their brains instantly mark "cult" as being associated with wherever it's seen; or any of a million other factors. The "million other factors" is, I admit, a poor excuse, but I am out of breath and emotionally exhausted; forgive the laziness.

Everything that should have needed to be said about this has been said in the Cult Attractor sequence, and, from the Less Wrong wiki FAQ:

We have a general community policy of not pretending to be open-minded on long-settled issues for the sake of not offending people. If we spent our time debating the basics, we would never get to the advanced stuff at all. Yes, some of the results that fall out of these basics sound weird if you haven't seen the reasoning behind them, but there's nothing in the laws of physics that prevents reality from sounding weird.

Talking about this all the time makes it worse, and worse every time someone talks about it.

What the bleeding fuck.

Comment author: epicureanideal 16 March 2012 02:48:30AM 9 points [-]

A rambling, cursing tirade against a polite discussion of things that might be wrong with the group (or perceptions of the group) doesn't improve my perception of the group. I have to say, I have a significant negative impression from Grognor's response here. In addition to the tone of his response, a few things that added to this negative impression were:

"how painstakingly and meticulously Eliezer idiot-proofed the sequences, and it didn't work because people still manage to be idiots about it"

Again, the name-dropping of Our Glorious Leader Eliezer, long may He reign. (I'm joking here for emphasis.)

"LW is a cult hur hur"

People might not be thinking completely rationally, but this kind of characterization of people who have negative opinions of the group doesn't win you any friends.

"since it's exactly what Eliezer was trying to combat by writing it."

There's Eliezer again, highlighting his importance as the group's primary thought leader. This may be true, and probably is, but highlighting it all the time can lead people to think this is cultish.

Comment author: epicureanideal 16 March 2012 02:54:26AM 2 points [-]

"I have become fully convinced that even bringing it up is actively harmful."

What evidence leads you to this conclusion?

Eliezer: Please, learn what turns good ideas into cults, and avoid it! Barely-aware public: Huh, wah? Cults? Cults! Less Wrong is a cult!

Can you provide evidence to support this characterization?

Eliezer: Do not worship a hero! Do not trust! Rationalwiki et al: LW is a personality cult around Eliezer because of so-and-so.

Can you provide evidence to support this characterization?

I would like to see some empirical analysis of the points made here and by the original poster. We should gather some data about perceptions from real users and use that to inform future discussion on this topic. I think we have a starting point in the responses to this post, and comments in other posts could probably be mined for information, but we should also try to find some rational people who are not familiar with Less Wrong, introduce them to it, and ask them for their impressions (posing as someone who just found the site, is not affiliated with it, and is curious about a friend's impressions, or something like that).

Comment author: cousin_it 15 March 2012 07:06:38AM *  25 points [-]

LW doesn't do as much as I'd like to discourage people from falling into happy death spirals about LW-style rationality, like this. There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative. If I were Eliezer, I would run screaming in the other direction the moment I saw the first such person, but he seems to be okay with that. That's the main reason why I feel LW is becoming more cultish.

Comment author: Gabriel 15 March 2012 07:17:30PM 3 points [-]

There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative.

Sacrificing or devoting? Those are different things. If FAI succeeds, they will have a lot more life to party than they would have otherwise, so devoting your life to FAI development might be a good bet even from a purely selfish standpoint.

Comment author: [deleted] 07 August 2012 11:52:36PM 0 points [-]

Pascal? Izzat you?

Comment author: Gabriel 08 August 2012 09:04:41PM 0 points [-]

That comment doesn't actually argue for contributing to FAI development. So I guess I'm not Pascal (damn).

Comment author: [deleted] 08 August 2012 11:13:50PM 1 point [-]

You probably don't wanna be Pascal anyway. I'm given to understand he's been a metabolic no-show for about 350 years.

Comment author: Wei_Dai 15 March 2012 08:40:52AM 13 points [-]

There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative. If I were Eliezer, I would run screaming in the other direction the moment I saw the first such person

You mean when he saw himself in the mirror? :)

Seriously, do you think sacrificing one's life to help build FAI is wrong (or not necessarily wrong but not an ethical imperative either), or is it just bad PR for LW/SI to be visibly associated with such people?

Comment author: cousin_it 15 March 2012 11:02:10AM *  6 points [-]

I think it's not an ethical imperative unless you're unusually altruistic.

Also I feel the whole FAI thing is a little questionable from a client relations point of view. Rationality education should be about helping people achieve their own goals. When we meet someone who is confused about their goals, or just young and impressionable, the right thing for us is not to take the opportunity and rewrite their goals while we're educating them.

Comment author: Wei_Dai 16 March 2012 09:02:38PM 6 points [-]

It's hard not to rewrite someone's goals while educating them, because one of our inborn drives is to gain the respect and approval of people around us, and if that means overwriting some of our goals, well that's a small price to pay as far as that part of our brain is concerned. For example, I stayed for about a week at the SIAI house a few years ago when attending the decision theory workshop, and my values shifted in obvious ways just by being surrounded by more altruistic people and talking with them. (The effect largely dissipated after I left, but not completely.)

I think it's not an ethical imperative unless you're unusually altruistic.

Presumably the people they selected for the rationality mini-camp were already more altruistic than average, and the camp itself pushed some of them to the "unusually altruistic" level. Why should SIAI people have qualms about this (other than possible bad PR)?

Comment author: TheAncientGeek 20 May 2014 04:12:56PM 0 points [-]

Pointing out that religious/cultic value rewriting is hard to avoid hardly refutes the idea that LW is a cult.

Comment author: Vladimir_Nesov 15 March 2012 05:22:46PM 6 points [-]

I think it's not an ethical imperative unless you're unusually altruistic.

I don't think "unusually altruistic" is a good characterization of "doesn't value personal preferences about some life choices more than the future of humanity"...

Comment author: Vaniver 16 March 2012 10:59:57PM 4 points [-]

It doesn't sound like you know all that many humans, then. In most times and places, the "future of humanity" is a signal that someone shouldn't be taken seriously, not an actual goal.

Comment author: Vladimir_Nesov 16 March 2012 11:08:02PM *  1 point [-]

In most times and places, the "future of humanity" is a signal that someone shouldn't be taken seriously, not an actual goal.

I was talking about the future of humanity, not the "future of humanity" (a label that can be grossly misinterpreted).

Comment author: cousin_it 15 March 2012 05:38:33PM *  3 points [-]

Do you believe most people are already quite altruistic in that sense? Why? It seems to me that many people give lip service to altruism, but their actions (e.g. reluctance to donate to highly efficient charities) speak otherwise. I think rationality education should help people achieve the goals they're already trying to achieve, not the goals that the teacher wants them to achieve.

Comment author: Vladimir_Nesov 15 March 2012 05:57:12PM 7 points [-]

I think rationality education should help people achieve the goals they're already trying to achieve, not the goals that the teacher wants them to achieve.

False dichotomy. Humans are not automatically strategic, we often act on urges, not goals, and even our explicitly conceptualized goals can be divorced from reality, perhaps more so than the urges. There are general purpose skills that have an impact on behavior (and explicit goals) by correcting errors in reasoning, not specifically aimed at aligning students' explicit goals with those of their teachers.

Comment author: cousin_it 15 March 2012 06:31:56PM *  8 points [-]

Rationality is hard to measure. If LW doesn't make many people more successful in mundane pursuits but makes many people subscribe to the goal of FAI, that's reason to suspect that LW is not really teaching rationality, but rather something else.

(My opinions on this issue seem to become more radical as I write them down. I wonder where I will end up!)

Comment author: William_Quixote 07 September 2012 03:27:04AM 1 point [-]

Rationality is hard to measure. If LW doesn't make many people more successful in mundane pursuits but makes many people subscribe to the goal of FAI, that's reason to suspect that LW is not really teaching rationality, but rather something else.

If prediction markets were legal, we could much more easily measure whether LW helps rationality. Just ask people to make n bets or predictions per month and see 1) if they did better than the population average and 2) if they improved over time.

In fact, trying to get Intrade legalized in the US might be a very worthwhile project for just this reason (beyond all the general social reasons to like prediction markets).
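[Editor's note: the scoring scheme proposed above can be sketched in a few lines. This is an illustrative sketch only; the predictions, the baseline, and the pass/fail criteria are invented for the example, not taken from any real LW data.]

```python
# Sketch of the proposal: each person logs probabilistic predictions,
# and we score them with the Brier score (mean squared error between
# stated probability and the 0/1 outcome; lower is better), comparing
# 1) against a population baseline and 2) early vs. late predictions.

def brier_score(predictions):
    """Mean squared error between stated probability and outcome (0 or 1)."""
    return sum((p - outcome) ** 2 for p, outcome in predictions) / len(predictions)

# Hypothetical track record: (stated probability, actual outcome).
early = [(0.9, 1), (0.7, 0), (0.6, 1), (0.8, 1)]
late  = [(0.9, 1), (0.7, 1), (0.6, 1), (0.8, 1)]

# Chance-level baseline: always guessing 0.5 yields a Brier score of 0.25.
population_baseline = 0.25

improved = brier_score(late) < brier_score(early)            # did they get better over time?
beats_population = brier_score(early + late) < population_baseline  # better than chance?
```

The same two checks would work for PredictionBook-style records as gwern describes below: no money changes hands, only systematic tracking and judgment.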

Comment author: gwern 07 September 2012 05:40:48PM *  3 points [-]

There is no need to wish or strive for regulatory changes that may never happen: I've pointed out in the past that non-money prediction markets generally are pretty accurate and competitive with money prediction markets; so money does not seem to be a crucial factor. Just systematic tracking and judgment.

(Being able to profit may attract some people, like me, but the fear of loss may also serve as a potent deterrent to users.)

I have written at length about how I believe prediction markets helped me, but I have been helped even more by the free, active, you-can-sign-up-right-now-and-start-using-it (really, right now) http://www.PredictionBook.com

I routinely use LW-related ideas and strategies in predicting, and I believe my calibration graph reflects genuine success at predicting.

Comment author: cousin_it 07 September 2012 08:22:23AM 1 point [-]

Very nice idea, thanks! After some googling I found someone already made this suggestion in 2009.

Comment author: Vladimir_Nesov 15 March 2012 06:46:13PM *  5 points [-]

I didn't say anything about "rationality". Whether the lessons help is a separate question from whether they're aimed at correcting errors of reasoning or at shifting one's goals in a specific direction. The posts I linked also respond to the objection about people "giving lip service to altruism" but doing little in practice.

Comment author: cousin_it 15 March 2012 07:09:11PM *  1 point [-]

Yes, the reasoning in the linked posts implies that deep inside, humans should be as altruistic as you say. But why should I believe that reasoning? I'd feel a lot more confident if we had an art of rationality that made people demonstrably more successful in mundane affairs and also, as a side effect, made some of them support FAI. If we only get the side effect but not the main benefit, something must be wrong with the reasoning.

Comment author: Luke_A_Somers 15 March 2012 06:29:56PM 1 point [-]

I think it's not an ethical imperative unless you're unusually altruistic.

... or you estimate the risk to be significant and you want to live past the next N years.

Comment author: cousin_it 15 March 2012 06:40:59PM *  7 points [-]

I don't think this calculation works out, actually. If you're purely selfish (don't care about others at all), and the question is whether to devote your whole life to developing FAI, then it's not enough to believe that the risk is high (say, 10%). You also need to believe that you can make a large impact. Most people probably wouldn't agree to surrender all their welfare just to reduce the risk to themselves from 10% to 9.99%, and realistically their sacrifice won't have much more impact than that, because it's hard to influence the whole world.
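[Editor's note: the arithmetic in the comment above can be made explicit. The 10% → 9.99% figures are from the comment; the utility scale and the cost of total devotion are invented for illustration.]

```python
# Sketch of cousin_it's calculation: for a purely selfish agent, lifelong
# devotion to FAI pays off only if the expected personal risk reduction
# outweighs the welfare surrendered.

baseline_risk = 0.10       # estimated existential risk to yourself (from the comment)
reduced_risk  = 0.0999     # risk after your lifelong contribution (from the comment)
value_of_survival = 100.0  # utility of surviving the risk period (invented scale)
cost_of_devotion  = 30.0   # welfare given up by total devotion (invented)

expected_gain = (baseline_risk - reduced_risk) * value_of_survival  # 0.01
worth_it = expected_gain > cost_of_devotion                         # clearly not
```

On almost any assignment of the invented numbers, the 0.0001 risk reduction is swamped by the cost, which is the point of the comment: the selfish case fails unless one's individual impact on the whole world is implausibly large.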

Comment author: RichardKennaway 15 March 2012 07:27:06AM 14 points [-]

How do you distinguish a happy death spiral from a happy life spiral? Wasting one's life on a wild goose chase from spending one's life on a noble cause?

"I take my beliefs seriously, you are falling into a happy death spiral, they are a cult."

Comment author: cousin_it 15 March 2012 07:52:37AM *  9 points [-]

I guess you meant to ask, "how do you distinguish ideas that lead to death spirals from ideas that lead to good things?" My answer is that you can't tell by looking only at the idea. Almost any idea can become a subject for a death spiral if you approach it the wrong way (the way Will_Newsome wants you to), or a nice research topic if you approach it right.

Comment author: Will_Newsome 19 March 2012 03:13:16AM *  9 points [-]

(the way Will_Newsome wants you to),

I've recanted; maybe I should say so somewhere. I think my post on the subject was sheer typical mind fallacy. People like Roko and XiXiDu are clearly damaged by the "take things seriously" meme, and what it means in my head is not what it means in the heads of various people who endorse the meme.

Comment author: XiXiDu 15 March 2012 10:54:49AM *  7 points [-]

There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative.

I have always been extremely curious about this. Do people really sacrifice their lives or is it largely just empty talk?

It seems like nobody is doing anything they wouldn't be doing anyway. I mean, I don't think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks. They are just the kind of people who enjoy doing what they do.

Are there people who'd rather play games all day but sacrifice their lives to solve friendly AI?

Comment author: Will_Newsome 19 March 2012 03:22:56AM *  9 points [-]

I have always been extremely curious about this. Do people really sacrifice their lives or is it largely just empty talk?

I sacrificed some very important relationships and the life that could have gone along with them so I could move to California, and the only reason I really care about humans in the first place is because of those relationships, so...

This is the use of metaness: for liberation—not less of love but expanding of love beyond local optima.

— Nick Tarleton's twist on T.S. Eliot

Comment author: katydee 16 March 2012 05:29:43PM 10 points [-]

I don't think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks. They are just the kind of people who enjoy doing what they do.

I think at one point Eliezer said that, if not for AGI/FAI/singularity stuff, he would probably be a sci-fi writer. Luke explicitly said that when he found out about x-risks he realized that he had to change his life completely.

Comment author: Tripitaka 15 March 2012 06:15:47PM *  6 points [-]
  1. Due to comparative advantage, not changing much is actually a relatively good, straightforward strategy: just farm and redirect money.
  2. As an example of these Altruistic Ones, user Rain has been mentioned, so they are out there. May they all be praised!
  3. Factor in time and demographics. A lot of LWers are young people, looking for ways to make money; they are not able to spend much yet, and haven't had much impact yet. Time will have to show whether they stay true to their goals, or whether they are tempted to go the vicious path of ever-growing investments into status.
Comment author: cousin_it 15 March 2012 12:35:19PM *  11 points [-]

If developing AGI were an unequivocally good thing, as Eliezer used to think, then I guess he'd be happily developing AGI instead of trying to raise the rationality waterline. I don't know what Luke would do if there were no existential risks, but I don't think his current administrative work is very exciting for him. Here's a list of people who want to save the world and are already changing their life accordingly. Also there have been many LW posts by people who want to choose careers that maximize the probability of saving the world. Judge the proportion of empty talk however you want, but I think there are quite a few fanatics.

Comment author: Will_Newsome 19 March 2012 03:18:01AM 5 points [-]

Indeed, Eliezer once told me that he was a lot more gung-ho about saving the world when he thought it just meant building AGI as quickly as possible.

Comment author: drethelin 16 March 2012 05:42:30PM 0 points [-]

I'm too irreparably lazy to actually change my life but my charitable donations are definitely affected by believing in FAI.

Comment author: Grognor 15 March 2012 07:24:13AM 1 point [-]

I agree entirely. That post made me go "AAAH" and its rapid karma increase at first made me go "AAAAHH".

Comment author: XiXiDu 15 March 2012 10:49:06AM 6 points [-]

People regularly accuse LW of this, which is outright batshit. XiXiDu regularly posts SIAI criticism, and it always gets upvoted, no matter how wrong.

Thanks for saying that I significantly helped to make Less Wrong look less cultish ;-)

By the way...

Comment author: jimrandomh 15 March 2012 07:58:14PM 1 point [-]

Actually, I believe what he said was that you generated evidence that Less Wrong is not cultish, which makes it look more cultish to people who aren't thinking carefully.

Comment author: John_Maxwell_IV 15 March 2012 03:14:19AM *  21 points [-]

My post was mostly about how to optimize appearances, with some side speculation on how our current appearances might be filtering potential users. I agree LW rocks in general. I think we're mostly talking past each other; I don't see this discussion post as fitting into the same genre of "serious LW criticism" as the other stuff you link to.

In other words, I'm talking about first impressions, not in-depth discussions.

I'd be curious where you got the idea that writing the cult sequence was what touched off the "LW cult" meme. That sounds pretty implausible to me. Keep in mind that no one who is fully familiar with LW is making this accusation (that I know of), but it does look like it might be a reaction that sometimes occurs in newcomers.

Let's keep in mind that LW being bad is a logically distinct proposition, and if it is bad, we want to know it (since we want to know what is true, right?).

And if we can make optimizations to LW culture to broaden participation from intelligent people, that's also something we want to do, right? Although, on reflection, I'm not sure I see an opportunity for improvement where this is concerned, except maybe on the wiki (but I do think we could stand to be a bit nicer everywhere).

XiXiDu regularly posts SIAI criticism, and it always gets upvoted, no matter how wrong. Not to mention all the other posts disagreeing with claims in what are usually called the Sequences, all highly upvoted by Less Wrong members.

Criticism rocks dude. I'm constantly realizing that I did something wrong and thinking that if I had a critical external observer maybe I wouldn't have persisted in my mistake for so long. Let's keep this social norm up.

Comment author: Antisuji 16 March 2012 12:59:50AM *  5 points [-]

Ya know, if LW and SIAI are serious about optimizing appearances, they might consider hiring a Communications professional. PR is a serious skill and there are people who do it for a living. Those people tend to be on the far end of the spectrum of what we call neurotypical here. That is, they are extremely good at modeling other people, and therefore predicting how other people will react to a sample of copy. I would not be surprised if literally no one who reads LW regularly could do the job adequately.

Edit to add: it's nice to see that they're attempting to do this, but again, LW readership is probably the wrong place to look for this kind of expertise.

Comment author: wedrifid 16 March 2012 02:24:16AM 6 points [-]

Ya know, if LW and SIAI are serious about optimizing appearances, they might consider hiring a Communications professional. PR is a serious skill and there are people who do it for a living.

People who do this for a living (effectively) cost a lot of money. Given the budget of SIAI, putting a communications professional on the payroll at market rates represents a big investment. Transitioning a charity to a state where a large amount of income goes into improving perception (and so securing more income) is a step not undertaken lightly.

Comment author: NancyLebovitz 16 March 2012 09:49:12AM 6 points [-]

It's at least plausible that a lot of the people who can be good for SIAI would be put off more by professional marketing than by science fiction-flavored weirdness.

Comment author: Antisuji 16 March 2012 03:59:58AM 1 point [-]

That's a good point. I'm guessing though that there's a lot of low hanging fruit, e.g. a front page redesign, that would represent a more modest (and one-time) expense than hiring a full-time flack. In addition to costing less this would go a long way to mitigate concerns of corruption. Let's use the Pareto Principle to our advantage!

Comment author: Grognor 15 March 2012 04:18:30AM 10 points [-]

My post was mostly about how to optimize appearances, with some side speculation on how our current appearances might be filtering potential users.

Okay.

If we want to win, it might not be enough to have a book length document explaining why we're not a cult. We might have to play the first impressions game as well.

I said stop talking about it and implied that maybe it shouldn't have been talked about so openly in the first place, and here you are talking about it.

I'd be curious where you got the idea that writing the cult sequence was what touched off the "LW cult" meme.

Where else could it have come from? Eliezer's extensive discussion of cultish behavior gets automatically pattern-matched into helpless cries of "LW is not a cult!" (even though that isn't what he's saying and isn't what he's trying to say), and this gets interpreted as, "LW is a cult." Seriously, any time you put two words together like that, people assume they're actually related.

Elsewise, the only thing I can think of is our similar demographics and a horribly mistaken impression that we all agree on everything (I don't know where this comes from).

Criticism rocks dude.

Okay. (I hope you didn't interpret anything I said as meaning otherwise.)

Comment author: John_Maxwell_IV 15 March 2012 04:25:14AM *  4 points [-]

Point taken; I'll leave the issue alone for now.

Comment author: XiXiDu 15 March 2012 11:01:26AM *  5 points [-]

I have become fully convinced that even bringing it up is actively harmful.

No, it is not. A lack of self-criticism and evaluation is one of the reasons why people assign cult status to communities.

P.S. Posts with titles along the lines of 'Epistle to the New York Less Wrongians' don't help in reducing cultishness ;-)

(Yeah, I know it was just fun.)

Comment author: wedrifid 15 March 2012 03:52:19AM 13 points [-]

AAAAARRRGH! I am sick to death of this damned topic.

It looks a bit better if you consider the generalization in the intro to be mere padding around a post that is really about several specific changes that need to be made to the landing pages.

Comment author: John_Maxwell_IV 15 March 2012 11:46:50PM 3 points [-]

Unfortunately, Grognor reverts me every time I try to make those changes... Bystanders, please weigh in on this topic here.

Comment author: Vladimir_Nesov 16 March 2012 12:10:26AM *  3 points [-]

I didn't like your alternative for the "Many of us believe" line either, even though I don't like that line (it was what I came up with to improve on Luke's original text). To give the context: the current About page introduces twelve virtues with:

Many of us believe in the importance of developing qualities described in "Twelve Virtues of Rationality":

John's edit was to change it to:

For a brief summary of how to be rational, read the somewhat stylized "Twelve Virtues of Rationality":

P.S. I no longer supervise the edits to the wiki, but someone should...

Comment author: John_Maxwell_IV 16 March 2012 12:56:30AM *  0 points [-]

He didn't like my other three attempts at changes either... I could come up with 10 different ways of writing that sentence, but I'd rather let him make some suggestions.

Comment author: wedrifid 16 March 2012 02:51:29AM 3 points [-]

He didn't like my other three attempts at changes either... I could come up with 10 different ways of writing that sentence, but I'd rather let him make some suggestions.

If you made the suggestions here and received public support for one of them it wouldn't matter much what Grognor thought.

Comment author: John_Maxwell_IV 16 March 2012 03:01:41AM 0 points [-]

Why don't you make a suggestion?

Comment author: wedrifid 16 March 2012 03:04:48AM *  3 points [-]

*cough* Mine is 'delete the sentence entirely'. I never really liked that virtues page anyway!

Comment author: lessdazed 20 March 2012 07:50:08PM 1 point [-]

I entirely agree with this.

Comment author: John_Maxwell_IV 17 March 2012 05:52:39AM 1 point [-]

To be clear, you are in favor of leaving the virtues off of the about page, correct?

Comment author: wedrifid 17 March 2012 06:01:03AM 1 point [-]

For what it is worth, yes.

Comment author: John_Maxwell_IV 16 March 2012 03:06:17AM 2 points [-]

Sounds like a great idea.

Comment author: wedrifid 16 March 2012 02:49:33AM -1 points [-]

That change is less bad than the original but it is sometimes better to hold off on changes that may reduce the impetus for further improvement without quite satisfying the need.

Comment author: John_Maxwell_IV 16 March 2012 03:00:57AM 0 points [-]

To be honest, I don't have much energy left to fight this. I'd like to rethink the entire page, but if I have to fight tooth and nail for every sentence I won't.

Comment author: wedrifid 16 March 2012 02:49:46AM 0 points [-]

Who on earth is Grognor?

Comment author: Grognor 16 March 2012 05:46:19AM 5 points [-]
Comment author: Nisan 16 March 2012 05:27:49AM *  0 points [-]

In. Who in earth.

Comment author: wedrifid 16 March 2012 07:51:26AM 1 point [-]

In. Who in earth.

Is this a jest about Grognor sounding like the name of a dwarf or a mythical beast of the depths?

Comment author: Nisan 16 March 2012 03:42:33PM 2 points [-]

I'm afraid so.

Comment author: dbaupp 15 March 2012 02:48:49AM *  7 points [-]

Eliezer: Do not worship a hero! Do not trust!

Rationalwiki et al: LW is a personality cult around Eliezer because of so-and-so.

A widely revered figure who has written a million+ words that form the central pillars of LW and has been directly (or indirectly) responsible for bringing many people into the rationality memespace says "don't do X" so it is obvious that X must be false.

</sarcasm>

Dismissing accusations of a personality cult around Eliezer by saying Eliezer said "no personality cult" is a fairly poor way of going about it. Two key points:

  • saying "as a group, we don't worship Eliezer" doesn't guarantee that it is true (groupthink could easily suck us into ignoring evidence)
  • someone might interpret what Eliezer said as false modesty or an attempt to appear to be a reluctant saviour/messiah (i.e. using dark arts to suck people in)
Comment author: halcyon 20 August 2012 11:40:59AM *  0 points [-]

Actually, I believe the optimal utilitarian attitude would be to make fun of them. If you don't take them at all seriously, they will grow to doubt themselves. If you're persistently humorous enough, some of them, thinking themselves comedians, will take your side in poking fun at the rest. In time, LW will have assembled its own team of Witty Defenders responsible for keeping non-serious accusations at bay. This will ultimately lead to long pages of meaningless back and forth between underlings, allowing serious LWians to ignore these distracting subjects altogether. Also, the resulting dialogue will advertise the LW community, while understandably disgusting self-respecting thinkers of every description, thus getting them interested in evaluating the claims of LW on its own terms.

Personally, I think all social institutions are inevitably a bit cultish, (society = mob - negative connotations) and they all use similarly irrational mechanisms to shield themselves from criticism and maintain prestige. A case could be made that they have to, one reason being that most popular "criticism" is of the form "I've heard it said or implied that quality X is to be regarded as a Bad Thing, and property Y of your organization kind of resembles X under the influence of whatever it is that I'm smoking," or of equally abysmal quality. Heck, the United States government, the most powerful public institution in the world, is way more cultish than average. Frankly, more so than LW has ever been accused of being, to my knowledge. Less Wrong: Less cultish than America!