If you type "less wrong c" or "singularity institute c" into Google, you'll find that people are searching for "less wrong cult" and "singularity institute cult" with some frequency. (EDIT: Please avoid testing this out, so Google doesn't record your search and further reinforce those autocomplete suggestions. This kind of problem can be hard to get rid of. Click these instead: less wrong cult, singularity institute cult.)
There doesn't seem to be anyone arguing seriously that Less Wrong is a cult, but we do give some newcomers that impression.

I have several questions related to this:

  • Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?
  • If so, can you suggest any easy steps we could take to correct that impression?
  • Is it possible that there are aspects of the atmosphere here that are driving away intelligent, rationally inclined people who might otherwise be interested in Less Wrong?
  • Do you know anyone who might fall into this category, i.e. someone who was exposed to Less Wrong but failed to become an enthusiast, potentially due to atmosphere issues?
  • Is it possible that our culture might be different if these folks were hanging around and contributing? Presumably they are disproportionately represented among certain personality types.

If you visit any Less Wrong page for the first time in a cookies-free browsing mode, you'll see this message for new users:

Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

Here are the worst violators I see on that about page:

Some people consider the Sequences the most important work they have ever read.

Generally, if your comment or post is on-topic, thoughtful, and shows that you're familiar with the Sequences, your comment or post will be upvoted.

Many of us believe in the importance of developing qualities described in Twelve Virtues of Rationality: [insert mystical sounding description of how to be rational here]

And on the sequences page:

If you don't read the sequences on Mysterious Answers to Mysterious Questions and Reductionism, little else on Less Wrong will make much sense.

This seems obviously false to me.

These may not seem like cultish statements to you, but keep in mind that you are one of the ones who decided to stick around. The typical mind fallacy may be at work. Clearly there is some population that thinks Less Wrong seems cultish, as evidenced by Google's autocomplete, and these look like good candidates for things that make them think this.

We can fix this stuff easily, since they're both wiki pages, but I thought they were examples worth discussing.

In general, I think we could stand more community effort being put into improving our about page, which you can do now here. It's not that visible to veteran users, but it is very visible to newcomers. Note that it looks as though you'll have to click the little "Force reload from wiki" button on the about page itself for your changes to be published.

Cult impressions of Less Wrong/Singularity Institute
247 comments

AAAAARRRGH! I am sick to death of this damned topic. It has been done to death.

I have become fully convinced that even bringing it up is actively harmful. It reminds me of a discussion on IRC, about how painstakingly and meticulously Eliezer idiot-proofed the sequences, and it didn't work because people still manage to be idiots about it. It's because of the Death Spirals and the Cult Attractor sequence that people bring up the stupid "LW is a cult hur hur" meme, which would be great dramatic irony if you were reading a fictional version of the history of Less Wrong, since it's exactly what Eliezer was trying to combat by writing it. Does anyone else see this? Is anyone else bothered by:

Eliezer: Please, learn what turns good ideas into cults, and avoid it!
Barely-aware public: Huh, wah? Cults? Cults! Less Wrong is a cult!

&

Eliezer: Do not worship a hero! Do not trust!
RationalWiki et al.: LW is a personality cult around Eliezer because of so-and-so.

Really, am I the only one seeing the problem with this?

People thinking about this topic just seem to instantaneously fail basic sanity checks. I find it hard to believe that people even know what they're saying when they p... (read more)

LW doesn't do as much as I'd like to discourage people from falling into happy death spirals about LW-style rationality, like this. There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative. If I were Eliezer, I would run screaming in the other direction the moment I saw the first such person, but he seems to be okay with that. That's the main reason why I feel LW is becoming more cultish.

How do you distinguish a happy death spiral from a happy life spiral? Wasting one's life on a wild goose chase from spending one's life on a noble cause?

"I take my beliefs seriously, you are falling into a happy death spiral, they are a cult."

I guess you meant to ask, "how do you distinguish ideas that lead to death spirals from ideas that lead to good things?" My answer is that you can't tell by looking only at the idea. Almost any idea can become a subject for a death spiral if you approach it the wrong way (the way Will_Newsome wants you to), or a nice research topic if you approach it right.

(the way Will_Newsome wants you to),

I've recanted; maybe I should say so somewhere. I think my post on the subject was sheer typical mind fallacy. People like Roko and XiXiDu are clearly damaged by the "take things seriously" meme, and what it means in my head is not what it means in the heads of various people who endorse the meme.

There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative. If I were Eliezer, I would run screaming in the other direction the moment I saw the first such person

You mean when he saw himself in the mirror? :)

Seriously, do you think sacrificing one's life to help build FAI is wrong (or not necessarily wrong but not an ethical imperative either), or is it just bad PR for LW/SI to be visibly associated with such people?

I think it's not an ethical imperative unless you're unusually altruistic.

Also I feel the whole FAI thing is a little questionable from a client relations point of view. Rationality education should be about helping people achieve their own goals. When we meet someone who is confused about their goals, or just young and impressionable, the right thing for us is not to take the opportunity to rewrite their goals while we're educating them.

9Wei Dai
It's hard not to rewrite someone's goals while educating them, because one of our inborn drives is to gain the respect and approval of people around us, and if that means overwriting some of our goals, well, that's a small price to pay as far as that part of our brain is concerned. For example, I stayed for about a week at the SIAI house a few years ago when attending the decision theory workshop, and my values shifted in obvious ways just by being surrounded by more altruistic people and talking with them. (The effect largely dissipated after I left, but not completely.) Presumably the people they selected for the rationality mini-camp were already more altruistic than average, and the camp itself pushed some of them to the "unusually altruistic" level. Why should SIAI people have qualms about this (other than possible bad PR)?
-4TheAncientGeek
Pointing out that religious/cultic value rewriting is hard to avoid hardly refutes the idea that LW is a cult.
9Vladimir_Nesov
I don't think "unusually altruistic" is a good characterization of "doesn't value personal preferences about some life choices more than the future of humanity"...
6cousin_it
Do you believe most people are already quite altruistic in that sense? Why? It seems to me that many people give lip service to altruism, but their actions (e.g. reluctance to donate to highly efficient charities) speak otherwise. I think rationality education should help people achieve the goals they're already trying to achieve, not the goals that the teacher wants them to achieve.

I think rationality education should help people achieve the goals they're already trying to achieve, not the goals that the teacher wants them to achieve.

False dichotomy. Humans are not automatically strategic; we often act on urges, not goals, and even our explicitly conceptualized goals can be divorced from reality, perhaps more so than the urges. There are general-purpose skills that have an impact on behavior (and on explicit goals) by correcting errors in reasoning, not skills specifically aimed at aligning students' explicit goals with those of their teachers.

Rationality is hard to measure. If LW doesn't make many people more successful in mundane pursuits but makes many people subscribe to the goal of FAI, that's reason to suspect that LW is not really teaching rationality, but rather something else.

(My opinions on this issue seem to become more radical as I write them down. I wonder where I will end up!)

I didn't say anything about "rationality". Whether the lessons help is a separate question from whether they're aimed at correcting errors of reasoning or at shifting one's goals in a specific direction. The posts I linked also respond to the objection about people "giving lip service to altruism" but doing little in practice.

3cousin_it
Yes, the reasoning in the linked posts implies that deep inside, humans should be as altruistic as you say. But why should I believe that reasoning? I'd feel a lot more confident if we had an art of rationality that made people demonstrably more successful in mundane affairs and also, as a side effect, made some of them support FAI. If we only get the side effect but not the main benefit, something must be wrong with the reasoning.
7Vladimir_Nesov
This is not what the posts are about, even if this works as one of the conclusions. The idea that urges and goals should be distinguished, for example, doesn't say what your urges or goals should be; it stands separately on its own. There are many such results, and ideas such as altruism or the importance of FAI are only a few among them. Do these ideas demonstrate comparatively more visible measurable effect than the other ideas?
2William_Quixote
If prediction markets were legal, we could much more easily measure whether LW helped rationality. Just ask people to make n bets or predictions per month and see 1) if they did better than the population average and 2) if they improved over time. In fact, trying to get Intrade legalized in the US might be a very worthwhile project for just this reason (beyond all the general social reasons to like prediction markets).
4gwern
There is no need to wish or strive for regulatory changes that may never happen: I've pointed out in the past that non-money prediction markets generally are pretty accurate and competitive with money prediction markets, so money does not seem to be a crucial factor, just systematic tracking and judgment. (Being able to profit may attract some people, like me, but the fear of loss may also serve as a potent deterrent to users.) I have written at length about how I believe prediction markets helped me, but I have been helped even more by the free, active, you-can-sign-up-right-now-and-start-using-it-really-right-now http://www.PredictionBook.com. I routinely use LW-related ideas and strategies in predicting, and I believe my calibration graph reflects genuine success at predicting.
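
For anyone who wants to try this without signing up anywhere, here is a minimal sketch of the bookkeeping a calibration check involves: record each prediction as a stated probability plus the eventual outcome, then compare each confidence level with how often those predictions came true. (The function and the sample log below are hypothetical illustrations, not PredictionBook's code or data.)

```python
# Minimal calibration check over a personal prediction log (hypothetical).
from collections import defaultdict

def calibration(predictions):
    """Map each stated probability to the fraction of those predictions
    that actually came true."""
    by_confidence = defaultdict(list)
    for prob, happened in predictions:
        by_confidence[prob].append(happened)
    return {p: sum(outcomes) / len(outcomes)
            for p, outcomes in sorted(by_confidence.items())}

# Hypothetical log: (stated probability, did it happen?)
log = [(0.9, True), (0.9, True), (0.9, True), (0.9, False),
       (0.6, True), (0.6, False), (0.6, True)]
print(calibration(log))  # {0.6: 0.666..., 0.9: 0.75}
```

A well-calibrated predictor's observed frequencies land close to the stated probabilities; a persistent gap in one direction is the over- or underconfidence that a calibration graph makes visible.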
1cousin_it
Very nice idea, thanks! After some googling I found someone already made this suggestion in 2009.
0William_Quixote
If other people have suggested this before, there may be enough background support to make it worth following up on this idea. When I get home from work, I will post in the discussion forum to see if people would be interested in working to legalize prediction markets (like Intrade) in the US. [EDITED: shortly after making this post, I saw Gwern's post above suggesting that an alternative like PredictionBook would be just as good. As a result I did not make a post about legalizing prediction markets and instead tried PredictionBook for a month and a half. After this trial, I still think that making a push to legalize prediction markets would be worthwhile]
5Vaniver
It doesn't sound like you know all that many humans, then. In most times and places, the "future of humanity" is a signal that someone shouldn't be taken seriously, not an actual goal.
2Vladimir_Nesov
I was talking about the future of humanity, not the "future of humanity" (a label that can be grossly misinterpreted).
0Luke_A_Somers
... or you estimate the risk to be significant and you want to live past the next N years.

I don't think this calculation works out, actually. If you're purely selfish (don't care about others at all), and the question is whether to devote your whole life to developing FAI, then it's not enough to believe that the risk is high (say, 10%). You also need to believe that you can make a large impact. Most people probably wouldn't agree to surrender all their welfare just to reduce the risk to themselves from 10% to 9.99%, and realistically their sacrifice won't have much more impact than that, because it's hard to influence the whole world.
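
To make that arithmetic explicit (with made-up numbers, as a sketch of the argument above rather than anyone's actual estimates): write V for the selfish value of surviving the catastrophe and W for the personal welfare surrendered by devoting one's whole life to the cause.

```latex
\text{expected selfish gain} = \Delta p \cdot V = (0.10 - 0.0999)\,V = 0.0001\,V,
\qquad
\text{worth it only if } 0.0001\,V > W, \text{ i.e. } V > 10{,}000\,W .
```

So a purely selfish agent takes the deal only if surviving is worth more than ten thousand times everything given up, which is why the argument usually needs a hefty dose of altruism to go through.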

-4Tripitaka
Funny in which way? Do you want to avoid an automatic "macro-of-denial" invocation, or are you afraid of them joining Eliezer's ever-growing crowd of memetically subverted FAI-lers?
0cousin_it
The latter, I think.
0[anonymous]
If I teach rationality and deliberately change my students' goals, that means I fail as a teacher. It's even worse if their new goal happens to be donating all their money to my organization.
5XiXiDu
I have always been extremely curious about this. Do people really sacrifice their lives, or is it largely just empty talk? It seems like nobody is doing anything they wouldn't have done anyway. I mean, I don't think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks. They are just the kind of people who enjoy doing what they do. Are there people who'd rather play games all day but sacrifice their lives to solve Friendly AI?

If developing AGI were an unequivocally good thing, as Eliezer used to think, then I guess he'd be happily developing AGI instead of trying to raise the rationality waterline. I don't know what Luke would do if there were no existential risks, but I don't think his current administrative work is very exciting for him. Here's a list of people who want to save the world and are already changing their life accordingly. Also there have been many LW posts by people who want to choose careers that maximize the probability of saving the world. Judge the proportion of empty talk however you want, but I think there are quite a few fanatics.

5Will_Newsome
Indeed, Eliezer once told me that he was a lot more gung-ho about saving the world when he thought it just meant building AGI as quickly as possible.

I don't think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks. They are just the kind of people who enjoy doing what they do.

I think at one point Eliezer said that, if not for AGI/FAI/singularity stuff, he would probably be a sci-fi writer. Luke explicitly said that when he found out about x-risks he realized that he had to change his life completely.

I have always been extremely curious about this. Do people really sacrifice their lives, or is it largely just empty talk?

I sacrificed some very important relationships and the life that could have gone along with them so I could move to California, and the only reason I really care about humans in the first place is because of those relationships, so...

This is the use of metaness: for liberation—not less of love but expanding of love beyond local optima.

— Nick Tarleton's twist on T.S. Eliot

9Tripitaka
1. Due to comparative advantage, not changing much is actually a relatively good, straightforward strategy: just farm and redirect money.
2. As an example of these Altruistic Ones, user Rain has been mentioned, so they are out there. May they all be praised!
3. Factor in time and demographics. A lot of LWers are young people looking for ways to make money; they are not able to spend much yet, and haven't had much impact yet. Time will have to show whether they stay true to their goals, or whether they are tempted to go down the vicious path of always-growing investments into status.
0drethelin
I'm too irreparably lazy to actually change my life but my charitable donations are definitely affected by believing in FAI.
3[anonymous]
Sacrificing or devoting? Those are different things. If FAI succeeds they will have a lot more life to party than they would have otherwise so devoting your life to FAI development might be a good bet even from a purely selfish standpoint.
0[anonymous]
Pascal? Izzat you?
0[anonymous]
That comment doesn't actually argue for contributing to FAI development. So I guess I'm not Pascal (damn).
1[anonymous]
You probably don't wanna be Pascal anyway. I'm given to understand he's been a metabolic no-show for about 350 years.
1Grognor
I agree entirely. That post made me go "AAAH" and its rapid karma increase at first made me go "AAAAHH"

My post was mostly about how to optimize appearances, with some side speculation on how our current appearances might be filtering potential users. I agree LW rocks in general. I think we're mostly talking past each other; I don't see this discussion post as fitting into the same genre of "serious LW criticism" as the other stuff you link to.

In other words, I'm talking about first impressions, not in-depth discussions.

I'd be curious where you got the idea that writing the cult sequence was what touched off the "LW cult" meme. That sounds pretty implausible to me. Keep in mind that no one who is fully familiar with LW is making this accusation (that I know of), but it does look like it might be a reaction that sometimes occurs in newcomers.

Let's keep in mind that LW being bad is a logically distinct proposition, and if it is bad, we want to know it (since we want to know what is true, right?).

And if we can make optimizations to LW culture to broaden participation from intelligent people, that's also something we want to do, right? Although, on reflection, I'm not sure I see an opportunity for improvement where this is concerned, except maybe on the wiki (but I do think ... (read more)

My post was mostly about how to optimize appearances, with some side speculation on how our current appearances might be filtering potential users.

Okay.

If we want to win, it might not be enough to have a book length document explaining why we're not a cult. We might have to play the first impressions game as well.

I said stop talking about it and implied that maybe it shouldn't have been talked about so openly in the first place, and here you are talking about it.

I'd be curious where you got the idea that writing the cult sequence was what touched off the "LW cult" meme.

Where else could it have come from? Eliezer's extensive discussion of cultish behavior gets automatically pattern-matched into helpless cries of "LW is not a cult!" (even though that isn't what he's saying and isn't what he's trying to say), and this gets interpreted as, "LW is a cult." Seriously, any time you put two words together like that, people assume they're actually related.

Elsewise, the only thing I can think of is our similar demographics and a horribly mistaken impression that we all agree on everything (I don't know where this comes from).

Criticism rocks dude.

Okay. (I hope you didn't interpret anything I said as meaning otherwise.)

8John_Maxwell
Point taken; I'll leave the issue alone for now.
5Antisuji
Ya know, if LW and SIAI are serious about optimizing appearances, they might consider hiring a Communications professional. PR is a serious skill and there are people who do it for a living. Those people tend to be on the far end of the spectrum of what we call neurotypical here. That is, they are extremely good at modeling other people, and therefore predicting how other people will react to a sample of copy. I would not be surprised if literally no one who reads LW regularly could do the job adequately. Edit to add: it's nice to see that they're attempting to do this, but again, LW readership is probably the wrong place to look for this kind of expertise.
8wedrifid
People who do this for a living (effectively) cost a lot of money. Given the budget of SIAI, putting a communications professional on the payroll at market rates represents a big investment. Transitioning a charity to a state where a large amount of income goes into improving perception (and so securing more income) is a step not undertaken lightly.

It's at least plausible that a lot of the people who can be good for SIAI would be put off more by professional marketing than by science fiction-flavored weirdness.

2Antisuji
That's a good point. I'm guessing though that there's a lot of low hanging fruit, e.g. a front page redesign, that would represent a more modest (and one-time) expense than hiring a full-time flack. In addition to costing less this would go a long way to mitigate concerns of corruption. Let's use the Pareto Principle to our advantage!

AAAAARRRGH! I am sick to death of this damned topic.

It looks a bit better if you consider the generalization in the intro to be mere padding around a post that is really about several specific changes that need to be made to the landing pages.

5John_Maxwell
Unfortunately, Grognor reverts me every time I try to make those changes... Bystanders, please weigh in on this topic here.
3Vladimir_Nesov
I didn't like your alternative for the "Many of us believe" line either, even though I don't like that line (it was what I came up with to improve on Luke's original text). To give the context: the current About page introduces twelve virtues with: John's edit was to change it to: P.S. I no longer supervise the edits to the wiki, but someone should...
0John_Maxwell
He didn't like my other three attempts at changes either... I could come up with 10 different ways of writing that sentence, but I'd rather let him make some suggestions.
4wedrifid
If you made the suggestions here and received public support for one of them it wouldn't matter much what Grognor thought.
0John_Maxwell
Why don't you make a suggestion?
5wedrifid
*cough* Mine is 'delete the sentence entirely'. I never really liked that virtues page anyway!
3John_Maxwell
Sounds like a great idea.
2lessdazed
I entirely agree with this.
2John_Maxwell
To be clear, you are in favor of leaving the virtues off of the about page, correct?
1wedrifid
For what it is worth, yes.
0John_Maxwell
Okay, thanks. One of the other wiki editors didn't think you meant that.
2Vladimir_Nesov
Whatever wedrifid actually meant is not "apparent consensus", given that there are just 2 upvotes on a statement where it wasn't apparent to the voters what he actually meant... Reverted, with a suggestion to escalate to a discussion post and vote more clearly. Also, this started from talking about bad wording, which is a separate question from leaving the section out altogether, so the hypothetical discussion post should distinguish those questions.
-2John_Maxwell
Okay.
-1wedrifid
That change is less bad than the original but it is sometimes better to hold off on changes that may reduce the impetus for further improvement without quite satisfying the need.
0John_Maxwell
To be honest, I don't have much energy left to fight this. I'd like to rethink the entire page, but if I have to fight tooth and nail for every sentence I won't.
-1wedrifid
Who on earth is Grognor?
7Grognor
Hi?
1Nisan
In. Who in earth.
1wedrifid
Is this a jest about Grognor sounding like the name of a dwarf or a mythical beast of the depths?
4Nisan
I'm afraid so.

A rambling, cursing tirade against a polite discussion of things that might be wrong with the group (or perceptions of the group) doesn't improve my perception of the group. I have to say, I have a significant negative impression from Grognor's response here. In addition to the tone of his response, a few things that added to this negative impression were:

"how painstakingly and meticulously Eliezer idiot-proofed the sequences, and it didn't work because people still manage to be idiots about it"

Again, the name dropping of Our Glorious Leader Eliezer, long may He reign. (I'm joking here for emphasis.)

"LW is a cult hur hur"

People might not be thinking completely rationally, but this kind of characterization of people who have negative opinions of the group doesn't win you any friends.

"since it's exactly what Eliezer was trying to combat by writing it."

There's Eliezer again, highlighting his importance as the group's primary thought leader. This may be true, and probably is, but highlighting it all the time can lead people to think this is cultish.

5XiXiDu
Thanks for saying that I significantly helped to make Less Wrong look less cultish ;-) By the way...
2jimrandomh
Actually, I believe what he said was that you generated evidence that Less Wrong is not cultish, which makes it look more cultish to people who aren't thinking carefully.
5dbaupp
A widely revered figure who has written a million+ words that form the central pillars of LW, and who has been directly (or indirectly) responsible for bringing many people into the rationality memespace, says "don't do X", so it is obvious that X must be false. Dismissing accusations of a personality cult around Eliezer by saying Eliezer said "no personality cult" is a fairly poor way of going about it. Two key points:

  • saying "as a group, we don't worship Eliezer" doesn't guarantee that it is true (groupthink could easily suck us into ignoring evidence)
  • someone might interpret what Eliezer said as false modesty or an attempt to appear to be a reluctant saviour/messiah (i.e. using dark arts to suck people in)
2epicureanideal
"I have become fully convinced that even bringing it up is actively harmful." What evidence leads you to this conclusion? Can you provide evidence to support this characterization? Can you provide evidence to support this characterization? I would like to see some empirical analysis of the points made here and by the original poster. We should gather some data about perceptions from real users and use that to inform future discussion on this topic. I think we have a starting point in the responses to this post, and comments in other posts could probably be mined for information, but we should also try to find some rational people who are not familiar with less wrong and introduce them to it and ask them for their impressions (from someone acting like they just found the site, are not affiliated with it, and are curious about their friend's impressions, or something like that).
2XiXiDu
No, it is not. A lack of self-criticism and evaluation is one of the reasons why people assign cult status to communities. P.S. Posts with titles along the lines of 'Epistle to the New York Less Wrongians' don't help in reducing cultishness ;-) (Yeah, I know it was just fun.)
0halcyon
Actually, I believe the optimal utilitarian attitude would be to make fun of them. If you don't take them at all seriously, they will grow to doubt themselves. If you're persistently humorous enough, some of them, thinking themselves comedians, will take your side in poking fun at the rest. In time, LW will have assembled its own team of Witty Defenders responsible for keeping non-serious accusations at bay. This will ultimately lead to long pages of meaningless back and forth between underlings, allowing serious LWians to ignore these distracting subjects altogether. Also, the resulting dialogue will advertize the LW community, while understandably disgusting self-respecting thinkers of every description, thus getting them interested in evaluating the claims of LW on its own terms. Personally, I think all social institutions are inevitably a bit cultish, (society = mob - negative connotations) and they all use similarly irrational mechanisms to shield themselves from criticism and maintain prestige. A case could be made that they have to, one reason being that most popular "criticism" is of the form "I've heard it said or implied that quality X is to be regarded as a Bad Thing, and property Y of your organization kind of resembles X under the influence of whatever it is that I'm smoking," or of equally abysmal quality. Heck, the United States government, the most powerful public institution in the world, is way more cultish than average. Frankly, more so than LW has ever been accused of being, to my knowledge. Less Wrong: Less cultish than America!

The top autocompletes for "Less Wrong" are

  • sequences
  • harry potter
  • meetups

These are my (logged-in) Google results for searching "Less Wrong_X" for each letter of the alphabet (some duplicates appear):

  • akrasia
  • amanda knox
  • atheism
  • australia
  • blog
  • bayes
  • basilisk
  • bayes theorem
  • cryonics
  • charity
  • cult
  • discussion
  • definition
  • decoherence
  • decision theory
  • epub
  • evolutionary psychology
  • eliezer yudkowsky
  • evidence
  • free will
  • fanfiction
  • fanfic
  • fiction
  • gender
  • games
  • goals
  • growing up is hard
  • harry potter
  • harry potter and the methods of rationality
  • how to be happy
  • hindsight bias
  • irc
  • inferential distance
  • iq
  • illusion of transparency
  • joint configurations
  • joy in the merely real
  • kindle
  • amanda knox
  • lyrics
  • luminosity
  • lost purposes
  • leave a line of retreat
  • meetup
  • mobi
  • meditation
  • methods of rationality
  • newcomb's problem
  • nyc
  • nootropics
  • neural categories
  • optimal employment
  • overcoming bias
  • open thread
  • outside the laboratory
  • procrastination
  • pdf
  • polyamory
  • podcast
  • quantum physics
  • quotes
  • quantum mechanics
  • rationality quotes
  • rationality quotes
  • rationalwiki
  • reading list
  • rationality
  • sequences
  • survey
  • survey results
  • sequences pdf
  • twitter
  • textbooks
  • three worlds collide
  • toronto
  • ugh fields
  • universal fire
  • v
... (read more)
0timtyler
Luke's link to How Cults work is pretty funny.

Google's autocomplete has a problem, which has produced controversy in other contexts: when people want to know whether X is trustworthy, the most informative search they can make is "X scam". Generally speaking, they'll find no results and that will be reassuring. Unfortunately, Google remembers those searches, and presents them later as suggestions - implying that there might be results behind the query. Once the "X scam" link starts showing up in the autocomplete, people who weren't really suspicious of X click on it, so it stays there.

6beoShaffer
Personal anecdote warning. I semi-routinely google the phrase "X cult" when looking into organizations.

Does this ever work?

3beoShaffer
I think so, but it's hard to say. I look into organizations infrequently enough that "semi-routinely" leaves me with a very small sample size. The one organization that had prominent cult results (not going to name it for obvious reasons) does have several worrying qualities, and they seem related to why it was called a cult. -edit minor grammar/style fix
2John_Maxwell
Thanks; I updated the post to reflect this.

Eliezer addressed this in part with his "Death Spiral" essay, but there are some features of LW/SI that are strongly correlated with cultishness, other than the ones that Eliezer mentioned, such as fanaticism and following the leader:

  • Having a house where core members live together.
  • Asking followers to completely adjust their thinking processes to include new essential concepts, terminology, and so on, down to the lowest level of their understanding of reality.
  • Claiming that only if you carry out said mental adjustment can you really understand the most important parts of the organization's philosophy.
  • Asking for money for a charity, particularly one which does not quite have the conventional goals of a charity, and claiming that one should really be donating a much larger percentage of one's income than most people donate to charity.
  • Presenting an apocalyptic scenario including extreme bad and good possibilities, and claiming to be the best positioned to deal with it.
  • [Added] Demanding that you leave any (other) religion.

Sorry if this seems over-the-top. I support SI. These points have been mentioned, but has anyone suggested how to deal with them? Simply ignoring the problem does not seem to be the solution; nor does loudly denying the charges; nor changing one's approach just for appearances.

Perhaps consider adding to the list the high fraction of revenue that ultimately goes to paying staff wages.

Oh yes, and the fact that the leader wants to SAVE THE WORLD.

5Bongo
About a third in 2009, the last year for which we have handy data.
1timtyler
Practically all of it goes to them or their "associates" - by my reckoning. In 2009 some was burned on travel expenses and accommodation, some was invested - and some was stolen. Who was actually helped? Countless billions in the distant future - supposedly.
8dbaupp
What else should it go to? (Under the assumption that SI's goals are positive.) As Larks said above, they are doing thought work: they are not trying to ship vast quantities of food or medical supplies. The product of SI is the output from their researchers, the only way to get more output is to employ more people (modulo improving the output of the current researchers, but that is limited).
6timtyler
So, to recap, this is a proposed part of a list of ways in which the SIAI resembles a cult. It redistributes economic resources from the "rank and file" members up the internal hierarchy without much expenditure on outsiders - just like many cults do.
5dbaupp
(Eh. Yes, I think I lost track of that a bit.) Keeping that in mind: SI has a problem, because acting to avoid the appearance of existing to give money to the upper ranks means that they can't pay their researchers. There are three broad classes of solutions to this (that I can see):

  • Give staff little to no compensation for their work
  • Use tricky tactics to try to conceal how much money goes to the staff
  • Try to explain to everyone why such a large proportion of the money goes to the staff

All of those seem suboptimal.
5epicureanideal
Why was this downvoted instead of responded to? Downvoting people who are simply stating negative impressions of the group doesn't improve impressions of the group.
5JoshuaFox
Most organizations spend most of their money on staff. What else could you do with it? Paying fellowships for "external staff" is a possibility. But in general, good people are exactly what you need.
2timtyler
Often goods or needy beneficiaries are also involved. Charity spending is sometimes classified into:

  • Program Expenses
  • Administrative Expenses
  • Fundraising Expenses

This can be used as a heuristic for identifying good charities. Not enough in category 1 and too much in categories 2 and 3 is often a bad sign.
Larks120

But they're not buying malaria nets, they're doing thought-work. Do you expect to see an invoice for TDT?

Quite apart from the standard complaint about how awful a metric that is.

-1epicureanideal
And yet there are plenty of things that don't cost much money that they could be doing right now, things I have previously mentioned to SIAI staff and will not repeat (edit: in detail) because it might interfere with my own similar efforts in the near future. Basically I'm referring to public outreach, bringing in more members of the academic community, making people aware that LW even exists (I wasn't aware of it until I randomly ran into a few LWers in person), etc. What's the reason for downvoting this? Please comment.
3epicureanideal
As I've discussed with several LWers in person, including some staff and visiting fellows, one of the things I disliked about LW/SIAI was that so much of the resources of the organization go to pay the staff. They seemingly wouldn't even consider proposals to spend a few hundred dollars on other things because they claimed it was "too expensive".
5TheAncientGeek
Add:

  • Leader(s) are credited with expertise beyond that of conventional experts, in subjects they are not conventionally qualified in.
  • Studying conventional versions of subjects is deprecated in favour of in-group versions.
3JoshuaFox
Also: Associated with non-standard and non-monogamous sexual practices. (Just some more pattern-matching on top of what you see in the parent and grandparent comment. I don't actually think this is a strong positive indicator.)
3TheAncientGeek
The usual version of that indicator is "leader has sex with followers".
2MTGandP
One fundamental difference between LW and most cults is that LW tells you to question everything, even itself.
7gwern
Most, but not all. The Randians come to mind. Even the Buddha encouraged people to be critical, but that doesn't seem to have stopped the cults. I was floored to learn a few weeks ago that Buddhism has formalized even the point at which you stop doubting! When you stop doubting, you become a Sotāpanna; a Sotāpanna is marked by abandoning '3 fetters', the second fetter according to Wikipedia being sceptical doubt. As well, as unquestioningness has become a well-known trait of cults, cults tend to try to hide it. Scientology hides the craziest dogmas until you're well and hooked, for example.
2[anonymous]
If the Randians are a cult, LW is a cult. Like the others, the members just think it's unique in being valid.
2Desrtopa
If a person disagrees with Rand about a number of key beliefs, do they still count as a Randian?
0Peterdjones
If they don't count as an Orthodox Randian, they can always become a Liberal Randian
0[anonymous]
That depends for the largest part on what "a number of key beliefs" is.
-2MugaSofer
Could you elaborate on this?
0MTGandP
So there comes a point in Buddhism where you're not supposed to be skeptical anymore. And Objectivists aren't supposed to question Ayn Rand.
1Mitchell_Porter
Would it be productive to be skeptical about whether your login really starts with the letter "M"? Taking an issue off the table and saying, we're done with that, is not in itself a bad sign. The only question is whether they really do know what they think they know. I personally endorse the very beginning of Objectivist epistemology - I mean this: "Existence exists—and the act of grasping that statement implies two corollary axioms: that something exists which one perceives and that one exists possessing consciousness, consciousness being the faculty of perceiving that which exists." It's the subsequent development which is a mix of further gemlike insights, paths not taken, and errors or uncertainties that are papered over. In the case of Buddhism, one has the usual problem of knowing, at this historical distance, exactly what psychological and logical content defined "enlightenment". One of its paradoxes is that it sounds like the experience of a phenomenological truth, and yet the key realization is often presented as the discovery that there is no true self or substantial self. I would have thought that achieving reflective consciousness implied the existence of a reflector, just as in the Objectivist account. Then again, reflection can also produce awareness that traits with which you have identified yourself are conditioned and contingent, so it can dissolve a naive concept of self, and that sounds more like the Buddhism we hear about today. The coexistence of a persistent observing consciousness, and a stream of transient identifications, in certain respects is like Hinduism; though the Buddhists can strike back by saying that the observing consciousness is not eternal and free of causality, it too exists only if it has been caused to exist. So claims to knowledge, and the existence of a stage where you no longer doubt that this really is knowledge, and get on with developing the implications, do not in themselves imply falsity. In a systematic philosophy
0timtyler
There seems to be some detailed substructure there - which I go over here.
-1timtyler
Not just a cult - an END OF THE WORLD CULT. My favourite documentary on the topic: The End of The World Cult.

Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?

I only discovered LW about a week ago, and I got the "cult" impression strongly at first, but decided to stick around anyway because I am having fun talking to you guys, and am learning a lot. The cult impression faded once I carefully read articles and threads on here and realized that they really are rational, well argued concepts rather than blindly followed dogma. However, it takes time and effort to realize this, and I suspect that the initial appearance of a cult would turn many people off from putting out that time and effort.

For a newcomer expecting discussions about practical ways to overcome bias and think rationally, the focus on things like transhumanism and singularity development seems very weird; those appear to be pseudo-religious ideas with no obvious connection to rationality or daily life.

AI and transhumanism are very interesting, but they are concepts distinct from rationality. I suggest moving singularity- and AI-specific articles to a different site, and removing the Singularity Institute and FHI links from the navigation bar.

There's also the pro... (read more)

Random nitpick: a substantial portion of LW disagrees with Eliezer on various issues. If you find yourself actually agreeing with everything he has ever said, then something is probably wrong.

Slightly less healthy for overall debate is that many people automatically attribute a toxic/weird meme to Eliezer whenever it is encountered on LW, even in instances where he has explicitly argued against it (such as utility maximization in the face of very small probabilities).

Upvoted for sounding a lot like the kinds of complaints I've heard people say about LW and SIAI.

There is a large barrier to entry here, and if we want to win more, we can't just blame people for not understanding the message. I've been discussing with a friend what is wrong with LW pedagogy (though he admits that it is certainly getting better). To paraphrase his three main arguments:

  • We often use nomenclature without necessary explanation for a general audience. Sure, we make generous use of hyperlinks, but without some effort to bridge the gap in the body of our text, we aren't exactly signalling openness or friendliness.

  • We have a tendency to preach to the converted. Or as the friend said:

    It's that classic mistake of talking in a way where you're convincing or explaining something to yourself or the well-initiated instead of laying out the roadwork for foreigners.

    He brought up an example for how material might be introduced to newly exposed folk.

    If This American Life explained the financial crisis in an hour so that four million people improved on a written test on the subject, it's clear you can explain complicated material from near-scratch.

The curse of knowledg... (read more)

If This American Life explained the financial crisis in an hour so that four million people improved on a written test on the subject, it's clear you can explain complicated material from near-scratch.

That's an inspiring goal, but it might be worth pointing out that the This American Life episode was extraordinary-- when I heard it, it seemed immediately obvious that this was the most impressively clear and efficient hour I'd heard in the course of a lot of years of listening to NPR.

I'm not saying it's so magical that it can't be equaled, I'm saying that it might be worth studying.

Here's what an outsider might see:

"doomsday beliefs" (something "bad" may happen eschatologically, and we must work to prevent this): check

a gospel (The Sequences): check

vigorous assertions of untestable claims (Everett interpretation): check

a charismatic leader extracting a living from his followers: check

is sometimes called a cult: check

This is enough to make up a lot of minds, regardless of any additional distinctions you may want to make, sadly.

5[anonymous]
But an outsider would have to spend some time here to see all those things. If they think LW is accurately described by the c-word even after getting acquainted with the site, there might be no point in trying to change their minds. It's better to focus on people who are discouraged by first impressions.
2advancedatheist
I recently read an article about Keith Raniere, the founder of a cult called NXIVM (pronounced "nexium"): http://www.timesunion.com/local/article/Secrets-of-NXIVM-2880885.php Raniere reminds me of Yudkowsky, especially after reading cult expert Rick Ross's assessment of Raniere:
gRR190

I'm here for only a couple of months, and I didn't have any impression of cultishness. I saw only a circle of friends doing a thing together, and very enthusiastic about it.

What I also did see (and still do) is specific people just sometimes being slightly crazy, in a nice way. As in: Eliezer's treatment of MWI. Or way too serious fear of weird acausal dangers that fall out of currently best decision theories.
Note: this impression is not because of craziness of the ideas, but because of taking them too seriously too early. However, the relevant posts always have sane critical comments, heavily upvoted.

I'm slightly more alarmed by posts like How would you stop Moore's Law?. I mean, seriously thinking of AI dangers is good. Seriously considering nuking Intel's fabs in order to stop the dangers is... not good.

3Luke_A_Somers
Agreed, except the treatment of MWI does not seem the least bit crazy to me. But what do I know - I'm a crazy physicist.
5roystgnr
The conclusions don't seem crazy (well, they seem "crazy-but-probably-correct", just like even the non-controversial parts of quantum mechanics), but IIRC the occasional emphasis on "We Have The One Correct Answer And You All Are Wrong" rang some warning bells. On the other hand: Rationality is only useful to the extent that it reaches conclusions that differ from e.g. the "just believe what everyone else does" heuristic. Yet when any other heuristic comes up with new conclusions that are easily verified, or even new conclusions which sound plausible and aren't disproveable, "just believe what everyone else does" quickly catches up. So if you want a touchstone for rationality in an individual, you need to find a question for which rational analysis leads to an unverifiable, implausible sounding answer. Such a question makes a great test, but not such a great advertisement...
0Dmytry
Choosing between mathematically equivalent interpretations adds 1 bit of complexity that doesn't need to be added. Now, if EY had derived the Born probabilities from first principles, that'd be quite interesting.
2wedrifid
That's a positive impression. People really look that enthusiastic and well bonded?
8gRR
Yes to well bonded. People here seem to understand each other far better than average on the net, and it is immediately apparent. Enthusiastic is the wrong word, I suppose. I meant sure of doing a good thing, happy to be doing it, etc., not in the sense of applauding and cheering.
4wedrifid
Thank you. It is good to be reminded that these things are relative. Sometimes I forget to compare interactions here to others on the internet, and instead compare them to interactions with people as I would prefer them to be, or even just interactions with people I know in person (whom I have rather ruthlessly selected for not being annoying).

Speaking for myself, I know of at least four people who know of Less Wrong/SI but are not enthusiasts, possibly due to atmosphere issues.

An acquaintance of mine attends Less Wrong meetups and describes most of his friends as being Less Wrongers, but doesn't read Less Wrong and privately holds reservations about the entire singularity thing, saying that we can't hope to say much about the future more than 10 years in advance. He told me that one of his coworkers is also skeptical of the singularity.

A math student/coder I met at an entrepreneurship event told me Less Wrong had good ideas but was "too pretentious".

I was interviewing for an internship once, and the interviewer and I realized we had a mutual acquaintance who was a Less Wronger and SI donor. He asked me if I was part of that entire group, and I said yes. His attitude was a bit derisive.

5timtyler
The FHI are trying to do a broadly similar thing from within academia. They seem less kooky and cultish - probably as a result of trying harder to avoid cultishness.
1[anonymous]
I don't know why you would assume that it's "probably as a result of trying harder to avoid cultishness." My prior is that they just don't seem cultish because academics are often expected to hold unfamiliar positions.
5BrandonReinhart
I will say that I feel 95% confident that SIAI is not a cult because I spent time there (mjcurzi was there also), learned from their members, observed their processes of teaching rationality, hung out for fun, met other people who were interested, etc. Everyone involved seemed well meaning, curious, critical, etc. No one was blindly following orders. In the realm of teaching rationality, there was much agreement it should be taught, some agreement on how, but total openness to failure and finding alternate methods. I went to the minicamp wondering (along with John Salvatier) whether the SIAI was a cult and obtained lots of evidence to push me far away from that position. I wonder if the cult accusation in part comes from the fact that it seems too good to be true, so we feel a need for defensive suspicion. Rationality is very much about changing one's mind and thinking about this we become suspicious that the goals of SIAI are to change our minds in a particular way. Then we discover that in fact the SIAI's goals (are in part) to change our minds in a particular way so we think our suspicions are justified. My model tells me that stepping into a church is several orders of magnitude more psychologically dangerous than stepping into a Less Wrong meetup or the SIAI headquarters. (The other 5% goes to things like "they are a cult and totally duped me and I don't know it", "they are a cult and I was too distant from their secret inner cabals to discover it", "they are a cult and I don't know what to look for", "they aren't a cult but they want to be one and are screwing it up", etc. I should probably feel more confident about this than 95%, but my own inclination to be suspicious of people who want to change how I think means I'm being generous with my error. I have a hard time giving these alternate stories credit.)
4daenerys
I would consider myself a pretty far outlier on LessWrong (as a female, ENFP (people-person, impulsive/intuitive), Hufflepuff type). So on one hand, my opinion may mean less, because I am not generally the "type" of person associated with LW. On the other hand, if you want to expand LW to more people, then I think some changes need to be made for other "types" of people to also feel comfortable here. Along with the initial "cult" impression (which eventually dissipates, IMO), what threw me most was the harshness of the forums. I've been on here for about 4 months now, and it's still difficult for me to deal with. Also, I agree that topics like FAI and Singularitarianism aren't necessarily the best things to be discussing when trying to get people interested in rationality. I am well aware that the things that would make LW more comfortable for me and others like me would make it less comfortable for many of the current posters. So there is definitely a conflict of goals. Goal A: Grow LW and make rationality more popular. This requires making LW more "nice", and perhaps focused on instrumental rationality rather than Singularity and FAI issues. Goal B: Maintain the current culture and level of posts. This requires NOT significantly changing LW, and perhaps focusing more on the obscure posts that are extremely difficult for newer people to understand. AFAICT pursuit of either of these goals will be to the detriment of the other goal.
0NancyLebovitz
Could you be more specific about what comes off as harsh to you? If you'd rather address this as a private message, I'm still interested.
5John_Maxwell
What comes across as harsh to me: downvoting discussion posts because they're accidental duplicates or don't fit some idea of what a discussion post is supposed to be, a lot of the downvoting that goes on in general, and unbridled or curt disagreement (like Grognor's response to my post. You saw him cursing and yelling, right? I made this post because I thought the Less Wrong community could use optimization on the topics I wrote about, not because I wanted to antagonize anyone.)
0daenerys
PM'd response. General agreement with John below (which I didn't see until just now).
3Nisan
This person might have been in the same place as a math grad student I know. They read a little Less Wrong and were turned off. Then they attended a LW-style rationality seminar and responded positively, because it was more "compassionate". What they mean is this: A typical epistemology post on Less Wrong might sound something like (That's not a quote.) Whereas the seminar sounded more like Similarly, an instrumental-rationality post here might sound like Whereas the seminar sounds more like Of course, both approaches are good and necessary, and you can find both on Less Wrong.

Defending oneself from the cult accusation just makes it worse. Did you write a long excuse why you are not a cult? Well, that's exactly what a cult would do, isn't it?

To be accused is to be convicted, because the allegation is unfalsifiable.

Trying to explain something draws more attention to the topic, from which people will notice only the keywords. The more complex an explanation you make, especially if it requires reading some of your articles, the worse it gets.

The best way to win is to avoid the topic.

Unfortunately, someone else can bring up this topic and be persistent enough to make it visible. (Did it really happen on a sufficient scale, or are we just creating it in our own imagination?) Then, the best way is to make some short (not necessarily rational, but cached-thought convincing) answer and then avoid the topic. For example: "So, what exactly is that evil thing people on LW did? Downvote someone's forum post? Seriously, guys, you need to get some life."

And now, everybody stop worrying and get some life. ;-)

It could also help to make the site seem a bit less serious. For example, put more emphasis on instrumental rationality on the front page. People discu... (read more)

People discussing best diet habits don't seem like a doomsday cult, right?

I'm having trouble thinking up examples of cults, real or fictional, that don't take an interest in what their members eat and drink.

2epicureanideal
I don't think the best way to win is to avoid the topic. A healthy discussion of false impressions and how to correct them, or other failings a group may have, is a good indication to me of a healthy community. This post for example caused my impression of LW to increase somewhat, but some of the responses to it have caused my impression to decrease below its original level.
6Viliam_Bur
Then let's discuss "false impressions", or even better "impressions" in general, not focusing on cultishness, which cannot even be defined (because there are so many different kinds of cults). If we focus on making things right, we do not have to discuss the hundred ways they could go wrong. What is our community (trying to be) like? Friendly. In more senses of the word: we speak about ethics, we are trying to make a nice community, we try to help each other become stronger and win. Rational. Instead of superstition and gossip, we discuss how and why things really happen. Instead of happy death spirals, we learn about the world around us. Professional. By that I do not mean that everyone here is an AI expert, but that the things we do and value here (studying, politeness, exactness, science) are things that for most people correlate positively with their jobs, rather than with free time. Even when we have fun, it's adult people having fun. So where exactly in the space of human organizations do we belong? Which of the cached thoughts can be best applied to us? People will always try to fit us to some existing model (for example: cult), so why not choose this model rationally? I am not sure, but "educational NGO" sounds close. Science, raising the sanity waterline, et cetera. By seeming like something well-known, we become less suspicious, more normal.
1MugaSofer
This. Seriously, we need to start doing all the stuff recommended here, but this is perhaps the simplest and most immediate. Someone go do it.
-12Peterdjones
[-][anonymous]160

Some things that might be problematic:

We use the latest insights from cognitive science, social psychology, probability theory, and decision theory to improve our understanding of how the world works and what we can do to achieve our goals.

I don't think we actually do that. Insights, sure, but latest insights? Also, it's mostly cognitive science and social psychology. The insights from probability and decision theory are more in the style of the simple math of everything.

Want to know if your doctor's diagnosis is correct? It helps to understand Bayes' Theorem.

This might sound weird to someone who hasn't already read the classic example about doctors not being able to calculate conditional probabilities. It reads as though we believe Bayes' theorem magically grants us medical knowledge or something.
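For readers who haven't seen it, here is a minimal sketch of the kind of calculation that classic example turns on, with illustrative numbers only (1% base rate, 80% true-positive rate, 9.6% false-positive rate); the point is the structure of the reasoning, not the particular figures:

```python
# Posterior probability of disease given a positive test, via Bayes' theorem.
# Numbers are illustrative, in the spirit of the classic mammography example.
prior = 0.01            # P(disease): 1% of patients have the condition
true_positive = 0.80    # P(positive | disease)
false_positive = 0.096  # P(positive | no disease)

p_positive = true_positive * prior + false_positive * (1 - prior)
posterior = true_positive * prior / p_positive

print(f"P(disease | positive test) = {posterior:.1%}")  # roughly 7.8%, not 80%
```

The point the about page is gesturing at is just that the posterior is dominated by the base rate, not that reading Less Wrong substitutes for medical knowledge.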

[the link to rationality boot-camp]

I'm not a native speaker of English so I can't really tell, but I recall people complaining that the name 'boot-camp' is super creepy.

On the about page:

Introduce yourself to the community here.

That's not cultish-sounding, but it's unnecessarily imperative. The introduction thread is optional.

Disclaimer: My partner and I casually refer to LW meetups (which I attend and she does not) as "the cult".

That said, if someone asked me if LW (or SIAI) "was a cult", I think my ideal response might be something like this:

"No, it's not; at least not in the sense I think you mean. What's bad about cults is not that they're weird. It's that they motivate people to do bad things, like lock kids in chain-lockers, shun their friends and families, or kill themselves). The badness of being a cult is not being weird; it's doing harmful things — and, secondarily, in coming up with excuses for why the cult gets to do those harmful things. Less Wrong is weird, but not harmful, so I don't think it is a cult in the sense you mean — at least not at the moment.

"That said, we do recognize that "every cause wants to be a cult", that human group behavior does sometimes tend toward cultish, and that just because a group says 'Rationality' on the label does not mean it contains good thinking. Hoping that we're special and that the normal rules of human behavior don't apply to us, would be a really bad idea. It seems that staying self-critical, understanding how ... (read more)

What's bad about cults is not that they're weird. It's that they motivate people to do bad things...

People use "weird" as a heuristic for danger, and personally I don't blame them, because they have good Bayesian reasons for it. Breaking a social norm X is positively correlated with breaking a social norm Y, and the correlation is strong enough for most people to notice.

The right thing to do is to show enough social skill to avoid triggering the weirdness alarm. (Just as publishing in serious venues is the right way to avoid the "pseudoscience" label.) You cannot expect outsiders to make an exception for LW, suspend their heuristics, and explore the website deeply; that would be asking them to privilege a hypothesis.

If something is "weird", we should try to make it less weird. No excuses.

-1ryjm
So we should be Less Weird now? ;)
1Viliam_Bur
We should be winning. Less Weird is a good heuristic for winning (though a bad heuristic for a site name).

Often by the time a cult starts doing harmful things, its members have made both real and emotional investments that turn out to be nothing but sunk costs. To avoid ever getting into such a situation, people come up with a lot of ways to attempt to identify cults based on nothing more than the non-harmful, best-foot-forward appearance that cults first try to project. If you see a group using "love bombing", for instance, the wise response is to be wary - not because making people feel love and self-esteem is inherently a bad thing, but because it's so easily and commonly twisted toward ulterior motives.

2CasioTheSane
That is, until people start bombing factories to mitigate highly improbable existential risks.

Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?

What do you mean, "initially"? I am still getting that impression! For example, just count the number of times Eliezer (who appears to only have a single name, like Prince or Jesus) is mentioned in the other comments on this post. And he's usually mentioned in the context of "As Eliezer says...", as though the mere fact that it is Eliezer who says these things was enough.

The obvious counter-argument to the above is, "I like the things Eliezer says because they make sense, not because I worship him personally", but... well... that's what one would expect a cultist to say, no?

Less Wrongers also seem to have their own vocabulary ("taboo that term or risk becoming mind-killed, which would be un-Bayesian"). We spend a lot of time worrying about doomsday events that most people would consider science-fictional (at best). We also cultivate a vaguely menacing air of superiority, as we talk about uplifting the ignorant masses by spreading our doctrine of rationality. As far as warning signs go, we've got it covered...

Specialized terminology is really irritating to me personally, and, I would think, off-putting to most new visitors. If you talk to any Objectivists or other cliques with their own internal vocabulary, it can be very bothersome. It also creates a sense that the group is insulated from the rest of the world, which adds to the perception of cultishness.

8[anonymous]
I think the phrase 'raising the sanity waterline' is a problem. As is the vaguely religious language, like 'litany of Tarski'. I looked up the definition of 'litany' to make sure I was picking up on a religious denotation and not just a religious connotation, and here's what I got: [definition omitted]. Not a great word, I think. Also 'Bayesian Conspiracy.' There's no conspiracy, and there shouldn't be.

Agreed. I realize that words like "litany" and "conspiracy" are used semi-ironically, but a newcomer to the site might not realize that.

3William_Quixote
This wording may lose a few people, but it probably helps with many people as well. The core subject matter of rationality could very easily be dull or dry or "academic". The tongue-in-cheek and occasionally outright goofy humor makes the sequences a lot more fun to read. The tone may have costs, but not being funny has costs too. If you think back to college, more professors have students tune out by being boring than by being esoteric.
1Jiro
(Responding to old post.) One problem with such ironic usage is that people tend to joke about things that cause them stress, and that includes uncomfortable truths or things that are getting too close to the truth. It's why it actually makes sense to detain people making bomb jokes in airports. So just because the words are used ironically doesn't mean they can't reasonably be taken as signs of a cult--even by people who recognize that they are being used ironically. (Although this is somewhat mitigated by the fact that many cults won't allow jokes about themselves at all.)
2Eneasz
You'd have to be new to the entire internet to think those are being used seriously. And if you're THAT new, there's really very little that can be done to prevent misunderstanding no matter where you first land. On top of that, it's extremely unlikely that someone very new to the internet would start their journey at Less Wrong.
0Martin-2
Mr. Jesus H. Christ is a bad example. Also there's this.

There doesn't seem to be anyone arguing seriously that Less Wrong is a cult, but we do give some newcomers that impression.

The LW FAQ says:

Why do you all agree on so much? Am I joining a cult?

We have a general community policy of not pretending to be open-minded on long-settled issues for the sake of not offending people. If we spent our time debating the basics, we would never get to the advanced stuff at all. Yes, some of the results that fall out of these basics sound weird if you haven't seen the reasoning behind them, but there's nothing in the laws of physics that prevents reality from sounding weird.

I suspect that putting a more human face on the front page, rather than just linky text, would help.

Perhaps something like a YouTube video version of the FAQ, featuring two (ideally personable) people talking about what Less Wrong is and is not, and how to get started on it. For some people, seeing is believing. It is one thing to tell them there are lots of different posters here and we're not fanatics; but that doesn't have the same impact as watching an actual human with body cues talking.

I don't believe LW is a cult, but I can see where intelligent, critical-thinking people might get that impression. I also think that there may be elitist and clannish tendencies within LW that are detrimental in ways that could stand to be (regularly) examined. Vigilance against irrational bias is the whole point here, right? Shouldn't that be embraced on the group level as much as on an individual one?

Part of the problem as I see it is that LW can't decide if it's a philosophy/science or a cultural movement.

For instance, as already mentioned, there's a great deal of jargon, and there's a general attitude of impatience toward anyone not thoroughly versed in the established concepts and terminology. Philosophies and sciences also have this problem, but the widely accepted and respected philosophical and scientific theories have proven themselves to the world (and weren't taken very seriously until they did). I personally believe there's a lot of substance to the ideas here, but LW hasn't delivered anything dramatic to the world at large. Until it does, it may remain, in the eyes of outsiders, some kind of hybrid of Scientology and Objectivism - an insular group of people with a s... (read more)

In general, I think we could stand more community effort being put into optimizing our about page, which you can do now here.

Thank you for this.

(In light of my other comment, I should emphasize that I really mean that. It is not sarcasm or any other kind of irony.)

I have seen this problem afflict other intellectually-driven communities, and believe me, it is a very hard problem to shake. Be grateful we aren't getting media attention. The adage, "All press is good press", has definitely been proven wrong.

0John_Maxwell
I assume that my post has aggravated things? :o(

The word "cult" never makes discussions like these easier. When people call LW cultish, they are mostly just expressing that they're creeped out by various aspects of the community - some perceived groupthink, say. Rather than trying to decide whether LW satisfies some normative definition of the word "cult," it may be more productive to simply inquire as to why these people are getting creeped out. (As other commenters have already been doing.)

0[anonymous]
This exactly. It's safe to assume that when most people say some organization strikes them as being cultish, they're not necessarily keeping a checklist.

We can fix this stuff easily, since they're both wiki pages, but I thought they were examples worth discussing.

Somebody please do so. Those examples are just obviously bad.

[-]Shmi70

I got a distinct cultish vibe when I joined, but only from the far-out parts of the site, like UFAI, not from the "modern rationality" discussions. When I raised the issue on #lesswrong, the reaction from most regulars was not very reassuring: somewhat negative and more emotional than rational. The same happened when I commented here. That's why I am looking forward to the separate rationality site, without EY's added idiosyncrasies, which are untestable and useless to me, such as the singularity, UFAI, and MWI.

3jacoblyles
We should try to pick up "moreright.com" from whoever owns it. It's domain-parked at the moment.
5arborealhominid
Moreright.net already exists, and it's a "Bayesian reactionary" blog - that is, a blog for far-rightists who are involved in the Less Wrong community. It's an interesting site, but it strikes me as decidedly unhelpful when it comes to looking uncultish.

As I see it, Cult = Clique + Weird Ideas.

I think the weird ideas are an integral part of LessWrong, and any attempt to disguise them with a fluffy introduction would be counterproductive.

What about Cliquishness? I think that the problem here is that any internet forum tends to become a clique. To take part you need to read through lots of posts, so it requires quite a commitment. Then there is always some indication of your status within the group - Karma score in this case.

My advice would be to link to some non-internet things. Why not have the FHI news feed and links to a few relevant books on Amazon in the column on the right?

"Twelve Virtues of Rationality" has always seemed really culty to me. I've never read it, which may be part of the reason. It just comes across as telling people exactly how they should be, and what they should value.

Also, I've never liked that quote about the Sequences. I agree with it; what I've read of the Sequences (and it would be wrong not to count HPMOR in this) is by far the most important work I've ever read. But that doesn't mean that's what we should advertise to people.

7Grognor
All you are saying here is "The title of the Twelve Virtues makes me feel bad." That is literally all you are saying, since you admit to not having read it. I quote: [quotation omitted]. I'll tell you one thing: it got my attention. It got me interested in rationality. I've shown it to others; they all liked it or were indifferent. If you're going to say "culty" because of the title, you are both missing the (most important) point and failing to judge based on anything reasonable. And I don't particularly care if LW appeals to people who don't even try to be reasonable.
4[anonymous]
That's still a useful data point. Do we want to scare away people with strong weirdness filters?

Do we want to scare away people with strong weirdness filters?

The answer to this may very well turn out to be yes.

3NancyLebovitz
What proportion of top people at SIAI love sf? It's at least plausible that strong weirdness filters interfere with creativity. On the other hand, weirdness is hard to define -- sf is a rather common sort of weirdness these days.
1John_Maxwell
There is no reason to turn them off right away. The blog itself is weird enough. Maybe they will be acclimated, which would be good.
1John_Maxwell
I almost forgot this, but I was pretty put off by the 12 virtues as well when I first came across it on reddit at age 14 or so. My reaction was something like "you're telling me I should be curious? What if I don't want to be curious, especially about random stuff like Barbie dolls or stamp collecting?" I think I might have almost sent Eliezer an e-mail about it. When you put this together with what Eliezer called the bizarre "can't get crap done" phenomenon that afflicts large fractions of our community, which he attributes to feelings of low status, this paints a picture of LW putting off the sort of person who is inclined to feel high status (and is therefore good at getting crap done, but doesn't like being told what to do). This may be unrelated to the cult issue. Of course, these hypothetical individuals who are inclined to feel high status might not like being told how to think better either... which could mean that Less Wrong is not their cup of tea under any circumstances. But I think it makes sense to shift away from didacticism on the margin.

Ironically, I suspect the "cultlike" problem is that LessWrong/SI's key claims lack falsifiability.

Friendly AI? In the far future.

Self-improvement? Any mental self-improvement program is suspected of being a cult, unless it trains a skill outsiders are confident they can measure.

If I have a program for teaching people math, outsiders feel they know how they can check my claims - either my graduates are good at math or not.

But if I have a program for "putting you in touch with your inner goddess", how are people going to check my claims? For all... (read more)

2roryokane
PredictionBook might help with measuring improvement, in a limited way. You can use it to measure how often your predictions are correct, and whether you are getting better over time. And you could theoretically ask LW-ers and non-LW-ers to make some predictions on PredictionBook, and then compare their accuracy to see if Less Wrong helped. Making accurate predictions of likelihood is a real skill that can certainly be very useful – though it depends on what you're predicting.
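A minimal sketch of how such a comparison could be scored, assuming each prediction is recorded as a stated probability plus a yes/no outcome (the group labels and numbers below are hypothetical, just to show the shape of the comparison):

```python
def brier_score(records):
    """Mean squared error between stated probabilities and 0/1 outcomes.
    Lower is better; always saying 50% scores 0.25."""
    return sum((p - outcome) ** 2 for p, outcome in records) / len(records)

# Hypothetical prediction records: (stated probability, outcome as 1 or 0).
lw_readers = [(0.9, 1), (0.7, 1), (0.6, 0), (0.8, 1)]
non_readers = [(0.9, 0), (0.7, 1), (0.6, 1), (0.8, 0)]

print("LW readers:", brier_score(lw_readers))
print("Non-readers:", brier_score(non_readers))
```

As the comment notes, what this measures depends heavily on which questions people choose to predict, so it is at best a rough proxy for "did Less Wrong help".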

you'll find that people are searching for "less wrong cult" and "singularity institute cult" with some frequency.

Maybe a substantial number of people are searching for the posts about cultishness.

So I shouldn't refer people to death spirals and baby-eating right away?

3antigonus
Don't mindkill their cached thoughts.
2Kaj_Sotala
Offer them a hamster in a tutu first, that'll be cute and put them at ease.

I think the biggest reason Less Wrong seems like a cult is that there's very little self-skepticism; people seem remarkably confident that their idiosyncratic views must be correct (if the rest of the world disagrees, that's just because they're all dumb). There's very little attempt to provide any "outside" evidence that this confidence is correctly placed (e.g. by subjecting these idiosyncratic views to serious falsification tests).

Instead, when someone points this out, Eliezer fumes "do you know what pluralistic ignorance is, and Asch... (read more)

3John_Maxwell
Offhand, can you think of a specific test that you think ought to be applied to a specific idiosyncratic view?

My read on your comment is: LWers don't act humble, therefore they are crackpots. I agree that LWers don't always act humble. I think it'd be a good idea for them to be more humble. I disagree that lack of humility implies crackpottery. In my mind, crackpottery is a function of your reasoning, not your mannerisms. Your comment is a bit short on specific failures of reasoning you see--instead, you're mostly speaking in broad generalizations. It's fine to have general impressions, but I'd love to see a specific failure of reasoning you see that isn't of the form "LWers act too confident". For example, a specific proposition that LWers are too confident in, along with a detailed argument for why. Or a substantive argument for why SI's approach to AI is "extremely dangerous". (I personally know pretty much everyone who works for SI, and I think there's a solid chance that they'll change their approach if your argument is good enough. So it might not be a complete waste of time.)

Now it sounds like you're deliberately trying to be inflammatory ಠ_ಠ
4aaronsw
Well, for example, if EY is so confident that he's proven "MWI is obviously true - a proposition far simpler than the argument for supporting SIAI", he should try presenting his argument to some skeptical physicists. Instead, it appears the physicists who have happened to run across his argument found it severely flawed. How rational is it to think that you've found a proof that most physicists are wrong and then never run it by any physicists to see if you're right? I do not believe that. As for why SI's approach is dangerous, I think Holden put it well in the most upvoted post on the site. I'm not trying to be inflammatory; I just find it striking.

it appears the physicists who have happened to run across his argument found it severely flawed

The criticisms at those links have nothing to do with the argument for MWI. They are just about a numerical mistake in an article illustrating how QM works.

The actual argument for MWI that is presented is something like this: Physicists believe that the wavefunction is real and that it collapses on observation, because that is the first model that explained all the data, and science holds onto working models until they are falsified. But we can also explain all the data by saying that the wavefunction is real and doesn't collapse, if we learn to see the wavefunction as containing multiple worlds that are equally real. The wavefunction doesn't collapse; it just naturally spreads out into separate parts, and what we see is one of those separate parts. A no-collapse theory is simpler than a collapse theory because it has one fewer postulate, so even though there are no new predictions, by Bayes (or is it Occam?) we can favor the no-collapse theory over the collapse theory. Therefore, there are many worlds.

This is informal reasoning about which qualitative picture of the world to favor, so i... (read more)
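As an aside for readers unfamiliar with that style of argument, here is a minimal sketch of the Bayesian bookkeeping it appeals to, with made-up prior weights standing in for a simplicity prior; since both theories are taken to predict the existing data equally well, the likelihood ratio is 1 and the prior does all the work:

```python
# Posterior odds = prior odds * likelihood ratio (Bayes' rule in odds form).
# The prior weights here are hypothetical, standing in for a simplicity prior
# that penalizes the extra collapse postulate.
prior_no_collapse = 0.6   # assumed prior weight for the simpler, no-collapse theory
prior_collapse = 0.4      # assumed prior weight for the theory with one more postulate

likelihood_ratio = 1.0    # both theories explain the existing data equally well

posterior_odds = (prior_no_collapse / prior_collapse) * likelihood_ratio
print(f"posterior odds, no-collapse : collapse = {posterior_odds:.2f} : 1")
```

The force of the argument therefore rests entirely on how much weight the simplicity prior deserves, which is exactly what the rest of this thread disputes.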

9Eliezer Yudkowsky
BTW, it's important to note that by some polls an actual majority of theoretical physicists now believe in MWI, and this was true well before I wrote anything. My only contributions are in explaining the state of the issue to nonphysicists (I am a good explainer), formalizing the gross probability-theoretic errors of some critiques of MWI (I am a domain expert at that part), and stripping off a lot of soft understatement that many physicists have to do for fear of offending sillier colleagues (i.e., they know how incredibly stupid the Copenhagen interpretation appears nowadays, but will incur professional costs from saying it out loud with corresponding force, because there are many senior physicists who grew up believing it). The idea that Eliezer Yudkowsky made up the MWI as his personal crackpot interpretation isn't just a straw version of LW, it's disrespectful to Everett, DeWitt, and the other inventors of MWI. It does seem to be a common straw version of LW for all that, presumably because it's spontaneously reinvented any time somebody hears that MWI is popular on LW and they have no idea that MWI is also believed by a plurality and possibly a majority of theoretical physicists and that the Quantum Physics Sequence is just trying to explain why to nonphysicists / formalize the arguments in probability-theoretic terms to show their nonambiguity.

by some polls

The original source for that "58%" poll is Tipler's The Physics of Immortality, where it's cited (chapter V, note 6) as "Raub 1991 (unpublished)". (I know nothing about the pollster, L. David Raub, except that he corresponded with Everett in 1980.) Tipler says that Feynman, Hawking, and Gell-Mann answered "Yes, I think the MWI is true", and he lists Weinberg as another believer. But Gell-Mann's latest paper is a one-history paper, Weinberg's latest paper is about objective collapse, and Feynman somehow never managed to go on record anywhere else about his belief in MWI.

2[anonymous]
I trust Tipler as far as I can throw his book. (It's a large book, and I'm not very strong.)
9aaronsw
Has anyone seriously suggested you invented MWI? That possibility never even occurred to me.
9Eliezer Yudkowsky
It's been suggested that I'm the one who invented the idea that it's obviously true rather than just one more random interpretation; or even that I'm fighting a private war for some science-fiction concept, rather than being one infantry soldier in a long and distinguished battle of physicists. Certainly your remark to the effect that "he should try presenting his argument to some skeptical physicists" sounds like this. Any physicist paying serious attention to this issue (most people aren't paying attention to most things most of the time) will have already heard many of the arguments, and not from me. It sounds like we have very different concepts of the state of play.
0Shmi
Can't help but compare this to the Swiftian battle of big-endians and little-endians, only the interpretational war makes even less sense.
6Quantumental
I just can't ignore this. If you take a minute to actually look at the talk section of that Wikipedia page, you will see those polls being torn to pieces. David Deutsch himself has stated that less than 10% of the people doing quantum fundamentals believe in MWI, and within that minority there are a lot of diverging views. So this is still not by any means a "majority interpretation". As Mitchell_Porter has pointed out, Gell-Mann certainly does not believe in MWI. Nor does Steven Weinberg; he denounced his 'faith' in it in a paper last year. Feynman certainly never talked about it, which to me is more than enough indication that he did not endorse it. Hawking is a bit harder; he is on record seemingly being both pro and con, so I guess he is a fence-sitter.

More important is the fact that none of the proponents agree on which MWI they support. (This includes you, Eliezer.) Zurek is another fence-sitter, partly pro some sort of MWI, partly pro It-from-Bit. His way of getting the Born Rule in MWI is also quite different: from what I understand, only the worlds that are "persistent" are actualized. This reminds me of Robin Hanson's mangled worlds, where only some worlds are real and the rest get "cancelled" out somehow. Yet they are completely different ways of looking at MWI. Then you have David Deutsch's fungible worlds, which are slightly different from David Wallace's worlds. Tegmark has his own views, etc. There seems to be no single MWI, and there has been no answer to the Born Rule. So I want to know: why do you keep talking about it as if it were a slam dunk?
3Peterdjones
Good question.
5fezziwig
I think your use of "believe in" is a little suspect here. I'm willing to believe that more than half of all theoretical physicists believe some variant of the MWI is basically right (though the poll can't have been that recent if Feynman was part of it, alas), but that's different from the claim that there are no non-MWI interpretations worth considering, which is something a lot of people, including me, seem to be taking from the QP sequence. Do you believe that that's a majority view, or anything close to one? My impression is that that view is very uncommon, not just in public but in private too...at least outside of Less Wrong.
-1Eliezer Yudkowsky
That sounds correct to me. A physicist who also possesses probability-theory expertise and who can reason with respect to Solomonoff Induction and formal causal models should realize that single-world variants of MWI are uniformly unworkable (short of this world being a runtime-limited computer simulation); but such is rare (though not unheard-of) among professional physicists; and among the others, you can hardly blame them for trying to keep an open mind.
4Shmi
Penrose's objective collapse theory, which says that the entanglement scale is limited by gravity, with the result that macroscopic objects remain essentially classical, does not look all that unworkable.
5Eliezer Yudkowsky
It'd still be the only FTL discontinuous non-differentiable non-CPT-symmetric non-unitary non-local-in-the-configuration-space etc. etc. process in all of physics, to explain a phenomenon (why do we see only one outcome?) that doesn't need explaining.
2Shmi
Well, one advantage of it is that it is testable, and so is not a mere interpretation, which holds a certain amount of appeal to the more old-fashioned of us.
5Eliezer Yudkowsky
I agree, and I myself was, and am still, sentimentally fond of Penrose for this reason, and I would cheer on any agency that funded a test. However and nonetheless, "testable" is not actually the same as "plausible", scientifically virtuous as it may be.
-4V_V
Not if it doesn't allow FTL communication, unless you want to argue that quantum entanglement is an FTL phenomenon, but that wouldn't be an issue of the particular interpretation.

Not necessarily. Irreversible and stochastic quantum processes can be time-continuous and time-differentiable. Consider the processes described by the Lindblad equation, for instance.

CPT symmetry is a property of conventional field theories; not all quantum theories necessarily have it, and IIUC there are ongoing experiments to search for violations. CPT symmetry is just the last of a series of postulated symmetries, and the previous ones (C symmetry, P symmetry, T symmetry and CP symmetry) have been experimentally falsified.

Right, and that's the point of objective collapse theories.

I'm not sure what you mean by that, but locality in physics is defined with respect to space and time, not to arbitrary configuration spaces.

AFAIK, there have been attempts to derive the Born rule in Everett's interpretation, but they didn't lead to uncontroversial results.
1Luke_A_Somers
I have never seen a proposed mechanism of ontological collapse that actually fits this, though. The inability to send the signal you want, getting instead a Born-rule-based purely random signal, doesn't change the fact that this Born-rule-based purely random signal is, under ontological collapse, distributed FTL.
-1V_V
AFAIK, Penrose's interpretation doesn't describe the details of the collapse process; it just says that above about the "one graviton" level of energy separation, collapse will occur. It doesn't commit to collapse being instantaneous: it could be that the state evolution is governed by a non-linear law that approximates the linear Schrödinger equation very well in the "sub-graviton" regime and has a sharp but still differentiable phase transition when approaching the "super-graviton" regime. The GRW interpretation assumes instantaneous collapse, IIUC, but it would be a trivial modification to have fast, differentiable collapse. My point is that non-differentiable collapse is not a requirement of objective collapse interpretations.

But that's an issue of QM, irrespective of the particular interpretation. Indeed, the "spooky action at a distance" bugged Einstein and many people of his time, but the modern view is that as long as you don't have causal influences (that is, information transmission) propagating FTL, you don't violate special relativity.
1Luke_A_Somers
No, it isn't. QM is purely causal and relativistic. You can look into the equations and prove that nothing FTL is in there. The closest you get is accounting for the possibility of a vacuum bubble having appeared near a particle with exactly its energy, with the antimatter part of the bubble then cancelling against the particle. And that isn't much like FTL. When you do an EPR experiment, the appearance of FTL communication arises from the assumption that the knowledge you gain about what you'll see if you go check the other branch of the experiment is something that happens at the other end of the experiment, rather than locally, with the information propagating to the other end of the experiment as you go to check. The existence of nonlocal states does not imply nonlocal communication.
0V_V
I'm not sure what we are disagreeing about. My point is that objective collapse is FTL only in the same sense that QM is. That is, if QM isn't FTL, then collapse isn't.
0V_V
I'm puzzled. What does Solomonoff Induction have to say about experimentally indistinguishable (as far as we can practically test, at least) interpretations of the same theory?
-2Peterdjones
But there is a case to be made for relational QM as superior to both MWI and collapse interpretations. I have mentioned it several times. I am still waiting to hear back.
0Mitchell_Porter
Relational QM is gibberish. Whether the cat is dead or alive is "relative to the observer". How could that make sense except via many worlds?
0Peterdjones
It makes sense the way rQM says: there is no non-relational state, so there is no answer to "is the cat dead or alive (absent an observer)". Since rQM says there is no such state, you don't disprove it by insisting that there is. BTW, there is no simultaneity either.
0Mitchell_Porter
OK, so suppose we have an observer. Now look at the cat. Is it alive or dead? If it is alive and only alive, well, we can affix the phrase "relative to the observer" but it doesn't diminish the absoluteness of the cat's being alive. But if the cat is alive "relative to one observer to which it is alive", and dead "relative to another observer to which it is dead", how can we possibly make sense of that except in many-worlds fashion, by saying there are two cats and two observers?
0Peterdjones
If two observers measure a cat, they will get compatible results. However, one observer can have less complete information ("the cat collapsed") and another more complete information ("the cat is uncollapsed"). Observers can disagree about "collapse" because that is just an issue of their information, not an objective property. From the Wikipedia article on the relational interpretation: "The relational interpretation makes no fundamental distinction between the human experimenter, the cat, or the apparatus, or between animate and inanimate systems; all are quantum systems governed by the same rules of wavefunction evolution, and all may be considered "observers." But the relational interpretation allows that different observers can give different accounts of the same series of events, depending on the information they have about the system. The cat can be considered an observer of the apparatus; meanwhile, the experimenter can be considered another observer of the system in the box (the cat plus the apparatus). Before the box is opened, the cat, by nature of it being alive or dead, has information about the state of the apparatus (the atom has either decayed or not decayed); but the experimenter does not have information about the state of the box contents. In this way, the two observers simultaneously have different accounts of the situation: To the cat, the wavefunction of the apparatus has appeared to "collapse"; to the experimenter, the contents of the box appear to be in superposition. Not until the box is opened, and both observers have the same information about what happened, do both system states appear to "collapse" into the same definite result, a cat that is either alive or dead."
2Mitchell_Porter
In the interpretation of QM, one of the divides is between ontic and epistemic interpretations of the wavefunction. Ontic interpretations of the wavefunction treat it as a thing; epistemic interpretations treat it as an incomplete description or a tabulation of uncertainty, just like a probability distribution. In the relational interpretation of QM, are the states understood as ontic or as epistemic?

The passage you quote makes them sound epistemic: the cat knows but the observer outside the box doesn't, so the observer outside the box uses a different wavefunction. That undoubtedly implies that the wavefunction of the observer outside the box is epistemic, not ontic; the cat knows something that the outside observer doesn't, an aspect of reality which is already definite even though it is not definite in the outside observer's description. Or at least, this ought to imply that quantum states in the relational interpretation are epistemic. However, this is never explicitly stated, and instead meaningless locutions are adopted which make it sound as if the quantum states are to be regarded as ontic, but "relative".

There are certain very limited senses in which it makes sense to say that the state of something is relative. For example, we may be floating in space, and what is up to you may be down to me, so whether one object is above another object may be relative to an observer. But clearly such a dodge will not work for something like Schrodinger's cat. Either the cat is alive, dead, both, or neither. It can't be "alive for one observer and dead for another" and still be just one cat. But that is the ontological implication one gets if "relational QM" is interpreted as an ontic interpretation. On the other hand, if it is an epistemic interpretation, then it still hasn't answered the question: what is the nature of reality? What is the physical ontology behind the formalism and the instrumental success?
0Peterdjones
It can't in rQM: "However, the comparison does not lead to contradiction because the comparison is itself a physical process that must be understood in the context of quantum mechanics. Indeed, O′ can physically interact with the electron and then with the l.e.d. (or, equivalently, the other way around). If, for instance, he finds the spin of the electron up, quantum mechanics predicts that he will then consistently find the l.e.d. on (because in the first measurement the state of the composite system collapses on its [spin up/l.e.d. on] component). That is, the multiplicity of accounts leads to no contradiction precisely because the comparison between different accounts can only be a physical quantum interaction. This internal self-consistency of the quantum formalism is general, and it is perhaps its most remarkable aspect. This self consistency is taken in relational quantum mechanics as a strong indication of the relational nature of the world"--SEP
9Eliezer Yudkowsky
There were plenty of physicists reading those posts when they first came out on OB (the most famous name being Scott Aaronson). Some later readers have indeed asserted that there's a problem involving a physically wrong factor of i in the first couple of posts (i.e. that's allegedly not what a half-silvered mirror does to the phase in real life), which I haven't yet corrected because I would need to verify with a trusted physicist that this was correct, and then possibly craft new illustrations instead of using the ones I found online, and this would take up too much time relative to the point that talking about a phase change of -1 instead of i so as to be faithful to real-world mirrors is an essentially trivial quibble which has no effect on any larger points. If anyone else wants to rejigger the illustration or the explanation so that it flows correctly, and get Scott Aaronson or another known trusted physicist to verify it, I'll be happy to accept the correction. Aside from that, real physicists haven't objected to any of the math, which I'm actually pretty darned proud of considering that I am not a physicist.

There were plenty of physicists reading those posts when they first came out on OB (the most famous name being Scott Aaronson)

As Scott keeps saying, he's not a physicist! He's a theoretical computer scientist with a focus on quantum computing. He clearly has very relevant expertise, but you should get his field right.

3Quantumental
I still wonder why you haven't written an update in four years regarding this topic, especially in regard to the Born Rule probability not having a solution yet, plus the other problems. You also have the issue of overlapping vs. non-overlapping worlds, which again is a relevant issue in the Many Worlds interpretation. Overlap = the typical one world branching into two worlds. Non-overlap = two identical worlds diverging (Saunders 2010, Wilson 2005-present). I also feel the QM sequence is a bit incomplete when it does not give any thought to things like Gerard 't Hooft's proposal of a local deterministic reality giving rise to quantum mechanics from a cellular automaton at the Planck scale. It's misleading to say MWI is "a slam dunk" winner when there are so many unanswered questions. Mitchell Porter is one of the few people here who seem to have had a deep understanding of the subject before reading your sequence, so he has raised some interesting points...
0John_Maxwell
I agree that EY is probably overconfident in MWI, although I'm uninformed about QM so I can't say much with confidence. I don't think it's accurate to damn all of Less Wrong because of this. For example, this post questioning the sequence was voted up highly. I don't think EY claims to have any original insights pointing to MWI. I think he's just claiming that the state of the evidence in physics is such that MWI is obviously correct, and that this is evidence of the irrationality of physicists. I'm not too sure about this myself. Well, there have been responses to that point (here's one). I wish you'd be a bit more self-skeptical and actually engage with that (ongoing) debate instead of summarizing your view on it and dismissing LW because it largely disagrees with your view.
8aaronsw
It seems a bit bizarre to say I've dismissed LessWrong given how much time I've spent here lately.
0John_Maxwell
Fair enough.
-1MugaSofer
Your examples seem ... how do I put this ... unreliable. The first two are less examples and more insults, since you do not provide any actual examples of these tendencies; the last one would be more serious, if he hadn't written extensively on why he believes this to be the safest way - the only way that isn't suicidal - or if you had provided some evidence that his FAI proposals are "extremely dangerous". And, of course, airily proclaiming that this is true of "pretty much every entry in the sequences" seems, in the context of these examples, like an overgeneralization at best and ... well, I'm not going to bother outlining the worst possible interpretation for obvious reasons.

It's like Greeks trying to do physics by pure reasoning. They got atoms right because of salt crystallizing,

Obviously, observing salt is not pure reasoning. Very little philosophy is pure reasoning; the salient distinction is between informal, everyday observation and deliberately arranged experiments.

It's a rather unavoidable side effect of claiming that you know the optimal way to fulfill one's utility function, especially if that claim sounds highly unusual (unite the human species in making a friendly AI that will create Utopia). There are many groups that make such claims, and either one or none of them can be right. Most people (who haven't already bought into a different philosophy of life) think it's the latter, and thus tend not to take someone seriously when they make extraordinary claims.

Until recognition of the Singularity's imminence and ne... (read more)

3Voltairina
Is the goal of the community really to get everyone into the one task of creating FAI? I'm kind of new here, but I'm personally interested in a less direct but maybe more certain (I don't know the hard numbers, but I feel it's synergistic) goal of achieving a stable post-scarcity economy, which could free up a lot more people to become hackers/makers of technology and participate in the collective commons. But I'm interested in FAI and particularly machine ethics, and I hang out here because of the rationality and self-improvement angles. In fact I got into my current academic track (embedded systems) because I'm interested in robotics and embodied intelligence, and probably got started reading Hofstadter stuff and trying to puzzle out how minds work. "Come for the rationality... stay for the friendly AI" maybe?
6Grognor
Please don't talk about 'the' goal of the community as if there's only one. There are many.
0Voltairina
That's what I was wondering; thank you for providing the link to that post. I wasn't sure how to read Locke's statement.

There doesn't seem to be anyone arguing seriously that Less Wrong is a cult

Well, there are the folks at RationalWiki.

Long-time lurker here. I think LW is not capable enough as a social unit to handle its topic, and I currently think that participating in LW is not an efficient way to advance its goals.

In order to reach a (hostile) audience one needs to speak the language. However, ambient ways of carrying out discussion often intermingle status / identity / politics with epistemology. In order to forward the position that biased / faith-based / economic thinking is not an epistemologically efficient tool, one needs to take at least the initial steps in this twisted-up "... (read more)

Do you know anyone who might fall into this category, i.e. someone who was exposed to Less Wrong but failed to become an enthusiast, potentially due to atmosphere issues?

Yes. I know a couple of people with whom I share an interest in Artificial Intelligence (this is my primary focus in loading Less Wrong web pages) who communicated to me that they did not like the site's atmosphere. Atmosphere is not exactly the word they used. One person thought cryonics was a deal-breaker. (If you read the piece in the New York Times Sunday Magazine about Robin Hanso... (read more)

What paper or text should I read to convince me y'all want to get to know reality? That's a sincere question, but I don't know how to say more without being rude (which I know you don't mind).

Put another way: What do you think Harry Potter (of HPMOR) would think of the publications from the Singularity Institute? (I mean, I have my answer).

I got a possible proto-small-religion feeling from SL4 discussions with Eliezer and SI folk back in the day. Any cultishness feeling was with a small c, that is, not harmful to the participants except for their bank balance, as in the use of the word cult in "cult following". There isn't a good word for this type of organization, which is why it gets lumped in with Cults.

Less Wrong is better than SL4 for this feeling, anyway.

Well, it's nice to know that at least you guys see it. Yes, that was one of my reactions. I started reading some of the sequences (which really aren't pitched at a level that the general public, or, I'd hazard to say, though not with certainty, even people whose IQs don't fall one standard deviation or more above the mean, can easily understand). I liked them, though I didn't fully understand them, and have referred people to them. However, at the time I was looking into a job and did some kind of search through the website. Anyway, I encountered a post with a perso... (read more)

The c-word is too strong for what LW actually is. But "rational" is not a complete descriptor either.

It is neither rational nor irrational to embrace cryonics. It may be rational to conclude that someone who wants to live forever and believes body death is the end of his life will embrace cryonics and life extension technologies.

It is neither rational nor irrational to vaunt current human values over any others. It is most likely that current human values are a snapshot in the evolution of humans, and as such are an approximate optimum in a... (read more)

-1John_Maxwell
Dunno bout you, but I value my values.
2mwengler
I think I have the same emotional response to "wrong" things as most people. The knowledge that this is bred into me by natural selection sorta takes the wind out of my rationalizations of these feelings in two ways. 1) Although they "feel" like right and wrong, I realize they are just hacks done by evolution. 2) If evolution has seen fit to hack our values in the past to keep us outsurviving others, then it stands to reason that the "extrapolated" values of humanity are DIFFERENT from the "evolved" values of humanity. So no matter how Coherent our Extrapolation of Values will be, it will actually subvert whatever evolution might do to our race. So once we have an FAI with great power and a sense of CEV, we stop evolving. Then we spend the rest of eternity relatively poorly adapted for the environment we are in, with the FAI scrambling to make it all right for us. Sounds like the cluster version of wireheading, in a way. On the other hand, I suppose I value the modifications that occur to us through evolution and natural selection. Presumably an attempt at CEV would build that in, and perhaps the FAI would decide to leave us alone. Don't we keep reading sci-fi where that happens?

Edit: Never mind, my point was poorly thought out and hastily formulated.

[-]Eneasz-10

This comment will be heavy with jargon, to convey complex ideas with the minimum required words. That is what jargon is for, after all. The post's long enough even with this shortening.

Less Wrong inspires a feeling of wonder.

To see humans working seriously to advance the robot rebellion is inspiring. To become better, overcome the programming laid in by Azathoth and actually improve our future.

The audacity to challenge death itself, to reach for the stars, is breathtaking. The piercing insight in many of the works here is startling. And the gift of being a... (read more)

2Jakeness
I think this post can be modified, without much effort, to defend any pseudo-cult, or even a cheesy movie.