
Cult impressions of Less Wrong/Singularity Institute

28 Post author: John_Maxwell_IV 15 March 2012 12:41AM
If you type "less wrong c" or "singularity institute c" into Google, you'll find that people are searching for "less wrong cult" and "singularity institute cult" with some frequency. (EDIT: Please avoid testing this out, so Google doesn't autocomplete your search and reinforce their positions. This kind of problem can be hard to get rid of. Click these instead: less wrong cult, singularity institute cult.)
There doesn't seem to be anyone arguing seriously that Less Wrong is a cult, but we do give some newcomers that impression.

I have several questions related to this:

  • Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?
  • If so, can you suggest any easy steps we could take to fix this?
  • Is it possible that there are aspects of the atmosphere here that are driving away intelligent, rationally inclined people who might otherwise be interested in Less Wrong?
  • Do you know anyone who might fall into this category, i.e. someone who was exposed to Less Wrong but failed to become an enthusiast, potentially due to atmosphere issues?
  • Is it possible that our culture might be different if these folks were hanging around and contributing? Presumably they are disproportionately represented among certain personality types.

If you visit any Less Wrong page for the first time in a cookies-free browsing mode, you'll see this message for new users:

Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

Here are the worst violators I see on that about page:

Some people consider the Sequences the most important work they have ever read.

Generally, if your comment or post is on-topic, thoughtful, and shows that you're familiar with the Sequences, your comment or post will be upvoted.

Many of us believe in the importance of developing qualities described in Twelve Virtues of Rationality: [insert mystical sounding description of how to be rational here]

And on the sequences page:

If you don't read the sequences on Mysterious Answers to Mysterious Questions and Reductionism, little else on Less Wrong will make much sense.

This seems obviously false to me.

These may not seem like cultish statements to you, but keep in mind that you are one of the ones who decided to stick around. The typical mind fallacy may be at work. Clearly there is some population that thinks Less Wrong seems cultish, as evidenced by Google's autocomplete, and these look like good candidates for things that make them think this.

We can fix this stuff easily, since they're both wiki pages, but I thought they were examples worth discussing.

In general, I think we could stand more community effort being put into improving our about page, which you can do now here. It's not that visible to veteran users, but it is very visible to newcomers. Note that it looks as though you'll have to click the little "Force reload from wiki" button on the about page itself for your changes to be published.

Comments (247)

Comment author: CarlShulman 15 March 2012 01:15:27AM *  26 points [-]

The top autocompletes for "Less Wrong" are

  • sequences
  • harry potter
  • meetups

These are my (logged-in) Google autocomplete results for "Less Wrong" followed by each letter of the alphabet (some duplicates appear):

  • akrasia
  • amanda knox
  • atheism
  • australia
  • blog
  • bayes
  • basilisk
  • bayes theorem
  • cryonics
  • charity
  • cult
  • discussion
  • definition
  • decoherence
  • decision theory
  • epub
  • evolutionary psychology
  • eliezer yudkowsky
  • evidence
  • free will
  • fanfiction
  • fanfic
  • fiction
  • gender
  • games
  • goals
  • growing up is hard
  • harry potter
  • harry potter and the methods of rationality
  • how to be happy
  • hindsight bias
  • irc
  • inferential distance
  • iq
  • illusion of transparency
  • joint configurations
  • joy in the merely real
  • kindle
  • amanda knox
  • lyrics
  • luminosity
  • lost purposes
  • leave a line of retreat
  • meetup
  • mobi
  • meditation
  • methods of rationality
  • newcomb's problem
  • nyc
  • nootropics
  • neural categories
  • optimal employment
  • overcoming bias
  • open thread
  • outside the laboratory
  • procrastination
  • pdf
  • polyamory
  • podcast
  • quantum physics
  • quotes
  • quantum mechanics
  • rationality quotes
  • rationality quotes
  • rationalwiki
  • reading list
  • rationality
  • sequences
  • survey
  • survey results
  • sequences pdf
  • twitter
  • textbooks
  • three worlds collide
  • toronto
  • ugh fields
  • universal fire
  • value is fragile
  • village idiot
  • wiki
  • wikipedia
  • words
  • what is evidence
  • yudkowsky
  • yvain
  • your strength as a rationalist
  • your rationality is my business
  • zombies
  • zombies the movie

The autocomplete bit doesn't seem to be too big a problem for Less Wrong.

However, it is one of the immediate autocompletes for "Singularity Institute." What pages do you get on the first page of results if you search "singularity institute cult"? I see the wikipedia page, the SI website, Michael Anissimov's blog, RationalWiki, Less Wrong posts about cultishness and death spirals, Lukeprog's blog, a Forbes article mention of "cargo-cult enthusiasm," and at the bottom a blog post making a case against SI and other transhumanist organizations.

Comment author: timtyler 15 March 2012 11:16:11AM 0 points [-]

Luke's link to How Cults work is pretty funny.

Comment author: JoshuaFox 15 March 2012 10:12:30AM *  23 points [-]

Eliezer addressed this in part with his "Death Spiral" essay, but there are some features of LW/SI that are strongly correlated with cultishness, other than the ones that Eliezer mentioned, such as fanaticism and following the leader:

  • Having a house where core members live together.
  • Asking followers to completely adjust their thinking processes to include new essential concepts, terminologies, and so on, down to the lowest level of understanding reality.
  • Claiming that only if you carry out said mental adjustment can you really understand the most important parts of the organization's philosophy.
  • Asking for money for a charity, particularly one which does not quite have the conventional goals of a charity, and claiming that one should really be donating a much larger percentage of one's income than most people donate to charity.
  • Presenting an apocalyptic scenario including extreme bad and good possibilities, and claiming to be the best positioned to deal with it.
  • [Added] Demanding that you leave any (other) religion.

Sorry if this seems over-the-top. I support SI. These points have been mentioned, but has anyone suggested how to deal with them? Simply ignoring the problem does not seem to be the solution; nor does loudly denying the charges; nor changing one's approach just for appearances.

Comment author: timtyler 15 March 2012 11:25:32AM *  15 points [-]

Perhaps consider adding to the list the high fraction of revenue that ultimately goes to paying staff wages.

Oh yes, and the fact that the leader wants to SAVE THE WORLD.

Comment author: Bongo 15 March 2012 10:00:32PM *  5 points [-]

fraction of revenue that ultimately goes to paying staff wages

About a third in 2009, the last year for which we have handy data.

Comment author: timtyler 16 March 2012 12:49:18AM *  5 points [-]

Practically all of it goes to them or their "associates" - by my reckoning. In 2009 some was burned on travel expenses and accommodation, some was invested - and some was stolen.

Who was actually helped? Countless billions in the distant future - supposedly.

Comment author: dbaupp 16 March 2012 02:50:19AM *  5 points [-]

all of it goes to them or their "associates"

What else should it go to? (Under the assumption that SI's goals are positive.)

As Larks said above, they are doing thought work: they are not trying to ship vast quantities of food or medical supplies. The product of SI is the output from their researchers, the only way to get more output is to employ more people (modulo improving the output of the current researchers, but that is limited).

Comment author: timtyler 16 March 2012 11:19:01AM *  7 points [-]

So, to recap, this is a proposed part of a list of ways in which the SIAI resembles a cult. It redistributes economic resources from the "rank and file" members up the internal hierarchy without much expenditure on outsiders - just like many cults do.

Comment author: dbaupp 16 March 2012 01:12:02PM *  4 points [-]

(Eh. Yes, I think I lost track of that a bit.)

Keeping that in mind: SI has a problem, because acting to avoid the appearance of existing to give money to the upper ranks means that they can't pay their researchers. There are three broad classes of solutions to this (that I can see):

  • Give staff little to no compensation for their work
  • Use tricky tactics to try to conceal how much money goes to the staff
  • Try to explain to everyone why such a large proportion of the money goes to the staff

All of those seem suboptimal.

Comment author: epicureanideal 16 March 2012 02:38:33AM *  5 points [-]

Why was this downvoted instead of responded to? Downvoting people who are simply stating negative impressions of the group doesn't improve impressions of the group.

Comment author: JoshuaFox 15 March 2012 12:11:18PM *  3 points [-]

Most organizations spend most of their money on staff. What else could you do with it? Paying fellowships for "external staff" is a possibility. But in general, good people are exactly what you need.

Comment author: timtyler 15 March 2012 12:25:57PM *  2 points [-]

Often goods or needy beneficiaries are also involved. Charity expenses are sometimes classified into:

  • Program Expenses
  • Administrative Expenses
  • Fundraising Expenses

This can be used as a heuristic for identifying good charities.

Not enough in category 1 and too much in categories 2 and 3 is often a bad sign.
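The heuristic above can be made concrete with a small sketch. The figures and function name here are hypothetical, purely for illustration; they are not drawn from any real charity's books:

```python
# Charity expenses are often split into three categories:
# program, administrative, and fundraising. A common heuristic
# looks at what fraction of total spending goes to programs.

def program_expense_ratio(program, admin, fundraising):
    """Fraction of total spending that goes to program expenses."""
    total = program + admin + fundraising
    return program / total

# Hypothetical charity: $700k program, $200k admin, $100k fundraising.
r = program_expense_ratio(700_000, 200_000, 100_000)
print(round(r, 2))  # 0.7
```

A low ratio (most money in categories 2 and 3) is the "bad sign" the comment describes, though, as Larks notes below in the thread, the metric fits goods-distributing charities better than research organizations.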

Comment author: Larks 16 March 2012 01:07:52AM *  8 points [-]

But they're not buying malaria nets, they're doing thought-work. Do you expect to see an invoice for TDT?

Quite apart from the standard complaint about how awful a metric that is.

Comment author: epicureanideal 16 March 2012 02:37:34AM 2 points [-]

As I've discussed with several LWers in person, including some staff and visiting fellows, one of the things I disliked about LW/SIAI was that so much of the resources of the organization go to pay the staff. They seemingly wouldn't even consider proposals to spend a few hundred dollars on other things because they claimed it was "too expensive".

Comment author: MTGandP 08 September 2012 08:38:10PM 2 points [-]

One fundamental difference between LW and most cults is that LW tells you to question everything, even itself.

Comment author: gwern 08 September 2012 09:25:13PM 3 points [-]

Most, but not all. The Randians come to mind. Even the Buddha encouraged people to be critical, but that doesn't seem to have stopped the cults. I was floored to learn a few weeks ago that Buddhism has formalized even the point at which you stop doubting! When you stop doubting, you become a Sotāpanna; a Sotāpanna is marked by abandoning '3 fetters', the second fetter according to Wikipedia being

Skeptical Doubt - Doubt about the Buddha and his teaching is eradicated because the Sotāpanna personally experiences the true nature of reality through insight, and this insight confirms the accuracy of the Buddha’s teaching.

As well, as unquestioningness becomes a well known trait of cults, cults tend to try to hide it. Scientology hides the craziest dogmas until you're well and hooked, for example.

Comment author: MTGandP 08 September 2012 09:28:58PM 1 point [-]

So there comes a point in Buddhism where you're not supposed to be skeptical anymore. And Objectivists aren't supposed to question Ayn Rand.

Comment author: Mitchell_Porter 09 September 2012 12:42:18AM 1 point [-]

Would it be productive to be skeptical about whether your login really starts with the letter "M"? Taking an issue off the table and saying, we're done with that, is not in itself a bad sign. The only question is whether they really do know what they think they know.

I personally endorse the very beginning of Objectivist epistemology - I mean this: "Existence exists—and the act of grasping that statement implies two corollary axioms: that something exists which one perceives and that one exists possessing consciousness, consciousness being the faculty of perceiving that which exists." It's the subsequent development which is a mix of further gemlike insights, paths not taken, and errors or uncertainties that are papered over.

In the case of Buddhism, one has the usual problem of knowing, at this historical distance, exactly what psychological and logical content defined "enlightenment". One of its paradoxes is that it sounds like the experience of a phenomenological truth, and yet the key realization is often presented as the discovery that there is no true self or substantial self. I would have thought that achieving reflective consciousness implied the existence of a reflector, just as in the Objectivist account. Then again, reflection can also produce awareness that traits with which you have identified yourself are conditioned and contingent, so it can dissolve a naive concept of self, and that sounds more like the Buddhism we hear about today. The coexistence of a persistent observing consciousness, and a stream of transient identifications, in certain respects is like Hinduism; though the Buddhists can strike back by saying that the observing consciousness is not eternal and free of causality, it too exists only if it has been caused to exist.

So claims to knowledge, and the existence of a stage where you no longer doubt that this really is knowledge, and get on with developing the implications, do not in themselves imply falsity. In a systematic philosophy based on reason, a description which covers Objectivism, Buddhism, and Less-Wrong-ism, there really ought to be some notion of a development that occurs as you as learn.

The alternative is Zen Rationalism: if you meet a belief on the road (of life), doubt it! It's a good heuristic if you are beset by nonsense, and it even has a higher form in phenomenological or experiential rationalism, where you test the truth of a proposition about consciousness by seeing whether you can plausibly deny it, even as the experience is happening. But if you do this, even while you keep returning to beginner's mind, you should still be dialectically growing your genuine knowledge about the nature of reality.

Comment author: [deleted] 04 January 2013 04:32:16PM 0 points [-]

If the Randians are a cult, LW is a cult.

Like the others, the members just think it's unique in being valid.

Comment author: Desrtopa 04 January 2013 05:02:07PM 1 point [-]

If a person disagrees with Rand about a number of key beliefs, do they still count as a Randian?

Comment author: Peterdjones 10 January 2013 12:40:07AM 0 points [-]
Comment author: [deleted] 07 January 2013 10:32:28PM 0 points [-]

That depends for the most part on what "a number of key beliefs" is.

Comment author: timtyler 15 March 2012 11:28:01AM *  1 point [-]

Not just a cult - an END OF THE WORLD CULT.

My favourite documentary on the topic: The End of The World Cult.

Comment author: TheAncientGeek 20 May 2014 01:33:44PM 1 point [-]

add

  • Leader(s) are credited with expertise beyond that of conventional experts, in subjects they are not conventionally qualified in.

  • Studying conventional versions of subjects is deprecated in favour of in-group versions.

Comment author: JoshuaFox 20 May 2014 06:03:48PM *  1 point [-]

Also:

Associated with non-standard and non-monogamous sexual practices.

(Just some more pattern-matching on top of what you see in the parent and grandparent comment. I don't actually think this is a strong positive indicator.)

Comment author: TheAncientGeek 20 May 2014 06:24:58PM 1 point [-]

The usual version of that indicator is "leader has sex with followers"

Comment author: timtyler 15 March 2012 11:34:24AM 0 points [-]

Focusing on an apocalyptic scenario including extreme bad and good possibilities.

There seems to be some detailed substructure there - which I go over here.

Comment author: jimrandomh 15 March 2012 01:35:17AM 19 points [-]

Google's autocomplete has a problem, which has produced controversy in other contexts: when people want to know whether X is trustworthy, the most informative search they can make is "X scam". Generally speaking, they'll find no results and that will be reassuring. Unfortunately, Google remembers those searches, and presents them later as suggestions - implying that there might be results behind the query. Once the "X scam" link starts showing up in the autocomplete, people who weren't really suspicious of X click on it, so it stays there.

Comment author: beoShaffer 15 March 2012 02:42:56AM 3 points [-]

Personal anecdote warning. I semi-routinely google the phrase "X cult" when looking into organizations.

Comment author: CarlShulman 15 March 2012 03:02:46AM 9 points [-]

Does this ever work?

Comment author: beoShaffer 15 March 2012 04:07:12PM *  3 points [-]

I think so, but it's hard to say. I look into organizations infrequently enough that "semi-routinely" leaves me with a very small sample size. The one organization that had prominent cult results (not going to name it for obvious reasons) does have several worrying qualities, and they seem related to why it was called a cult. (edit: minor grammar/style fix)

Comment author: John_Maxwell_IV 15 March 2012 03:04:05AM 1 point [-]

Thanks; I updated the post to reflect this.

Comment author: Gabriel 15 March 2012 09:16:05PM 14 points [-]

Some things that might be problematic:

We use the latest insights from cognitive science, social psychology, probability theory, and decision theory to improve our understanding of how the world works and what we can do to achieve our goals.

I don't think we actually do that. Insights, sure, but latest insights? Also, it's mostly cognitive science and social psychology. The insights from probability and decision theory are more in the style of the simple math of everything.

Want to know if your doctor's diagnosis is correct? It helps to understand Bayes' Theorem.

This might sound weird to someone who hasn't already read the classic example about doctors not being able to calculate conditional probabilities. As if we believe Bayes' theorem magically grants us medical knowledge or something.
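The classic example alluded to here can be sketched in a few lines. The numbers below are the standard illustrative ones from the mammography version of the problem (1% prevalence, 80% sensitivity, 9.6% false-positive rate), not figures from the post:

```python
# Bayes' theorem applied to a diagnostic test: given a positive
# result, how likely is it that the patient actually has the disease?

def posterior(prior, sensitivity, false_positive_rate):
    """P(disease | positive test) by Bayes' theorem."""
    true_positives = prior * sensitivity
    false_positives = (1 - prior) * false_positive_rate
    return true_positives / (true_positives + false_positives)

p = posterior(prior=0.01, sensitivity=0.80, false_positive_rate=0.096)
print(round(p, 3))  # ~0.078, far below the ~0.7-0.8 most people guess
```

The point of the classic example is exactly this gap: the base rate dominates, and doctors asked this question famously tend to ignore it.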

[the link to rationality boot-camp]

I'm not a native speaker of English so I can't really tell, but I recall people complaining that the name 'boot-camp' is super creepy.

On the about page:

Introduce yourself to the community here.

That's not cultish-sounding but it's unnecessarily imperative. Introduction thread is optional.

Comment author: Douglas_Reay 15 March 2012 11:48:57AM 9 points [-]

There doesn't seem to be anyone arguing seriously that Less Wrong is a cult, but we do give some newcomers that impression.

The LW FAQ says:

Why do you all agree on so much? Am I joining a cult?

We have a general community policy of not pretending to be open-minded on long-settled issues for the sake of not offending people. If we spent our time debating the basics, we would never get to the advanced stuff at all. Yes, some of the results that fall out of these basics sound weird if you haven't seen the reasoning behind them, but there's nothing in the laws of physics that prevents reality from sounding weird.

I suspect that putting a more human face on the front page, rather than just linky text, would help.

Perhaps something like a YouTube video version of the FAQ, featuring two (ideally personable) people talking about what Less Wrong is and is not, and how to get started on it. For some people, seeing is believing. It is one thing to tell them there are lots of different posters here and we're not fanatics; but that doesn't have the same impact as watching an actual human with body cues talking.

Comment author: Shephard 16 March 2012 10:35:43PM 7 points [-]

I don't believe LW is a cult, but I can see where intelligent, critical thinking people might get that impression. I also think that there may be elitist and clannish tendencies within LW that are detrimental in ways that could stand to be (regularly) examined. Vigilance against irrational bias is the whole point here, right? Shouldn't that be embraced on the group level as much as on an individual one?

Part of the problem as I see it is that LW can't decide if it's a philosophy/science or a cultural movement.

For instance, as already mentioned, there's a great deal of jargon, and there's a general attitude of impatience for anyone not thoroughly versed in the established concepts and terminology. Philosophies and sciences also have this problem, but the widely accepted and respected philosophical and scientific theories have proven themselves to the world (and weren't taken very seriously until they did). I personally believe there's a lot of substance to the ideas here, but LW hasn't delivered anything dramatic to the world at large. Until it does so it may remain, in the eyes of outsiders, some kind of hybrid of Scientology and Objectivism - an insular group of people with a special language and a revered spokesperson, who claim to have "the answers".

If, however, LW is supposed to be a cultural movement, then I'm sorry, but "ur doin it wrong". Cultural movements gain momentum by being inclusive and organic, and by creating a forum for people to express themselves without fear of judgment. Movements are bottom up, and LW often gives the impression of being top down.

I'm not saying that a choice has to be made or even can be made, merely that there are conflicting currents here. I don't know if I have any great suggestions. I guess the one thing I can say is that while I've observed (am observing) a lot of debate and self-examination internally, there's still a strong outward impression of having found “the answers”. Perhaps if this community presented itself a little more as a forum for the active practice of critical thinking, and a little less as the authoritative source for an established methodology for critical thinking.

And if that doesn't work, we could always try bus ads.

Comment author: Grognor 15 March 2012 02:37:17AM 7 points [-]

In general, I think we could stand more community effort being put into optimizing our about page, which you can do now here.

Thank you for this.

(In light of my other comment, I should emphasize that I really mean that. It is not sarcasm or any other kind of irony.)

Comment author: IlyaShpitser 15 March 2012 05:31:26PM *  18 points [-]

Here's what an outsider might see:

"doomsday beliefs" (something "bad" may happen eschatologically, and we must work to prevent this): check

a gospel (The Sequences): check

vigorous assertions of untestable claims (Everett interpretation): check

a charismatic leader extracting a living from his followers: check

is sometimes called a cult: check

This is enough to make up a lot of minds, regardless of any additional distinctions you may want to make, sadly.

Comment author: Gabriel 15 March 2012 09:24:45PM 4 points [-]

But an outsider would have to spend some time here to see all those things. If they think LW is accurately described by the c-word even after getting acquainted with the site, there might be no point in trying to change their minds. It's better to focus on people who are discouraged by first impressions.

Comment author: advancedatheist 18 March 2012 02:41:04AM *  3 points [-]

I recently read an article about Keith Raniere, the founder of a cult called NXIVM (pronounced "nexium"):

http://www.timesunion.com/local/article/Secrets-of-NXIVM-2880885.php

Raniere reminds me of Yudkowsky, especially after reading cult expert Rick Ross's assessment of Raniere:

Rick Ross has been a cult tracker for more than 25 years. He has examined and spoken about NXIVM so extensively it spawned an ongoing federal lawsuit from Raniere for publicizing portions of NXIVM's training program. That legal battle with NXIVM, where he is countersuing, is entering its ninth year. Ross has been qualified and accepted as an expert witness regarding cults and cultlike groups in the courts of 10 states and has been used by the federal government as a consultant. He has spent 50 to 100 hours talking with NXIVM members, he said, and additional time talking with ex-members, which is why he said he's confident in his view that Raniere is a cult leader. Ross has been retained by three former NXIVM members to help in deprogramming, and he has counseled several others, including one he said was sent into a psychotic episode from her NXIVM experience. "In my opinion, NXIVM is one of the most extreme groups I have ever dealt with in the sense of how tightly wound it is around the leader, Keith Raniere," Ross said in an interview. Ross was asked to provide insight on David Koresh to the federal government during the height of the Waco situation and says Raniere shows characteristics similar to Koresh. Like the infamous leader of the Branch Davidians, Ross said, Raniere thinks he knows a way to reorder human existence, believes he is on the cutting edge of the new wave of the future, has followers who see him as a savior and uses his position of power to gain sexual favors from women.

Comment author: gRR 15 March 2012 11:59:17AM *  14 points [-]

I'm here for only a couple of months, and I didn't have any impression of cultishness. I saw only a circle of friends doing a thing together, and very enthusiastic about it.

What I also did see (and still do) is specific people just sometimes being slightly crazy, in a nice way. As in: Eliezer's treatment of MWI. Or the way too serious fear of weird acausal dangers that fall out of our current best decision theories.
Note: this impression is not because of craziness of the ideas, but because of taking them too seriously too early. However, the relevant posts always have sane critical comments, heavily upvoted.

I'm slightly more alarmed by posts like How would you stop Moore's Law?. I mean, seriously thinking of AI dangers is good. Seriously considering nuking Intel's fabs in order to stop the dangers is... not good.

Comment author: wedrifid 15 March 2012 02:03:55PM 2 points [-]

I saw only a circle of friends doing a thing together, and very enthusiastic about it.

That's a positive impression. People really look that enthusiastic and well bonded?

Comment author: gRR 15 March 2012 02:19:35PM 5 points [-]

Yes to well bonded. People here seem to understand each other far better than average on the net, and it is immediately apparent.

Enthusiastic is a wrong word, I suppose. I meant, sure of doing a good thing, happy to be doing it, etc, not in the sense of applauding and cheering.

Comment author: wedrifid 15 March 2012 02:35:02PM 4 points [-]

Yes to well bonded. People here seem to understand each other far better than average on the net, and it is immediately apparent.

Thank you. It is good to be reminded that these things are relative. Sometimes I forget to compare interactions to others on the internet and instead compare them to interactions with people as I would prefer them to be or even just interactions with people I know in person (and have rather ruthlessly selected for not being annoying).

Comment author: Luke_A_Somers 15 March 2012 04:19:50PM 1 point [-]

Agreed, except the treatment of MWI does not seem the least bit crazy to me. But what do I know - I'm a crazy physicist.

Comment author: roystgnr 15 March 2012 05:24:57PM 4 points [-]

The conclusions don't seem crazy (well, they seem "crazy-but-probably-correct", just like even the non-controversial parts of quantum mechanics), but IIRC the occasional emphasis on "We Have The One Correct Answer And You All Are Wrong" rang some warning bells.

On the other hand: Rationality is only useful to the extent that it reaches conclusions that differ from e.g. the "just believe what everyone else does" heuristic. Yet when any other heuristic comes up with new conclusions that are easily verified, or even new conclusions which sound plausible and aren't disproveable, "just believe what everyone else does" quickly catches up. So if you want a touchstone for rationality in an individual, you need to find a question for which rational analysis leads to an unverifiable, implausible sounding answer. Such a question makes a great test, but not such a great advertisement...

Comment author: syzygy 15 March 2012 08:12:32AM 5 points [-]

I have seen this problem afflict other intellectually-driven communities, and believe me, it is a very hard problem to shake. Be grateful we aren't getting media attention. The adage, "All press is good press", has definitely been proven wrong.

Comment author: John_Maxwell_IV 16 March 2012 03:26:38AM *  0 points [-]

I assume that my post has aggravated things? :o(

Comment author: antigonus 15 March 2012 07:31:31AM *  5 points [-]

The word "cult" never makes discussions like these easier. When people call LW cultish, they are mostly just expressing that they're creeped out by various aspects of the community - some perceived groupthink, say. Rather than trying to decide whether LW satisfies some normative definition of the word "cult," it may be more productive to simply inquire as to why these people are getting creeped out. (As other commenters have already been doing.)

Comment author: [deleted] 07 August 2012 11:46:21PM 0 points [-]

This exactly. It's safe to assume that when most people say some organization strikes them as being cultish, they're not necessarily keeping a checklist.

Comment author: shminux 18 March 2012 04:15:43AM 4 points [-]

I got a distinct cultish vibe when I joined, but only from the far-out parts of the site, like UFAI, not from the "modern rationality" discussions. When I raised the issue on #lesswrong, the reaction from most regulars was not very reassuring: somewhat negative and more emotional than rational. The same happened when I commented here. That's why I am looking forward to the separate rationality site, without the added untestable and (to me) useless EY idiosyncrasies, such as the singularity, UFAI and MWI.

Comment author: jacoblyles 18 August 2012 07:56:19PM 1 point [-]

We should try to pick up "moreright.com" from whoever owns it. It's domain-parked at the moment.

Comment author: arborealhominid 28 July 2013 12:40:52AM 1 point [-]

Moreright.net already exists, and it's a "Bayesian reactionary" blog- that is, a blog for far-rightists who are involved in the Less Wrong community. It's an interesting site, but it strikes me as decidedly unhelpful when it comes to looking uncultish.

Comment author: RomeoStevens 15 March 2012 02:44:09AM *  4 points [-]

so I shouldn't refer people to death spirals and baby eating right away?

Comment author: antigonus 15 March 2012 03:17:02AM 2 points [-]

Don't mindkill their cached thoughts.

Comment author: Kaj_Sotala 15 March 2012 07:04:41AM 0 points [-]

Offer them a hamster in a tutu first, that'll be cute and put them at ease.

Comment author: CasioTheSane 15 March 2012 04:05:24AM *  18 points [-]

Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?

I only discovered LW about a week ago, and I got the "cult" impression strongly at first, but decided to stick around anyway because I am having fun talking to you guys, and am learning a lot. The cult impression faded once I carefully read articles and threads on here and realized that they really are rational, well argued concepts rather than blindly followed dogma. However, it takes time and effort to realize this, and I suspect that the initial appearance of a cult would turn many people off from putting out that time and effort.

For a newcomer expecting discussions about practical ways to overcome bias and think rationally, the focus on things like transhumanism and singularity development seems very weird; those appear to be pseudo-religious ideas with no obvious connection to rationality or daily life.

AI and transhumanism are very interesting, but they are concepts distinct from rationality. I suggest moving singularity- and AI-specific articles to a different site, and removing the Singularity Institute and FHI links from the navigation bar.

There's also the problem of having a clearly defined leader, with strong controversial opinions which are treated like gospel. I would expect a community which discusses rationality to be more of an open debate/discussion between peers, without any philosophical leaders that everybody agrees with. I don't see any easy solution here, because Eliezer Yudkowsky's reputation here is well earned; he actually is exceptionally brilliant and rational.

I would also like to see more articles on how to avoid bias, and apply Bayesian methods to immediate present-day problems and decision making. How can we avoid bias and correctly interpret data from scientific experiments, and then apply this knowledge to make good choices about things such as improving our own health?

Comment author: jsteinhardt 15 March 2012 05:02:59AM 9 points [-]

Random nitpick: a substantial portion of LW disagrees with Eliezer on various issues. If you find yourself actually agreeing with everything he has ever said, then something is probably wrong.

Slightly less healthy for overall debate is that many people automatically attribute a toxic/weird meme to Eliezer whenever it is encountered on LW, even in instances where he has explicitly argued against it (such as utility maximization in the face of very small probabilities).

Comment author: beriukay 15 March 2012 04:52:17AM *  8 points [-]

Upvoted for sounding a lot like the kinds of complaints I've heard people say about LW and SIAI.

There is a large barrier to entry here, and if we want to win more, we can't just blame people for not understanding the message. I've been discussing with a friend what is wrong with LW pedagogy (though he admits that it is certainly getting better). To paraphrase his three main arguments:

  • We often use nomenclature without necessary explanation for a general audience. Sure, we make generous use of hyperlinks, but without some effort to bridge the gap in the body of our text, we aren't exactly signalling openness or friendliness.

  • We have a tendency to preach to the converted. Or as the friend said:

    It's that classic mistake of talking in a way where you're convincing or explaining something to yourself or the well-initiated instead of laying out the roadwork for foreigners.

    He brought up an example for how material might be introduced to newly exposed folk.

    If This American Life explained the financial crisis in an hour so that four million people improved on a written test on the subject, it's clear you can explain complicated material from near-scratch.

    The curse of knowledge can be overcome, but it takes desire and some finesse.

  • If we intend to win the hearts and minds of the people (or at least make a mark in the greater world), we might want to work on evocative imagery that appeals beyond futurists and technophiles and sci-fi geeks. Sure, keep the awesome stuff we have, but maybe look for metaphors that work in other domains. In my mind, ideally, we should build a database of ideas and their parallels in other fields (using some degree of field work to actually find the words that work). Eliezer has done some great work this way, like with HP:MoR and some of his short stories. Maybe the SIAI could shell out money to fund focus groups and interviews à la Luntz, who in my mind is a great Dark Side example of winning.

Edit for formatting and to mention that outreach and not seeming culty seem to be intertwined in a weird way. It is obvious to me that being The Esoteric Order Of LessWrong doesn't do the world any favors (or us, for that matter), but that by working on outreach, we can be accused of proselytizing. I think it comes down to doing what works without doing the death spiral stuff. And it seems to me that no matter what is done, detractors are going to detract.

Comment author: NancyLebovitz 15 March 2012 11:14:11AM 8 points [-]

If This American Life explained the financial crisis in an hour so that four million people improved on a written test on the subject, it's clear you can explain complicated material from near-scratch.

That's an inspiring goal, but it might be worth pointing out that the This American Life episode was extraordinary: when I heard it, it seemed immediately obvious that this was the most impressively clear and efficient hour I'd heard in the course of a lot of years of listening to NPR.

I'm not saying it's so magical that it can't be equaled, I'm saying that it might be worth studying.

Comment author: John_Maxwell_IV 15 March 2012 01:14:04AM 13 points [-]

Speaking for myself, I know of at least four people who know of Less Wrong/SI but are not enthusiasts, possibly due to atmosphere issues.

An acquaintance of mine attends Less Wrong meetups and describes most of his friends as being Less Wrongers, but doesn't read Less Wrong and privately holds reservations about the entire singularity thing, saying that we can't hope to say much about the future more than 10 years in advance. He told me that one of his coworkers is also skeptical of the singularity.

A math student/coder I met at an entrepreneurship event told me Less Wrong had good ideas but was "too pretentious".

I was interviewing for an internship once, and the interviewer and I realized we had a mutual acquaintance who was a Less Wronger and SI donor. He asked me if I was part of that entire group, and I said yes. His attitude was a bit derisive.

Comment author: timtyler 15 March 2012 11:47:38AM 4 points [-]

The FHI is trying to do a broadly similar thing from within academia. It seems less kooky and cultish, probably as a result of trying harder to avoid cultishness.

Comment author: michaelcurzi 16 March 2012 06:55:57PM 1 point [-]

I don't know why you would assume that it's "probably as a result of trying harder to avoid cultishness." My prior is that they just don't seem cultish because academics are often expected to hold unfamiliar positions.

Comment author: BrandonReinhart 17 March 2012 11:41:11PM *  4 points [-]

I will say that I feel 95% confident that SIAI is not a cult because I spent time there (mjcurzi was there also), learned from their members, observed their processes of teaching rationality, hung out for fun, met other people who were interested, etc. Everyone involved seemed well-meaning, curious, critical, etc. No one was blindly following orders. In the realm of teaching rationality, there was much agreement that it should be taught, some agreement on how, but total openness to failure and to finding alternate methods. I went to the minicamp wondering (along with John Salvatier) whether the SIAI was a cult and obtained lots of evidence to push me far away from that position.

I wonder if the cult accusation in part comes from the fact that it seems too good to be true, so we feel a need for defensive suspicion. Rationality is very much about changing one's mind, and thinking about this, we become suspicious that the goals of SIAI are to change our minds in a particular way. Then we discover that the SIAI's goals are in fact (in part) to change our minds in a particular way, so we think our suspicions are justified.

My model tells me that stepping into a church is several orders of magnitude more psychologically dangerous than stepping into a Less Wrong meetup or the SIAI headquarters.

(The other 5% goes to things like "they are a cult and totally duped me and I don't know it", "they are a cult and I was too distant from their secret inner cabals to discover it", "they are a cult and I don't know what to look for", "they aren't a cult but they want to be one and are screwing it up", etc. I should probably feel more confident about this than 95%, but my own inclination to be suspicious of people who want to change how I think means I'm being generous with my error. I have a hard time giving these alternate stories credit.)

Comment author: [deleted] 15 March 2012 07:09:48PM *  4 points [-]

Speaking for myself, I know of at least four people who know of Less Wrong/SI but are not enthusiasts, possibly due to atmosphere issues.

I would consider myself a pretty far outlier on LessWrong (as a female, ENFP (people-person, impulsive/intuitive), Hufflepuff type). So on one hand, my opinion may mean less, because I am not generally the "type" of person associated with LW. On the other hand, if you want to expand LW to more people, then I think some changes need to be made for other "types" of people to also feel comfortable here.

Along with the initial "cult" impression (which eventually dissipates, IMO), what threw me most is the harshness of the forums. I've been on here for about 4 months now, and it's still difficult for me to deal with. Also, I agree that topics like FAI and Singularitarianism aren't necessarily the best things to be discussing when trying to get people interested in rationality.

I am well-aware that the things that would make LW more comfortable for me and others like me, would make it less comfortable for many of the current posters. So there is definitely a conflict of goals.

Goal A: Grow LW and make rationality more popular. Need to make LW more "nice", and perhaps focused on Instrumental Rationality rather than Singularity and FAI issues.

Goal B: Maintain current culture and level of posts. Need to NOT significantly change LW, and perhaps focus more on the obscure posts that are extremely difficult for newer people to understand.

AFAICT, pursuit of either of these goals will be to the detriment of the other goal.

Comment author: NancyLebovitz 15 March 2012 10:56:48PM 0 points [-]

Could you be more specific about what comes off as harsh to you?

If you'd rather address this as a private message, I'm still interested.

Comment author: John_Maxwell_IV 15 March 2012 11:11:32PM 3 points [-]

What comes across as harsh to me: downvoting discussion posts because they're accidental duplicates or don't fit some idea of what a discussion post is supposed to be, a lot of downvoting that goes on in general, and unbridled or curt disagreement (like Grognor's response to my post. You saw him cursing and yelling, right? I made this post because I thought the Less Wrong community could use optimization on the topics I wrote about, not because I wanted to antagonize anyone.)

Comment author: [deleted] 15 March 2012 11:37:51PM 0 points [-]

PM'd response. General agreement with John below (which I didn't see until just now).

Comment author: Nisan 15 March 2012 07:29:27PM *  2 points [-]

A math student/coder I met at an entrepreneurship event told me Less Wrong had good ideas but was "too pretentious".

This person might have been in the same place as a math grad student I know. They read a little Less Wrong and were turned off. Then they attended a LW-style rationality seminar and responded positively, because it was more "compassionate". What they mean is this: A typical epistemology post on Less Wrong might sound something like

There are laws of probability; you can't just make up beliefs.

(That's not a quote.) Whereas the seminar sounded more like

We'll always have uncertainty, and we'll never be perfectly calibrated, but we can aspire to be better-calibrated.

Similarly, an instrumental-rationality post here might sound like

To the extent you fail to maximize some utility function, you can be Dutch-booked. Give me a penny to switch between these two gambles; give me another penny to switch back again. There: You have given me your two cents on the matter.

Whereas the seminar sounds more like

You must decide alone.
But you are not alone.

Of course, both approaches are good and necessary, and you can find both on Less Wrong.
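The Dutch-book quip above can be made concrete with a small sketch (hypothetical agent and prices, not from the thread): an agent whose preferences cycle, preferring A to B, B to C, and C to A, will happily pay a small fee for each swap and can be walked around the cycle indefinitely, ending where it started but poorer.

```python
# An agent with cyclic preferences: prefers A to B, B to C, and C to A.
prefers = {("A", "B"), ("B", "C"), ("C", "A")}

def money_pump(start, offers, wealth, fee=0.01):
    """Offer the agent a sequence of trades, charging a small fee for
    each swap it happily accepts (it accepts whenever it prefers the
    offered item to the one it holds)."""
    holding = start
    for offered in offers:
        if (offered, holding) in prefers:  # agent prefers the offer...
            holding = offered
            wealth -= fee                  # ...and pays for the swap
    return holding, wealth

holding, wealth = "A", 1.00
for _ in range(3):  # three laps around the preference cycle
    holding, wealth = money_pump(holding, ["C", "B", "A"], wealth)

print(holding, round(wealth, 2))  # back where it started, 9 cents poorer
```

A coherent (transitive) preference ordering would refuse at least one of the swaps, which is the point of the Dutch-book argument.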

Comment author: Daniel_Starr 22 March 2012 01:30:04PM *  3 points [-]

Ironically, I suspect the "cultlike" problem is that LessWrong/SI's key claims lack falsifiability.

Friendly AI? In the far future.

Self-improvement? Any program of mental self-improvement is suspected of being a cult, unless it trains a skill that outsiders are confident they can measure.

If I have a program for teaching people math, outsiders feel they know how they can check my claims - either my graduates are good at math or not.

But if I have a program for "putting you in touch with your inner goddess", how are people going to check my claims? For all outsiders know I'm making people feel good, or feel good about me, without actually making them meaningfully better.

Unfortunately, the external falsifiability of LW/SI's merits is more like the second case than the first. Especially, I suspect, for people who aren't already big fans of mathematics, information theory, probability, and potential AI.

Organization claims to improve a skill anyone can easily check = school. Organization claims to improve a quality that outsiders don't even know how to measure = cult.

If and when LW/SI can headline more easily falsifiable claims, it will be less cultlike.

I don't know if this is an immediately solvable problem, outside of developing other aspects of LW/SI that are more obviously useful/impressive to outsiders, and/or developing a generation of LW/SI fans who are indeed "winners" as rationalists ideally would be.

Comment author: roryokane 24 May 2014 06:18:57AM 1 point [-]

PredictionBook might help with measuring improvement, in a limited way. You can use it to measure how often your predictions are correct, and whether you are getting better over time. And you could theoretically ask LW-ers and non-LW-ers to make some predictions on PredictionBook, and then compare their accuracy to see if Less Wrong helped. Making accurate probability predictions is a real skill that could certainly be very useful, though it depends on what you're predicting.
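The kind of calibration check PredictionBook enables can be sketched in a few lines (the prediction data below is hypothetical, purely for illustration): bucket predictions by stated confidence and compare against how often they actually came true.

```python
from collections import defaultdict

def calibration_report(predictions):
    """Group (stated_confidence, came_true) pairs into confidence
    buckets and report the observed frequency in each bucket."""
    buckets = defaultdict(list)
    for confidence, came_true in predictions:
        # Bucket by decile of stated confidence, e.g. 0.72 -> 0.7.
        buckets[round(confidence, 1)].append(came_true)
    report = {}
    for confidence in sorted(buckets):
        outcomes = buckets[confidence]
        report[confidence] = (sum(outcomes) / len(outcomes), len(outcomes))
    return report

# Hypothetical history: (stated confidence, whether it came true).
history = [(0.7, True), (0.7, True), (0.7, False), (0.7, True),
           (0.9, True), (0.9, True), (0.9, False),
           (0.5, True), (0.5, False)]

for confidence, (observed, n) in calibration_report(history).items():
    print(f"said {confidence:.0%}, came true {observed:.0%} of {n}")
```

A well-calibrated forecaster's 70% bucket comes true about 70% of the time; with real data you would also want far more predictions per bucket than this toy history has.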

Comment author: Bugmaster 15 March 2012 08:33:10PM 12 points [-]

Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?

What do you mean, "initially" ? I am still getting that impression ! For example, just count the number of times Eliezer (who appears to only have a single name, like Prince or Jesus) is mentioned in the other comments on this post. And he's usually mentioned in the context of, "As Eliezer says...", as though the mere fact that it is Eliezer who says these things was enough.

The obvious counter-argument to the above is, "I like the things Eliezer says because they make sense, not because I worship him personally", but... well... that's what one would expect a cultist to say, no?

Less Wrongers also seem to have their own vocabulary ("taboo that term or risk becoming mind-killed, which would be un-Bayesian"). We spend a lot of time worrying about doomsday events that most people would consider science-fictional (at best). We also cultivate a vaguely menacing air of superiority, as we talk about uplifting the ignorant masses by spreading our doctrine of rationality. As far as warning signs go, we've got it covered...

Comment author: epicureanideal 16 March 2012 02:41:45AM 8 points [-]

Specialized terminology is really irritating to me personally, and off-putting to most new visitors I would think. If you talk to any Objectivists or other cliques with their own internal vocabulary, it can be very bothersome. It also creates a sense that the group is insulated from the rest of the world, which adds to the perception of cultishness.

Comment author: [deleted] 15 March 2012 08:41:58PM 6 points [-]

We also cultivate a vaguely menacing air of superiority, as we talk about uplifting the ignorant masses by spreading our doctrine of rationality

I think the phrase 'raising the sanity waterline' is a problem. As is the vaguely religious language, like 'litany of Tarski'. I looked up the definition of 'litany' to make sure I was picking up on a religious denotation and not a religious connotation, and here's what I got:

A series of petitions for use in church services, usually recited by the clergy and responded to in a recurring formula by the people.

Not a great word, I think. Also 'Bayesian Conspiracy.' There's no conspiracy, and there shouldn't be.

Comment author: Bugmaster 15 March 2012 08:57:01PM 7 points [-]

Agreed. I realize that the words like "litany" and "conspiracy" are used semi-ironically, but a newcomer to the site might not.

Comment author: William_Quixote 07 September 2012 03:16:31AM 1 point [-]

This wording may lose a few people, but it probably helps for many people as well. The core subject matter of rationality could very easily be dull or dry or "academic". The tongue-in-cheek and occasionally outright goofy humor makes the sequences a lot more fun to read.

The tone may have costs, but not being funny has costs too. If you think back to college, more professors have students tune out by being boring than by being esoteric.

Comment author: Jiro 20 May 2014 05:35:28PM *  0 points [-]

(Responding to old post.)

One problem with such ironic usage is that people tend to joke about things that cause themselves stress, and that includes uncomfortable truths or things that are getting too close to the truth. It's why it actually makes sense to detain people making bomb jokes in airports. So just because the words are used ironically doesn't mean they can't reasonably be taken as signs of a cult--even by people who recognize that they are being used ironically.

(Although this is somewhat mitigated by the fact that many cults won't allow jokes about themselves at all.)

Comment author: Eneasz 16 March 2012 10:12:46PM 1 point [-]

You'd have to be new to the entire internet to think those are being used seriously. And if you're THAT new, there's really very little that can be done to prevent misunderstanding no matter where you first land.

On top of that, it's extremely unlikely that someone very new to the internet would start their journey at LessWrong.

Comment author: Martin-2 21 October 2012 07:33:28AM 0 points [-]

Eliezer (who appears to only have a single name, like Prince or Jesus)

Mr. Jesus H. Christ is a bad example. Also there's this.

Comment author: Viliam_Bur 15 March 2012 01:20:40PM 11 points [-]

Defending oneself from the cult accusation just makes it worse. Did you write a long explanation of why you are not a cult? Well, that's exactly what a cult would do, isn't it?

To be accused is to be convicted, because the allegation is unfalsifiable.

Trying to explain something draws more attention to the topic, from which people will notice only the keywords. The more complex an explanation you make, especially if it requires reading some of your articles, the worse it gets.

The best way to win is to avoid the topic.

Unfortunately, someone else can bring up this topic and be persistent enough to make it visible. (Did it really happen on a sufficient scale, or are we just creating it in our own imagination?) Then the best way is to make some short (not necessarily rational, but cached-thought convincing) answer and then avoid the topic. For example: "So, what exactly is that evil thing people on LW did? Downvote someone's forum post? Seriously, guys, you need to get some life."

And now, everybody stop worrying and get some life. ;-)

It could also help to make the site seem a bit less serious. For example, put more emphasis on instrumental rationality on the front page. People discussing best diet habits don't seem like a doomsday cult, right?

The Sequences could be recommended somewhat differently, for example: "In this forum we sometimes discuss some complicated topics. To make the discussion more efficient and avoid endlessly repeating the same arguments about statistics, evolution, quantum mechanics, et cetera, it is recommended to read the Sequences." Not like 'you have to do this', but rather like 'read the FAQ, please'. Also in discussion, instead of "read the Sequences" it is better to recommend one specific sequence, or one article.

Relax, be friendly. But don't hesitate to downvote a stupid post, even if the downvotee threatens to accuse you of whatever.

Comment author: roystgnr 15 March 2012 05:12:07PM 7 points [-]

People discussing best diet habits don't seem like a doomsday cult, right?

I'm having trouble thinking up examples of cults, real or fictional, that don't take an interest in what their members eat and drink.

Comment author: epicureanideal 16 March 2012 02:34:55AM 2 points [-]

I don't think the best way to win is to avoid the topic. A healthy discussion of false impressions and how to correct them, or other failings a group may have, is a good indication to me of a healthy community. This post for example caused my impression of LW to increase somewhat, but some of the responses to it have caused my impression to decrease below its original level.

Comment author: Viliam_Bur 16 March 2012 08:44:05AM *  3 points [-]

Then let's discuss "false impressions", or even better "impressions" in general, not focusing on cultishness, which cannot even be defined (because there are so many different kinds of cults). If we focus on making things right, we do not have to discuss a hundred ways they could go wrong.

What is our community (trying to be) like?

Friendly. In more senses of the word: we speak about ethics, we are trying to make a nice community, we try to help each other become stronger and win.

Rational. Instead of superstition and gossip, we discuss how and why things really happen. Instead of happy death spirals, we learn about the world around us.

Professional. By that I do not mean that everyone here is an AI expert, but that the things we do and value here (studying, politeness, exactness, science) are things that for most people correlate positively with their jobs, rather than free time. Even when we have fun, it's adult people having fun.

So where exactly in the space of human organizations do we belong? Which of the cached thoughts can be best applied to us? People will always try to fit us to some existing model (for example: cult), so why not choose this model rationally? I am not sure, but "educational NGO" sounds close. Science, raising the sanity waterline, et cetera. By seeming like something well-known, we become less suspicious, more normal.

Comment author: MugaSofer 10 January 2013 12:15:46PM 1 point [-]

The Sequences could be recommended somewhat differently, for example: "In this forum we sometimes discuss some complicated topics. To make the discussion more efficient and avoid endlessly repeating the same arguments about statistics, evolution, quantum mechanics, et cetera, it is recommended to read the Sequences." Not like 'you have to do this', but rather like 'read the FAQ, please'. Also in discussion, instead of "read the Sequences" it is better to recommend one specific sequence, or one article.

This.

Seriously, we need to start doing all the stuff recommended here, but this is perhaps the simplest and most immediate. Someone go do it.

Comment author: Grognor 15 March 2012 02:22:38AM *  44 points [-]

AAAAARRRGH! I am sick to death of this damned topic. It has been done to death.

I have become fully convinced that even bringing it up is actively harmful. It reminds me of a discussion on IRC, about how painstakingly and meticulously Eliezer idiot-proofed the sequences, and it didn't work because people still manage to be idiots about it. It's because of the Death Spirals and the Cult Attractor sequence that people bring up the stupid "LW is a cult hur hur" meme, which would be great dramatic irony if you were reading a fictional version of the history of Less Wrong, since it's exactly what Eliezer was trying to combat by writing it. Does anyone else see this? Is anyone else bothered by:

Eliezer: Please, learn what turns good ideas into cults, and avoid it!
Barely-aware public: Huh, wah? Cults? Cults! Less Wrong is a cult!

&

Eliezer: Do not worship a hero! Do not trust!
Rationalwiki et al: LW is a personality cult around Eliezer because of so-and-so.

Really, am I the only one seeing the problem with this?

People thinking about this topic just seem to instantaneously fail basic sanity checks. I find it hard to believe that people even know what they're saying when they parrot out "LW looks kinda culty to me" or whatever. It's like people only want to convey pure connotation. Remember sneaking in connotations, and how you're not supposed to do that? How about, instead of saying "LW is a cult", "LW is bad for its members"? This is an actual message, one that speaks negatively of LW but contains more information than negative affective valence. Speaking of which, one of the primary indicators of culthood is unresponsiveness to or dismissal of criticism. People regularly accuse LW of this, which is outright batshit. XiXiDu regularly posts SIAI criticism, and it always gets upvoted, no matter how wrong. Not to mention all the other posts (more) disagreeing with claims in what are usually called the Sequences, all highly upvoted by Less Wrong members.

The more people at Less Wrong naively speculate on how the community appears from the outside, throwing around vague negative-affective-valence words and phrases like "cult" and "telling people exactly how they should be", the worse this community will be perceived, and the worse this community will be. I reiterate: I am sick to death of people playing color politics on "whether LW is a cult" without making the discussion precise and explicit rather than vague and implicit, without taking into account that dissent is not only tolerated but encouraged here, without remembering that their brains instantly mark "cult" as being associated with wherever it's seen, or any of a million other factors. The "million other factors" is, I admit, a poor excuse, but I am out of breath and emotionally exhausted; forgive the laziness.

Everything that should have needed to be said about this has been said in the Cult Attractor sequence, and, from the Less Wrong wiki FAQ:

We have a general community policy of not pretending to be open-minded on long-settled issues for the sake of not offending people. If we spent our time debating the basics, we would never get to the advanced stuff at all. Yes, some of the results that fall out of these basics sound weird if you haven't seen the reasoning behind them, but there's nothing in the laws of physics that prevents reality from sounding weird.

Talking about this all the time makes it worse, and worse every time someone talks about it.

What the bleeding fuck.

Comment author: cousin_it 15 March 2012 07:06:38AM *  25 points [-]

LW doesn't do as much as I'd like to discourage people from falling into happy death spirals about LW-style rationality, like this. There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative. If I were Eliezer, I would run screaming in the other direction the moment I saw the first such person, but he seems to be okay with that. That's the main reason why I feel LW is becoming more cultish.

Comment author: Wei_Dai 15 March 2012 08:40:52AM 13 points [-]

There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative. If I were Eliezer, I would run screaming in the other direction the moment I saw the first such person

You mean when he saw himself in the mirror? :)

Seriously, do you think sacrificing one's life to help build FAI is wrong (or not necessarily wrong but not an ethical imperative either), or is it just bad PR for LW/SI to be visibly associated with such people?

Comment author: cousin_it 15 March 2012 11:02:10AM *  6 points [-]

I think it's not an ethical imperative unless you're unusually altruistic.

Also I feel the whole FAI thing is a little questionable from a client relations point of view. Rationality education should be about helping people achieve their own goals. When we meet someone who is confused about their goals, or just young and impressionable, the right thing for us is not to take the opportunity and rewrite their goals while we're educating them.

Comment author: Wei_Dai 16 March 2012 09:02:38PM 6 points [-]

It's hard not to rewrite someone's goals while educating them, because one of our inborn drives is to gain the respect and approval of people around us, and if that means overwriting some of our goals, well, that's a small price to pay as far as that part of our brain is concerned. For example, I stayed for about a week at the SIAI house a few years ago when attending the decision theory workshop, and my values shifted in obvious ways just by being surrounded by more altruistic people and talking with them. (The effect largely dissipated after I left, but not completely.)

I think it's not an ethical imperative unless you're unusually altruistic.

Presumably the people they selected for the rationality mini-camp were already more altruistic than average, and the camp itself pushed some of them to the "unusually altruistic" level. Why should SIAI people have qualms about this (other than possible bad PR)?

Comment author: Vladimir_Nesov 15 March 2012 05:22:46PM 6 points [-]

I think it's not an ethical imperative unless you're unusually altruistic.

I don't think "unusually altruistic" is a good characterization of "doesn't value personal preferences about some life choices more than the future of humanity"...

Comment author: Vaniver 16 March 2012 10:59:57PM 4 points [-]

It doesn't sound like you know all that many humans, then. In most times and places, the "future of humanity" is a signal that someone shouldn't be taken seriously, not an actual goal.

Comment author: Vladimir_Nesov 16 March 2012 11:08:02PM *  1 point [-]

In most times and places, the "future of humanity" is a signal that someone shouldn't be taken seriously, not an actual goal.

I was talking about the future of humanity, not the "future of humanity" (a label that can be grossly misinterpreted).

Comment author: cousin_it 15 March 2012 05:38:33PM *  3 points [-]

Do you believe most people are already quite altruistic in that sense? Why? It seems to me that many people give lip service to altruism, but their actions (e.g. reluctance to donate to highly efficient charities) speak otherwise. I think rationality education should help people achieve the goals they're already trying to achieve, not the goals that the teacher wants them to achieve.

Comment author: Vladimir_Nesov 15 March 2012 05:57:12PM 7 points [-]

I think rationality education should help people achieve the goals they're already trying to achieve, not the goals that the teacher wants them to achieve.

False dichotomy. Humans are not automatically strategic, we often act on urges, not goals, and even our explicitly conceptualized goals can be divorced from reality, perhaps more so than the urges. There are general purpose skills that have an impact on behavior (and explicit goals) by correcting errors in reasoning, not specifically aimed at aligning students' explicit goals with those of their teachers.

Comment author: cousin_it 15 March 2012 06:31:56PM *  8 points [-]

Rationality is hard to measure. If LW doesn't make many people more successful in mundane pursuits but makes many people subscribe to the goal of FAI, that's reason to suspect that LW is not really teaching rationality, but rather something else.

(My opinions on this issue seem to become more radical as I write them down. I wonder where I will end up!)

Comment author: Vladimir_Nesov 15 March 2012 06:46:13PM *  5 points [-]

I didn't say anything about "rationality". Whether the lessons help is a separate question from whether they're aimed at correcting errors of reasoning or at shifting one's goals in a specific direction. The posts I linked also respond to the objection about people "giving lip service to altruism" but doing little in practice.

Comment author: cousin_it 15 March 2012 07:09:11PM *  1 point [-]

Yes, the reasoning in the linked posts implies that deep inside, humans should be as altruistic as you say. But why should I believe that reasoning? I'd feel a lot more confident if we had an art of rationality that made people demonstrably more successful in mundane affairs and also, as a side effect, made some of them support FAI. If we only get the side effect but not the main benefit, something must be wrong with the reasoning.

Comment author: William_Quixote 07 September 2012 03:27:04AM 1 point [-]

Rationality is hard to measure. If LW doesn't make many people more successful in mundane pursuits but makes many people subscribe to the goal of FAI, that's reason to suspect that LW is not really teaching rationality, but rather something else.

If prediction markets were legal, we could much more easily measure whether LW helps rationality. Just ask people to make n bets or predictions per month and see 1) if they did better than the population average and 2) if they improved over time.

In fact, trying to get Intrade legalized in the US might be a very worthwhile project for just this reason (beyond all the general social reasons to like prediction markets).
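The measurement scheme above can be sketched in a few lines of code. This is a minimal sketch, not anything LW or PredictionBook actually runs; the Brier score (a standard rule for grading probabilistic predictions) stands in for whatever scoring one would use, and the data and baseline are hypothetical:

```python
# Grade a batch of probabilistic predictions with the Brier score (mean
# squared error between the stated probability and the 0/1 outcome; lower
# is better), then compare a user's score to a population baseline and
# across months to check for improvement.

def brier_score(predictions):
    """predictions: list of (probability, outcome) pairs, outcome in {0, 1}."""
    return sum((p - o) ** 2 for p, o in predictions) / len(predictions)

# Hypothetical data: one user's predictions in two successive months.
month1 = [(0.9, 1), (0.7, 0), (0.6, 1)]
month2 = [(0.8, 1), (0.3, 0), (0.9, 1)]

population_average = 0.25  # assumed baseline Brier score for comparison

did_better_than_average = brier_score(month1) < population_average
improved_over_time = brier_score(month2) < brier_score(month1)
```

With enough predictions per person, trends in these two comparisons would be exactly the evidence the comment asks for.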

Comment author: gwern 07 September 2012 05:40:48PM *  3 points [-]

There is no need to wish or strive for regulatory changes that may never happen: I've pointed out in the past that non-money prediction markets generally are pretty accurate and competitive with money prediction markets; so money does not seem to be a crucial factor. Just systematic tracking and judgment.

(Being able to profit may attract some people, like me, but the fear of loss may also serve as a potent deterrent to users.)

I have written at length about how I believe prediction markets helped me, but I have been helped even more by the free, active, you-can-sign-up-right-now-and-start-using-it-really-right-now http://www.PredictionBook.com

I routinely use LW-related ideas and strategies in predicting, and I believe my calibration graph reflects genuine success at predicting.

Comment author: cousin_it 07 September 2012 08:22:23AM 1 point [-]

Very nice idea, thanks! After some googling I found someone already made this suggestion in 2009.

Comment author: Luke_A_Somers 15 March 2012 06:29:56PM 1 point [-]

I think it's not an ethical imperative unless you're unusually altruistic.

... or you estimate the risk to be significant and you want to live past the next N years.

Comment author: cousin_it 15 March 2012 06:40:59PM *  7 points [-]

I don't think this calculation works out, actually. If you're purely selfish (don't care about others at all), and the question is whether to devote your whole life to developing FAI, then it's not enough to believe that the risk is high (say, 10%). You also need to believe that you can make a large impact. Most people probably wouldn't agree to surrender all their welfare just to reduce the risk to themselves from 10% to 9.99%, and realistically their sacrifice won't have much more impact than that, because it's hard to influence the whole world.

Comment author: RichardKennaway 15 March 2012 07:27:06AM 14 points [-]

How do you distinguish a happy death spiral from a happy life spiral? Wasting one's life on a wild goose chase from spending one's life on a noble cause?

"I take my beliefs seriously, you are falling into a happy death spiral, they are a cult."

Comment author: cousin_it 15 March 2012 07:52:37AM *  9 points [-]

I guess you meant to ask, "how do you distinguish ideas that lead to death spirals from ideas that lead to good things?" My answer is that you can't tell by looking only at the idea. Almost any idea can become a subject for a death spiral if you approach it the wrong way (the way Will_Newsome wants you to), or a nice research topic if you approach it right.

Comment author: Will_Newsome 19 March 2012 03:13:16AM *  9 points [-]

(the way Will_Newsome wants you to),

I've recanted; maybe I should say so somewhere. I think my post on the subject was sheer typical mind fallacy. People like Roko and XiXiDu are clearly damaged by the "take things seriously" meme, and what it means in my head is not what it means in the heads of various people who endorse the meme.

Comment author: XiXiDu 15 March 2012 10:54:49AM *  7 points [-]

There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative.

I have always been extremely curious about this. Do people really sacrifice their lives, or is it largely just empty talk?

It seems like nobody is doing anything they wouldn't have done anyway. I mean, I don't think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks. They are just the kind of people who enjoy doing what they do.

Are there people who'd rather play games all day but sacrifice their lives to solve friendly AI?

Comment author: cousin_it 15 March 2012 12:35:19PM *  11 points [-]

If developing AGI were an unequivocally good thing, as Eliezer used to think, then I guess he'd be happily developing AGI instead of trying to raise the rationality waterline. I don't know what Luke would do if there were no existential risks, but I don't think his current administrative work is very exciting for him. Here's a list of people who want to save the world and are already changing their life accordingly. Also there have been many LW posts by people who want to choose careers that maximize the probability of saving the world. Judge the proportion of empty talk however you want, but I think there are quite a few fanatics.

Comment author: Will_Newsome 19 March 2012 03:18:01AM 5 points [-]

Indeed, Eliezer once told me that he was a lot more gung-ho about saving the world when he thought it just meant building AGI as quickly as possible.

Comment author: katydee 16 March 2012 05:29:43PM 10 points [-]

I don't think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lifes if there were no existential risks. They are just the kind of people who enjoy doing what they do.

I think at one point Eliezer said that, if not for AGI/FAI/singularity stuff, he would probably be a sci-fi writer. Luke explicitly said that when he found out about x-risks he realized that he had to change his life completely.

Comment author: Will_Newsome 19 March 2012 03:22:56AM *  9 points [-]

I have always been extremely curious about this. Do people really sacrifice their lives, or is it largely just empty talk?

I sacrificed some very important relationships and the life that could have gone along with them so I could move to California, and the only reason I really care about humans in the first place is because of those relationships, so...

This is the use of metaness: for liberation—not less of love but expanding of love beyond local optima.

— Nick Tarleton's twist on T.S. Eliot

Comment author: Tripitaka 15 March 2012 06:15:47PM *  6 points [-]
  1. Due to comparative advantage, not changing much is actually a relatively good, straightforward strategy: just farm money and redirect it.
  2. As an example of these Altruistic Ones, user Rain has been mentioned, so they are out there. May they all be praised!
  3. Factor in time and demographics. A lot of LWers are young people, looking for ways to make money; they are not able to spend much yet, and haven't had much impact yet. Time will have to show whether they stay true to their goals, or whether they are tempted to go the vicious path of always-growing investments into status.
Comment author: drethelin 16 March 2012 05:42:30PM 0 points [-]

I'm too irreparably lazy to actually change my life but my charitable donations are definitely affected by believing in FAI.

Comment author: Gabriel 15 March 2012 07:17:30PM 3 points [-]

There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative.

Sacrificing or devoting? Those are different things. If FAI succeeds they will have a lot more life to party than they would have otherwise so devoting your life to FAI development might be a good bet even from a purely selfish standpoint.

Comment author: [deleted] 07 August 2012 11:52:36PM 0 points [-]

Pascal? Izzat you?

Comment author: Gabriel 08 August 2012 09:04:41PM 0 points [-]

That comment doesn't actually argue for contributing to FAI development. So I guess I'm not Pascal (damn).

Comment author: [deleted] 08 August 2012 11:13:50PM 1 point [-]

You probably don't wanna be Pascal anyway. I'm given to understand he's been a metabolic no-show for about 350 years.

Comment author: Grognor 15 March 2012 07:24:13AM 1 point [-]

I agree entirely. That post made me go "AAAH" and its rapid karma increase at first made me go "AAAAHH"

Comment author: John_Maxwell_IV 15 March 2012 03:14:19AM *  21 points [-]

My post was mostly about how to optimize appearances, with some side speculation on how our current appearances might be filtering potential users. I agree LW rocks in general. I think we're mostly talking past each other; I don't see this discussion post as fitting into the same genre of "serious LW criticism" as the other stuff you link to.

In other words, I'm talking about first impressions, not in-depth discussions.

I'd be curious where you got the idea that writing the cult sequence was what touched off the "LW cult" meme. That sounds pretty implausible to me. Keep in mind that no one who is fully familiar with LW is making this accusation (that I know of), but it does look like it might be a reaction that sometimes occurs in newcomers.

Let's keep in mind that LW being bad is a logically distinct proposition, and if it is bad, we want to know it (since we want to know what is true, right?).

And if we can make optimizations to LW culture to broaden participation from intelligent people, that's also something we want to do, right? Although, on reflection, I'm not sure I see an opportunity for improvement where this is concerned, except maybe on the wiki (but I do think we could stand to be a bit nicer everywhere).

XiXiDu regularly posts SIAI criticism, and it always gets upvoted, no matter how wrong. Not to mention all the other posts disagreeing with claims in what are usually called the Sequences, all highly upvoted by Less Wrong members.

Criticism rocks dude. I'm constantly realizing that I did something wrong and thinking that if I had a critical external observer maybe I wouldn't have persisted in my mistake for so long. Let's keep this social norm up.

Comment author: Grognor 15 March 2012 04:18:30AM 10 points [-]

My post was mostly about how to optimize appearances, with some side speculation on how our current appearances might be filtering potential users.

Okay.

If we want to win, it might not be enough to have a book length document explaining why we're not a cult. We might have to play the first impressions game as well.

I said stop talking about it and implied that maybe it shouldn't have been talked about so openly in the first place, and here you are talking about it.

I'd be curious where you got the idea that writing the cult sequence was what touched off the "LW cult" meme.

Where else could it have come from? Eliezer's extensive discussion of cultish behavior gets automatically pattern-matched into helpless cries of "LW is not a cult!" (even though that isn't what he's saying and isn't what he's trying to say), and this gets interpreted as, "LW is a cult." Seriously, any time you put two words together like that, people assume they're actually related.

Elsewise, the only thing I can think of is our similar demographics and a horribly mistaken impression that we all agree on everything (I don't know where this comes from).

Criticism rocks dude.

Okay. (I hope you didn't interpret anything I said as meaning otherwise.)

Comment author: John_Maxwell_IV 15 March 2012 04:25:14AM *  4 points [-]

Point taken; I'll leave the issue alone for now.

Comment author: Antisuji 16 March 2012 12:59:50AM *  5 points [-]

Ya know, if LW and SIAI are serious about optimizing appearances, they might consider hiring a Communications professional. PR is a serious skill and there are people who do it for a living. Those people tend to be on the far end of the spectrum of what we call neurotypical here. That is, they are extremely good at modeling other people, and therefore predicting how other people will react to a sample of copy. I would not be surprised if literally no one who reads LW regularly could do the job adequately.

Edit to add: it's nice to see that they're attempting to do this, but again, LW readership is probably the wrong place to look for this kind of expertise.

Comment author: wedrifid 16 March 2012 02:24:16AM 6 points [-]

Ya know, if LW and SIAI are serious about optimizing appearances, they might consider hiring a Communications professional. PR is a serious skill and there are people who do it for a living.

People who do this for a living (effectively) cost a lot of money. Given the budget of SIAI, putting a communications professional on the payroll at market rates represents a big investment. Transitioning a charity to a state where a large amount of income goes into improving perception (and so securing more income) is a step not undertaken lightly.

Comment author: NancyLebovitz 16 March 2012 09:49:12AM 6 points [-]

It's at least plausible that a lot of the people who can be good for SIAI would be put off more by professional marketing than by science fiction-flavored weirdness.

Comment author: Antisuji 16 March 2012 03:59:58AM 1 point [-]

That's a good point. I'm guessing though that there's a lot of low hanging fruit, e.g. a front page redesign, that would represent a more modest (and one-time) expense than hiring a full-time flack. In addition to costing less this would go a long way to mitigate concerns of corruption. Let's use the Pareto Principle to our advantage!

Comment author: wedrifid 15 March 2012 03:52:19AM 13 points [-]

AAAAARRRGH! I am sick to death of this damned topic.

It looks a bit better if you consider the generalization in the intro to be mere padding around a post that is really about several specific changes that need to be made to the landing pages.

Comment author: John_Maxwell_IV 15 March 2012 11:46:50PM 3 points [-]

Unfortunately, Grognor reverts me every time I try to make those changes... Bystanders, please weigh in on this topic here.

Comment author: Vladimir_Nesov 16 March 2012 12:10:26AM *  3 points [-]

I didn't like your alternative for the "Many of us believe" line either, even though I don't like that line (it was what I came up with to improve on Luke's original text). To give the context: the current About page introduces twelve virtues with:

Many of us believe in the importance of developing qualities described in "Twelve Virtues of Rationality":

John's edit was to change it to:

For a brief summary of how to be rational, read the somewhat stylized "Twelve Virtues of Rationality":

P.S. I no longer supervise the edits to the wiki, but someone should...

Comment author: John_Maxwell_IV 16 March 2012 12:56:30AM *  0 points [-]

He didn't like my other three attempts at changes either... I could come up with 10 different ways of writing that sentence, but I'd rather let him make some suggestions.

Comment author: wedrifid 16 March 2012 02:51:29AM 3 points [-]

He didn't like my other three attempts at changes either... I could come up with 10 different ways of writing that sentence, but I'd rather let him make some suggestions.

If you made the suggestions here and received public support for one of them it wouldn't matter much what Grognor thought.

Comment author: John_Maxwell_IV 16 March 2012 03:01:41AM 0 points [-]

Why don't you make a suggestion?

Comment author: wedrifid 16 March 2012 03:04:48AM *  3 points [-]

*cough* Mine is 'delete the sentence entirely'. I never really liked that virtues page anyway!

Comment author: John_Maxwell_IV 16 March 2012 03:06:17AM 2 points [-]

Sounds like a great idea.

Comment author: lessdazed 20 March 2012 07:50:08PM 1 point [-]

I entirely agree with this.

Comment author: John_Maxwell_IV 17 March 2012 05:52:39AM 1 point [-]

To be clear, you are in favor of leaving the virtues off of the about page, correct?

Comment author: wedrifid 17 March 2012 06:01:03AM 1 point [-]

For what it is worth, yes.

Comment author: XiXiDu 15 March 2012 10:49:06AM 6 points [-]

People regularly accuse LW of this, which is outright batshit. XiXiDu regularly posts SIAI criticism, and it always gets upvoted, no matter how wrong.

Thanks for saying that I significantly helped to make Less Wrong look less cultish ;-)

By the way...

Comment author: jimrandomh 15 March 2012 07:58:14PM 1 point [-]

Actually, I believe what he said was that you generated evidence that Less Wrong is not cultish, which makes it look more cultish to people who aren't thinking carefully.

Comment author: epicureanideal 16 March 2012 02:48:30AM 9 points [-]

A rambling, cursing tirade against a polite discussion of things that might be wrong with the group (or perceptions of the group) doesn't improve my perception of the group. I have to say, I have a significant negative impression from Grognor's response here. In addition to the tone of his response, a few things that added to this negative impression were:

"how painstakingly and meticulously Eliezer idiot-proofed the sequences, and it didn't work because people still manage to be idiots about it"

Again, the name dropping of Our Glorious Leader Eliezer, long may He reign. (I'm joking here for emphasis.)

"LW is a cult hur hur"

People might not be thinking completely rationally, but this kind of characterization of people who have negative opinions of the group doesn't win you any friends.

"since it's exactly what Eliezer was trying to combat by writing it."

There's Eliezer again, highlighting his importance as the group's primary thought leader. This may be true, and probably is, but highlighting it all the time can lead people to think this is cultish.

Comment author: dbaupp 15 March 2012 02:48:49AM *  7 points [-]

Eliezer: Do not worship a hero! Do not trust!

Rationalwiki et al: LW is a personality cult around Eliezer because of so-and-so.

A widely revered figure who has written a million+ words that form the central pillars of LW and has been directly (or indirectly) responsible for bringing many people into the rationality memespace says "don't do X" so it is obvious that X must be false.

</sarcasm>

Dismissing accusations of a personality cult around Eliezer by saying Eliezer said "no personality cult" is a fairly poor way of going about it. Two key points:

  • saying "as a group, we don't worship Eliezer" doesn't guarantee that it is true (groupthink could easily suck us into ignoring evidence)
  • someone might interpret what Eliezer said as false modesty or an attempt to appear to be a reluctant saviour/messiah (i.e. using dark arts to suck people in)
Comment author: XiXiDu 15 March 2012 11:01:26AM *  5 points [-]

I have become fully convinced that even bringing it up is actively harmful.

No, it is not. A lack of self-criticism and evaluation is one of the reasons for why people assign cult status to communities.

P.S. Posts with titles along the lines of 'Epistle to the New York Less Wrongians' don't help in reducing cultishness ;-)

(Yeah, I know it was just fun.)

Comment author: epicureanideal 16 March 2012 02:54:26AM 2 points [-]

"I have become fully convinced that even bringing it up is actively harmful."

What evidence leads you to this conclusion?

Eliezer: Please, learn what turns good ideas into cults, and avoid it! Barely-aware public: Huh, wah? Cults? Cults! Less Wrong is a cult!

Can you provide evidence to support this characterization?

Eliezer: Do not worship a hero! Do not trust! Rationalwiki et al: LW is a personality cult around Eliezer because of so-and-so.

Can you provide evidence to support this characterization?

I would like to see some empirical analysis of the points made here and by the original poster. We should gather some data about perceptions from real users and use that to inform future discussion on this topic. I think we have a starting point in the responses to this post, and comments in other posts could probably be mined for information. But we should also try to find some rational people who are not familiar with Less Wrong, introduce them to it, and ask for their impressions (via someone acting like they just found the site, are not affiliated with it, and are curious about their friend's impressions, or something like that).

Comment author: fubarobfusco 15 March 2012 07:00:34AM *  9 points [-]

Disclaimer: My partner and I casually refer to LW meetups (which I attend and she does not) as "the cult".

That said, if someone asked me if LW (or SIAI) "was a cult", I think my ideal response might be something like this:

"No, it's not; at least not in the sense I think you mean. What's bad about cults is not that they're weird. It's that they motivate people to do bad things, like lock kids in chain-lockers, shun their friends and families, or kill themselves. The badness of being a cult is not being weird; it's doing harmful things — and, secondarily, in coming up with excuses for why the cult gets to do those harmful things. Less Wrong is weird, but not harmful, so I don't think it is a cult in the sense you mean — at least not at the moment.

"That said, we do recognize that "every cause wants to be a cult", that human group behavior does sometimes tend toward cultish, and that just because a group says 'Rationality' on the label does not mean it contains good thinking. Hoping that we're special and that the normal rules of human behavior don't apply to us, would be a really bad idea. It seems that staying self-critical, understanding how cults happen and why, and consciously taking steps to avoid making in-group excuses for bad behavior or bad thinking, is a pretty good strategy for avoiding becoming a cult."

Comment author: Viliam_Bur 16 March 2012 09:07:04AM 14 points [-]

What's bad about cults is not that they're weird. It's that they motivate people to do bad things...

People use "weird" as a heuristic for danger, and personally I don't blame them, because they have good Bayesian reasons for it. Breaking a social norm X is positively correlated with breaking a social norm Y, and the correlation is strong enough for most people to notice.

The right thing to do is to show enough social skill to avoid triggering the weirdness alarm. (Just like publishing in serious media is the right way to avoid the "pseudoscience" label.) You cannot expect outsiders to make an exception for LW, suspend their heuristics, and explore the website deeply; that would be asking them to privilege a hypothesis.

If something is "weird", we should try to make it less weird. No excuses.

Comment author: ryjm 21 March 2012 01:42:26PM 0 points [-]

So we should be Less Weird now? ;)

Comment author: Viliam_Bur 21 March 2012 03:22:02PM *  2 points [-]

We should be winning.

Less Weird is a good heuristic for winning (though a bad heuristic for a site name).

Comment author: roystgnr 15 March 2012 05:33:40PM 7 points [-]

Often by the time a cult starts doing harmful things, its members have made both real and emotional investments that turn out to be nothing but sunk costs. To avoid ever getting into such a situation, people come up with a lot of ways to attempt to identify cults based on nothing more than the non-harmful, best-foot-forward appearance that cults first try to project. If you see a group using "love bombing", for instance, the wise response is to be wary - not because making people feel love and self-esteem is inherently a bad thing, but because it's so easily and commonly twisted toward ulterior motives.

Comment author: CasioTheSane 15 March 2012 11:43:59AM *  2 points [-]

Less Wrong is weird, but not harmful

That is until people start bombing factories to mitigate highly improbable existential risks.

Comment author: lessdazed 20 March 2012 07:57:36PM 2 points [-]

you'll find that people are searching for "less wrong cult" and "singularity institute cult" with some frequency.

Maybe a substantial number of people are searching for the posts about cultishness.

Comment author: quantropy 16 March 2012 11:34:48AM *  3 points [-]

As I see it, Cult = Clique + Weird Ideas.

I think the weird ideas are an integral part of LessWrong, and any attempt to disguise them with a fluffy introduction would be counterproductive.

What about Cliquishness? I think that the problem here is that any internet forum tends to become a clique. To take part you need to read through lots of posts, so it requires quite a commitment. Then there is always some indication of your status within the group - Karma score in this case.

My advice would be to link to some non-internet things. Why not have the FHI news feed and links to a few relevant books on Amazon in the column on the right?

Comment author: wedrifid 15 March 2012 03:49:22AM 3 points [-]

We can fix this stuff easily, since they're both wiki pages, but I thought they were examples worth discussing.

Somebody please do so. Those examples are just obviously bad.

Comment author: Locke 15 March 2012 02:21:24AM *  3 points [-]

It's a rather unavoidable side-effect of claiming that you know the optimal way to fulfill one's utility function, especially if that claim sounds highly unusual (unite the human species in making a friendly AI that will create Utopia). There are many groups that make such claims, and either one or none of them can be right. Most people (who haven't already bought into a different philosophy of life) think it's the latter, and thus tend not to take someone seriously when they make extraordinary claims.

Until recognition of the Singularity's imminence and need for attention enters mainstream scientific thought, the people most likely to join us (Scientifically-Literate Atheists and Truth-Lovers) will not seriously consider our claims. I haven't read nearly as much about the nonexistence of Zeus as I have about the nonexistence of Yahweh, because the number of intelligent people who believe in Zeus is insignificant compared to the number of educated Christians. So when 99% of the developed world isn't focusing on friendly-AI theory, it was difficult for me to come to the conclusion that Richard Dawkins and Stephen Hawking and Stephen Fry were all ignorant of one of the most important things on the planet. A few months ago I gave no more thought to cryonics than to cryptozoology, and without MoR I doubt anything would have changed.

Comment author: Voltairina 15 March 2012 03:01:16AM *  2 points [-]

Is the goal of the community really to get everyone into the one task of creating FAI? I'm kind of new here, but I'm personally interested in a less direct but maybe more certain (I don't know the hard numbers) and, I feel, synergistic goal: achieving a stable post-scarcity economy, which could free up a lot more people to become hackers/makers of technology and participate in the collective commons. But I'm interested in FAI and particularly machine ethics, and I hang out here because of the rationality and self-improvement angles. In fact I got into my current academic track (embedded systems) because I'm interested in robotics and embodied intelligence, and probably got started reading Hofstadter stuff and trying to puzzle out how minds work.

"Come for the rationality... stay for the friendly AI" maybe?

Comment author: Grognor 15 March 2012 03:24:01AM 4 points [-]

Is the goal

Please don't talk about 'the' goal of the community as if there's only one. There are many.

Comment author: Voltairina 15 March 2012 03:40:08AM 0 points [-]

That's what I was wondering, thank you for providing the link to that post. I wasn't sure how to read Locke's statement.

Comment author: RobertLumley 15 March 2012 12:50:03AM *  5 points [-]

"Twelve Virtues of Rationality" has always seemed really culty to me. I've never read it, which may be part of the reason. It just comes across as telling people exactly how they should be, and what they should value.

Also, I've never liked that quote about the Sequences. I agree with it, what I've read of the sequences (and it would be wrong to not count HPMOR in this) is by far the most important work I've ever read. But that doesn't mean that's what we should advertise to people.

Comment author: Grognor 15 March 2012 02:35:51AM 5 points [-]

All you are saying here is "The title of the Twelve Virtues makes me feel bad." That is literally all you are saying, since you admit to not having read it.

I quote:

It's supposed to be strange. Strange gets attention. Strange sticks in the mind. Strange makes the truth memorable. Other suggestions are possible, I guess, but can the result be equally strange?

I'll tell you one thing. It got my attention. It got me interested in rationality. I've shown it to others; they all liked it or were indifferent. If you're going to say "culty" because of the title, you are both missing the (most important) point and failing to judge based on anything reasonable. And I don't particularly care if LW appeals to people who don't even try to be reasonable.

Comment author: Gabriel 15 March 2012 07:38:18PM 4 points [-]

All you are saying here is "The title of the Twelve Virtues makes me feel bad." That is literally all you are saying, since you admit to not having read it.

That's still a useful data-point. Do we want to scare away people with strong weirdness filters?

Comment author: komponisto 15 March 2012 08:28:50PM *  8 points [-]

Do we want to scare away people with strong weirdness filters?

The answer to this may very well turn out to be yes.

Comment author: NancyLebovitz 15 March 2012 10:53:50PM 2 points [-]

What proportion of top people at SIAI love sf?

It's at least plausible that strong weirdness filters interfere with creativity.

On the other hand, weirdness is hard to define; sf is a rather common sort of weirdness these days.

Comment author: John_Maxwell_IV 17 March 2012 05:58:52AM 1 point [-]

There is no reason to turn them off right away. The blog itself is weird enough. Maybe they will be acclimated, which would be good.

Comment author: John_Maxwell_IV 15 March 2012 01:05:01AM *  3 points [-]

I almost forgot this, but I was pretty put off by the 12 virtues as well when I first came across it on reddit at age 14 or so. My reaction was something like "you're telling me I should be curious? What if I don't want to be curious, especially about random stuff like Barbie dolls or stamp collecting?" I think I might have almost sent Eliezer an e-mail about it.

When you put this together with what Eliezer called the bizarre "can't get crap done" phenomenon that afflicts large fractions of our community, which he attributes to feelings of low status, this paints a picture of LW putting off the sort of person who is inclined to feel high status (and is therefore good at getting crap done, but doesn't like being told what to do). This may be unrelated to the cult issue.

Of course, these hypothetical individuals who are inclined to feel high status might not like being told how to think better either... which could mean that Less Wrong is not their cup of tea under any circumstances. But I think it makes sense to shift away from didacticism on the margin.

Comment author: aaronsw 05 August 2012 11:22:53PM *  5 points [-]

I think the biggest reason Less Wrong seems like a cult is because there's very little self-skepticism; people seem remarkably confident that their idiosyncratic views must be correct (if the rest of the world disagrees, that's just because they're all dumb). There's very little attempt to provide any "outside" evidence that this confidence is correctly-placed (e.g. by subjecting these idiosyncratic views to serious falsification tests).

Instead, when someone points this out, Eliezer fumes "do you know what pluralistic ignorance is, and Asch's conformity experiment? ... your smart friends and favorite SF writers are not remotely close to the rationality standards of Less Wrong".

What's especially amusing is that EY is able to keep this stuff up by systematically ignoring every bit of his own advice: telling people to take the outside view and then taking the inside one, telling people to look into the dark while he studiously avoids it, emphasizing the importance of AI safety while he embarks on an extremely dangerous way of building AI -- you can do this with pretty much every entry in the sequences.

These are the sorts of things that make me think LessWrong is most interesting as a study in psychoceramics.

Comment author: John_Maxwell_IV 06 August 2012 07:23:19AM 3 points [-]

There's very little attempt to provide any "outside" evidence that this confidence is correctly-placed (e.g. by subjecting these idiosyncratic views to serious falsification tests).

Offhand, can you think of a specific test that you think ought to be applied to a specific idiosyncratic view?


My read on your comment is: LWers don't act humble, therefore they are crackpots. I agree that LWers don't always act humble. I think it'd be a good idea for them to be more humble. I disagree that lack of humility implies crackpottery. In my mind, crackpottery is a function of your reasoning, not your mannerisms.

Your comment is a bit short on specific failures of reasoning you see--instead, you're mostly speaking in broad generalizations. It's fine to have general impressions, but I'd love to see a specific failure of reasoning you see that isn't of the form "LWers act too confident". For example, a specific proposition that LWers are too confident in, along with a detailed argument for why. Or a substantive argument for why SI's approach to AI is "extremely dangerous". (I personally know pretty much everyone who works for SI, and I think there's a solid chance that they'll change their approach if your argument is good enough. So it might not be a complete waste of time.)

you can do this with pretty much every entry in the sequences

Now it sounds like you're deliberately trying to be inflammatory ಠ_ಠ

Comment author: aaronsw 06 August 2012 11:50:08AM 4 points [-]

Offhand, can you think of a specific test that you think ought to be applied to a specific idiosyncratic view?

Well, for example, if EY is so confident that he's proven "MWI is obviously true - a proposition far simpler than the argument for supporting SIAI", he should try presenting his argument to some skeptical physicists. Instead, it appears the physicists who have happened to run across his argument found it severely flawed.

How rational is it to think that you've found a proof most physicists are wrong and then never run it by any physicists to see if you're right?

My read on your comment is: LWers don't act humble, therefore they are crackpots.

I do not believe that.

As for why SI's approach is dangerous, I think Holden put it well in the most upvoted post on the site.

I'm not trying to be inflammatory, I just find it striking.

Comment author: Mitchell_Porter 07 August 2012 10:07:38AM *  8 points [-]

it appears the physicists who have happened to run across his argument found it severely flawed

The criticisms at those links have nothing to do with the argument for MWI. They are just about a numerical mistake in an article illustrating how QM works.

The actual argument for MWI that is presented is something like this: Physicists believe that the wavefunction is real and that it collapses on observation, because that is the first model that explained all the data, and science holds onto working models until they are falsified. But we can also explain all the data by saying that the wavefunction is real and doesn't collapse, if we learn to see the wavefunction as containing multiple worlds that are equally real. The wavefunction doesn't collapse; it just naturally spreads out into separate parts, and what we see is one of those separate parts. A no-collapse theory is simpler than a collapse theory because it has one less postulate, so even though there are no new predictions, by Bayes (or is it Occam?) we can favor the no-collapse theory over the collapse theory. Therefore, there are many worlds.

This is informal reasoning about which qualitative picture of the world to favor, so it is not something that can be verified or falsified by a calculation or an experiment. Therefore, it's not something that a hostile physicist could crisply debunk, even if they wanted to. In the culture of physics there are numerous qualitative issues where there is no consensus, and where people take sides on the basis of informal reasoning. Eliezer's argument is on that level; it is an expression in LW idiom, of a reason for believing in MWI that quite a few physicists probably share. It can't be rebutted by an argument along the lines that Eliezer doesn't know his physics, because it is an argument which (in another form) a physicist might actually make! So if someone wants to dispute it, they'll have to do so, just as if they were intervening in any of these informal professional disagreements which exist among physicists, by lines of argument about plausibility, future theoretical prospects, and so on.

ETA One more comment about the argument for MWI as I have presented it. Physicists don't agree that the wavefunction is real. The debate over whether it is real, goes all the way back to Schrodinger (it's a real physical object or field) vs Heisenberg (it's just a calculating device). The original Copenhagen interpretation was in Heisenberg's camp: a wavefunction is like a probability distribution, and "collapse" is just updating on the basis of new experimental facts (the electron is seen at a certain location, so the wavefunction should be "collapsed" to that point, in order to reflect the facts). I think it's von Neumann who introduced wavefunction realism into the Copenhagen interpretation (when he axiomatized QM), and thereby the idea of "observer-induced collapse of the wavefunction" as an objective physical process. Though wavefunction realism was always going to creep up on physicists, since they describe everything with wavefunctions (or state vectors) and habitually refer to these as "the state" of the object, rather than "the state of our knowledge" of the object; also because Copenhagen refused to talk about unobserved realities (e.g. where the electron is, when it's not being seen to be somewhere), an attitude which was regarded as prim positivistic virtue by the founders, but which created an ontological vacuum that was naturally filled by the de-facto wavefunction realism of physics practice.

Comment author: Eliezer_Yudkowsky 15 August 2012 07:57:26PM 8 points [-]

BTW, it's important to note that by some polls an actual majority of theoretical physicists now believe in MWI, and this was true well before I wrote anything. My only contributions are in explaining the state of the issue to nonphysicists (I am a good explainer), formalizing the gross probability-theoretic errors of some critiques of MWI (I am a domain expert at that part), and stripping off a lot of soft understatement that many physicists have to do for fear of offending sillier colleagues (i.e., they know how incredibly stupid the Copenhagen interpretation appears nowadays, but will incur professional costs from saying it out loud with corresponding force, because there are many senior physicists who grew up believing it).

The idea that Eliezer Yudkowsky made up the MWI as his personal crackpot interpretation isn't just a straw version of LW, it's disrespectful to Everett, DeWitt, and the other inventors of MWI. It does seem to be a common straw version of LW for all that, presumably because it's spontaneously reinvented any time somebody hears that MWI is popular on LW and they have no idea that MWI is also believed by a plurality and possibly a majority of theoretical physicists and that the Quantum Physics Sequence is just trying to explain why to nonphysicists / formalize the arguments in probability-theoretic terms to show their nonambiguity.

Comment author: aaronsw 18 August 2012 07:14:06PM 6 points [-]

Has anyone seriously suggested you invented MWI? That possibility never even occurred to me.

Comment author: Eliezer_Yudkowsky 18 August 2012 09:53:47PM 6 points [-]

It's been suggested that I'm the one who invented the idea that it's obviously true rather than just one more random interpretation; or even that I'm fighting a private war for some science-fiction concept, rather than being one infantry soldier in a long and distinguished battle of physicists. Certainly your remark to the effect that "he should try presenting his argument to some skeptical physicists" sounds like this. Any physicist paying serious attention to this issue (most people aren't paying attention to most things most of the time) will have already heard many of the arguments, and not from me. It sounds like we have very different concepts of the state of play.

Comment author: Mitchell_Porter 16 August 2012 03:50:45AM *  6 points [-]

by some polls

The original source for that "58%" poll is Tipler's The Physics of Immortality, where it's cited (chapter V, note 6) as "Raub 1991 (unpublished)". (I know nothing about the pollster, L. David Raub, except that he corresponded with Everett in 1980.) Tipler says that Feynman, Hawking, and Gell-Mann answered "Yes, I think the MWI is true", and he lists Weinberg as another believer. But Gell-Mann's latest paper is a one-history paper, Weinberg's latest paper is about objective collapse, and Feynman somehow never managed to go on record anywhere else about his belief in MWI.

Comment author: paper-machine 16 August 2012 04:08:09AM 1 point [-]

I trust Tipler as far as I can throw his book.

(It's a large book, and I'm not very strong.)

Comment author: Quantumental 17 August 2012 03:41:54PM *  3 points [-]

I just can't ignore this. If you take a minute to actually look at the talk section of that Wikipedia page, you will see those polls being torn to pieces.

David Deutsch himself has stated that less than 10% of the people doing quantum fundamentals believe in MWI and then within that minority there are a lot of diverging views. So this is still not by any means a "majority interpretation".

As Mitchell_Porter has pointed out, Gell-Mann certainly does not believe in MWI. Nor does Steven Weinberg; he renounced his 'faith' in it in a paper last year. Feynman certainly never talked about it, which to me is more than enough indication that he did not endorse it. Hawking is a bit harder; he is on record seemingly both pro and con, so I guess he is a fence-sitter.

But more importantly is the fact that none of the proponents agree on what MWI they support. (This includes you Eliezer)

Zurek is another fence-sitter, partly pro some sort of MWI, partly pro It-from-Bit. His way of getting the Born Rule in MWI is also quite different: from what I understand, only the worlds that are "persistent" are actualized. This reminds me of Robin Hanson's mangled worlds, where only some worlds are real and the rest get "cancelled" out somehow. Yet they are completely different ways of looking at MWI. Then you have David Deutsch's fungible worlds, which are slightly different from David Wallace's worlds. Tegmark has his own views, etc.

There seems to be no single MWI and there has been no answer to the Born Rule.

So I want to know why you keep talking about it as if it were a slam dunk?

Comment author: Peterdjones 09 January 2013 11:38:17PM 0 points [-]

Good question.

Comment author: fezziwig 15 August 2012 08:34:41PM 3 points [-]

I think your use of "believe in" is a little suspect here. I'm willing to believe that more than half of all theoretical physicists believe some variant of the MWI is basically right (though the poll can't have been that recent if Feynman was part of it, alas), but that's different from the claim that there are no non-MWI interpretations worth considering, which is something a lot of people, including me, seem to be taking from the QP sequence. Do you believe that that's a majority view, or anything close to one? My impression is that that view is very uncommon, not just in public but in private too...at least outside of Less Wrong.

Comment author: Eliezer_Yudkowsky 15 August 2012 08:37:24PM 0 points [-]

That sounds correct to me. A physicist who also possesses probability-theory expertise and who can reason with respect to Solomonoff Induction and formal causal models should realize that single-world variants of MWI are uniformly unworkable (short of this world being a runtime-limited computer simulation); but such is rare (though not unheard-of) among professional physicists; and among the others, you can hardly blame them for trying to keep an open mind.

Comment author: shminux 15 August 2012 09:58:20PM 2 points [-]

single-world variants of MWI are uniformly unworkable

Penrose's objective collapse theory, which says that the entanglement scale is limited by gravity, with the result that macroscopic objects remain essentially classical, does not look all that unworkable.

Comment author: Eliezer_Yudkowsky 16 August 2012 02:47:41AM 4 points [-]

It'd still be the only FTL discontinuous non-differentiable non-CPT-symmetric non-unitary non-local-in-the-configuration-space etc. etc. process in all of physics, to explain a phenomenon (why do we see only one outcome?) that doesn't need explaining.

Comment author: shminux 16 August 2012 04:32:36AM 1 point [-]

Well, one advantage of it is that it is testable, and so is not a mere interpretation, which holds a certain amount of appeal to the more old-fashioned of us.

Comment author: Eliezer_Yudkowsky 16 August 2012 07:17:01AM 5 points [-]

I agree, and I myself was, and am still, sentimentally fond of Penrose for this reason, and I would cheer on any agency that funded a test. However and nonetheless, "testable" is not actually the same as "plausible", scientifically virtuous as it may be.

Comment author: V_V 28 August 2012 05:19:25PM 0 points [-]

I'm puzzled. What does Solomonoff Induction have to say about experimentally indistinguishable (as far as we can practically test, at least) interpretations of the same theory?

Comment author: Eliezer_Yudkowsky 06 August 2012 11:38:39PM 5 points [-]

There were plenty of physicists reading those posts when they first came out on OB (the most famous name being Scott Aaronson). Some later readers have indeed asserted that there's a problem involving a physically wrong factor of i in the first couple of posts (i.e. that's allegedly not what a half-silvered mirror does to the phase in real life), which I haven't yet corrected because I would need to verify with a trusted physicist that this was correct, and then possibly craft new illustrations instead of using the ones I found online, and this would take up too much time relative to the point that talking about a phase change of -1 instead of i so as to be faithful to real-world mirrors is an essentially trivial quibble which has no effect on any larger points. If anyone else wants to rejigger the illustration or the explanation so that it flows correctly, and get Scott Aaronson or another known trusted physicist to verify it, I'll be happy to accept the correction.

Aside from that, real physicists haven't objected to any of the math, which I'm actually pretty darned proud of considering that I am not a physicist.
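A minimal numerical sketch (mine, not from the original posts; the matrix conventions are the standard ones from quantum optics) of why the i-versus-(-1) reflection phase is a trivial quibble: both beam-splitter conventions are unitary, and both produce complete interference when two splitters are chained into a Mach-Zehnder arrangement. They differ only in which output port receives all the probability, not in whether the interference works.

```python
import numpy as np

# Two common conventions for a 50/50 (half-silvered) mirror.
# "i" convention: reflection picks up a phase of i (the symmetric choice).
B_i = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)
# "-1" convention: one reflection picks up a phase of -1 (Hadamard-like).
B_r = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def mach_zehnder(B):
    """Send one photon through two beam splitters; return detector probabilities."""
    psi = B @ B @ np.array([1, 0], dtype=complex)
    return np.abs(psi) ** 2

for B in (B_i, B_r):
    # Both conventions are unitary...
    assert np.allclose(B.conj().T @ B, np.eye(2))
    # ...and both give complete interference: all probability at one detector.
    print(np.round(mach_zehnder(B), 6))
```

Under the "i" convention the photon always exits at the second port; under the "-1" convention, at the first. Either way the amplitudes interfere exactly as the larger argument requires, which is the sense in which the choice of phase doesn't affect any larger points.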

Comment author: CarlShulman 07 August 2012 02:17:34AM 15 points [-]

There were plenty of physicists reading those posts when they first came out on OB (the most famous name being Scott Aaronson)

As Scott keeps saying, he's not a physicist! He's a theoretical computer scientist with a focus on quantum computing. He clearly has very relevant expertise, but you should get his field right.

Comment deleted 09 August 2012 08:29:02AM *  [-]
Comment author: Peterdjones 09 January 2013 11:53:46PM 1 point [-]

It's like Greeks trying to do physics by pure reasoning. They got atoms right because of salt crystallizing,

Obviously, observing salt is not pure reasoning. Very little philosophy is pure reasoning; the salient distinction is between informal, everyday observation and deliberately arranged experiments.

Comment author: Quantumental 08 August 2012 12:00:15PM 3 points [-]

I still wonder why you haven't written an update in 4 years regarding this topic, especially in regard to the Born Rule probability not having a solution yet, plus the other problems.

You also have the issue of overlap vs non-overlapping of worlds, which again is a relevant issue in the Many Worlds interpretation. Overlap = the typical 1 world branching into 2 worlds. Non-overlap = 2 identical worlds diverging (Saunders 2010, Wilson 2005-present)

Also, I feel the QM sequence is a bit incomplete when you give no thought to things like Gerard 't Hooft's proposal of a local deterministic reality giving rise to quantum mechanics from a cellular automaton at the Planck scale. It's misleading to say MWI is "a slam dunk" winner when there are so many unanswered questions. Mitchell Porter is one of the few people here who seemed to have a deep understanding of the subject before reading your sequence, so he has raised some interesting points...

Comment author: John_Maxwell_IV 06 August 2012 10:02:25PM *  0 points [-]

I agree that EY is probably overconfident in MWI, although I'm uninformed about QM so I can't say much with confidence. I don't think it's accurate to damn all of Less Wrong because of this. For example, this post questioning the sequence was voted up highly.

I don't think EY claims to have any original insights pointing to MWI. I think he's just claiming that the state of the evidence in physics is such that MWI is obviously correct, and this is evidence as to the irrationality of physicists. I'm not too sure about this myself.

As for why SI's approach is dangerous, I think Holden put it well in the most upvoted post on the site.

Well there have been responses to that point (here's one). I wish you'd be a bit more self-skeptical and actually engage with that (ongoing) debate instead of summarizing your view on it and dismissing LW because it largely disagrees with your view.

Comment author: aaronsw 06 August 2012 10:43:02PM 5 points [-]

It seems a bit bizarre to say I've dismissed LessWrong given how much time I've spent here lately.

Comment author: John_Maxwell_IV 07 August 2012 12:49:21AM 0 points [-]

Fair enough.

Comment author: mwengler 15 March 2012 10:22:01PM 1 point [-]

The c-word is too strong for what LW actually is. But "rational" is not a complete descriptor either.

It is neither rational nor irrational to embrace cryonics. It may be rational to conclude that someone who wants to live forever and believes body death is the end of his life will embrace cryonics and life extension technologies.

It is neither rational nor irrational to vaunt current human values over any other. It is most likely that current human values are a snapshot in the evolution of humans, and as such are an approximate optimum in a natural selection sense for an environment that existed 10,000 years ago. The idea that "we" lose if we change our values seems more rooted in who "we" decide "we" are. Presumably in the past a human was more likely to have a narrower definition of "we" that included only a few hundred or a few thousand culture-mates. As time has gone on, "we" has grown for most people to cover nationalities, individual races, and pan-national, pan-racial groupings. Most Americans don't identify being American with a particular race or national background, and many of us don't even require being born within the US or of US parents to be part of "we." Why wouldn't we extend our concept of "we" to include mammals, or all life that evolved on earth, or even all intelligences that evolved or were created on earth? Why would we necessarily identify a non-earth intelligence as "they" and not "we," as in "we intelligences can stick together and do a better job exploiting the inanimate universe"?

Rationality is a tool, not an answer. Having certain value decisions vaunted over others restricts LessWrong to being a community that uses rationality rather than a community of rationalists or a community serving all who use rationality. It is what Buffett calls "an unforced error."

Let the downvotes begin! To be clear, I don't WANT to be downvoted, but my history on this site suggests to me that I might be.

Comment author: Eneasz 16 March 2012 10:40:28PM 1 point [-]

This comment will be heavy with jargon, to convey complex ideas with the minimum required words. That is what jargon is for, after all. The post's long enough even with this shortening.

Less Wrong inspires a feeling of wonder.

To see humans working seriously to advance the robot rebellion is inspiring. To become better, overcome the programming laid in by Azathoth and actually improve our future.

The audacity to challenge death itself, to reach for the stars, is breathtaking. The piercing insight in many of the works here is startling. And the gift of being able to find joy in the merely real again is priceless. It doesn't hurt that it's spearheaded by an extremely intelligent and honest person whose powers of written communication are among the greatest of his generation.

And that sense of awe and wonder makes people flinch. Especially people who have been trained to be wary of that sort of shit. Who've seen the glassed-over eyes of their fundamentalist families and the dazed ramblings of hippies named Storm. As much as HJPEV has tried to train himself to never flinch away from the truth, to never let his brain lie to him, MANY of us have been trained just as strongly to always flinch away from awe and wonder produced by charismatic people. In fact, if we had a "don't let your brain lie to you" instinct as strong as our "don't let awe and wonder seduce you into idiocy" instinct we'd be half way to being good rationalists already.

And honestly, that instinct is a good one. It saves us from insanity 98% of the time. But it'll occasionally result in a woo/cult-warning where one could genuinely and legitimately feel wonder and awe. I don't blame people for trusting their instincts and avoiding the site. And it'll mean we forever get people saying "I dunno what it is, but that Less Wrong site feels kinda cultish to me."

We're open, we're transparent, we are a positive force in the lives of our members. We've got nothing to fear, and that's why occasional accusations of cultishness will never stick. We've just got to learn to live with the vibe and trust that those who stick around long enough to look deeper will see that we're not.

It's nice to still have that awe and wonder somewhere. I wouldn't ever want to give that up just so a larger percentage of the skeptic community accepts us. That feeling is integral to this site, giving it up would kill LW for me.

Comment author: Jakeness 06 October 2013 08:30:24PM 0 points [-]

I think this post can be modified, without much effort, to defend any pseudo-cult, or even a cheesy movie.

Comment author: Onelier 16 March 2012 07:27:56AM 1 point [-]

What paper or text should I read to convince me y'all want to get to know reality? That's a sincere question, but I don't know how to say more without being rude (which I know you don't mind).

Put another way: What do you think Harry Potter (of HPMOR) would think of the publications from the Singularity Institute? (I mean, I have my answer).

Comment author: ShortName 26 March 2012 05:32:03PM 1 point [-]

Long-time lurker here. I think LW is not capable enough as a social unit to handle its topic, and I currently think that participating in LW is not an efficient way to drive its goals.

In order to reach a (hostile) audience one needs to speak the language. However, ambient ways of carrying out discussion often intermingle status / identity / politics with epistemology. In order to forward the position that biased / faith-based / economic thinking are not epistemologically efficient tools, one needs to take at least the initial steps in this twisted-up "insane troll logic". The end product is to reject the premise the whole argument stands on, but it will never be produced if the thinking doesn't get started. In making this kind of transition between modes of thinking public and praising it, a lot of the machinery temporarily required to play the drama out gets reinforced into a kind of bedrock. It complicates matters that people are simultaneously in need of completing a particular step while others need to dispel it. Thus there is a tendency to fixate on the "development step" relevant to the majority and to become hostile to everything else.

I don't see the need to profess stances on things if the relevant anticipations work correctly. Coding the keys of insights in a single namespace and honing them to work against a static context makes applying and discussing them in other contexts needlessly complex. If someone knows a bias / heuristic / concept by some other name, and that makes a LW participant fail to recognize or apply things that they have learnt the password for, LW has managed to isolate insights from their usual and most needed application area.

Things that "hardcore" pursuits find valuable are passed on "as is" or "as finalized by AwesomeDude42". This is faith-based: "cause they say so". Hooked by the "quality of the merchandise", this communal activity is more a distribution system for those closed packages of tools than an epistemic engine in its own right. I think that even school should be a place of learning rather than a place to receive data about what others have learned.

Because there is a caliber difference, not all members can follow or participate in the production of the "good stuff"; they wait for it to be distributed right out of the oven. Having a passive "level up" handbook in the form of the Sequences still leaves a big "you must be this tall to participate in this facet of this community". There is no escaping the cognitive work of the individual, but LW functions more as a prize than as a workbench.

The activity of LW is limited in a content-independent way by social structure, in areas where it wishes to be more. This is not the optimal venue for thinking, but that shouldn't come as a big surprise.

Comment author: Craig_Heldreth 20 March 2012 11:37:26PM *  1 point [-]

Do you know anyone who might fall into this category, i.e. someone who was exposed to Less Wrong but failed to become an enthusiast, potentially due to atmosphere issues?

Yes. I know a couple of people with whom I share interest in Artificial Intelligence (this is my primary focus in loading Less Wrong web pages) who communicated to me that they did not like the site's atmosphere. Atmosphere is not exactly the word they used. One person thought the cryonics was a deal breaker. (If you read the piece in the New York Times Sunday Magazine about Robin Hanson and his wife you will get a good idea of the global consensus distaste for the topic.) The other person was not so specific although it was clear they were turned off completely even if they couldn't or wouldn't explain how.

Is it possible that our culture might be different if these folks were hanging around and contributing? Presumably they are disproportionately represented among certain personality types.

It is obvious that the culture here would be different if the more controversial or unpopular topics were downplayed enough not to discourage people who don't find the atmosphere convivial.

If so, can you suggest any easy steps we could take?

Here is what I have personally heard or read in comments that people find most bothersome: cryonics, polyamory, pick up artistry, density of jargon, demographic homogeneity (highly educated white males). Any steps to water that down beyond those already taken (pick up artistry is regularly criticized and Bell Curve racial IQ discussion has been all but tabooed) would not be easy to implement quickly and would have consequences beyond making for a more inclusive atmosphere.

I am not in agreement with the suitability of the word cult to characterize this issue accurately. I did the google test you describe and was surprised to see cult pop up so fast, but when I think cult I think Hare Krishnas, I think Charles Manson, I think David Koresh; I don't think Singularity Institute, and I don't think about a number of the organizations on Rick Ross' pages. Rick Ross is a man whose business makes money by promoting fear of cults. The last time I looked he had Landmark Education listed as a cult; this might be true with an extremely loose definition of the word but they haven't killed anybody yet to the best of my knowledge. I have taken a couple of courses from them and the multi-level marketing vibe is irksome but they have some excellent (and rational!) content in their courses. The last time I looked Ross did not have the Ordo Templi Orientis listed as a cult. When I was a member of that organization there were around a couple of thousand dues paying members in the United States, so I presume the OTO cult (this word is far more appropriately applied to them than Landmark) is too small for him to spend resources on.

The poster who replied that he and his wife refer to his Less Wrong activity as his cult membership is understandable to me in a light and humorous manner; I would be surprised if they really classify Less Wrong with Scientology and Charles Manson.

Comment author: whpearson 15 March 2012 11:15:17AM 1 point [-]

I got a possible proto-small-religion feeling from SL4 discussions with Eliezer and SI folk back in the day. Any cultishness feeling was with a small c, that is, not harmful to the participants except for their bank balance, as in the use of the word cult in "cult following". There isn't a good word for this type of organization, which is why it gets lumped in with Cults.

Less Wrong is better than SL4 in this regard, anyway.

Comment author: confusednewbie 23 August 2014 04:49:18PM 0 points [-]

Well, it's nice to know at least you guys see it. Yes, that was one of my reactions. I started reading some of the sequences (which really aren't put at a level that the mass public, or, I'd hazard to say, though not with certainty, even people whose IQs don't fall one standard deviation above the mean or higher can easily understand). I liked them, though I didn't fully understand them, and have referred people to them. However, at the time I was looking into a job and did some kind of search through the website. Anyways, I encountered a post with a person who was asking for advice on a job...I can't find it now, but from what I remember (this has been a long time, the memory is greatly degraded, but I think what little I remember may be actually more insightful in this case than a faithful representation of the actual post) the poster talked about divorce and doing a job they hated and the like to be able to donate more to charity, and how that was an acceptable though not valued trade-off. And, though I actually agree to a point, that did raise HUGE red flags in my mind for cult...particularly when combined with the many messages that seem to be embedded here about donating to the LW non-profits. I fled after reading that and stayed away for a long time. I dunno if it helps or not, but figured I'd share.

Also, I just Googled "Less Wrong" and all I did was added a space and Google auto-suggested cult. So things seem to have worsened since this was published.

Comment author: Grognor 16 March 2012 09:40:26AM 0 points [-]