AAAAARRRGH! I am sick to death of this damned topic. It has been done to death.
I have become fully convinced that even bringing it up is actively harmful. It reminds me of a discussion on IRC about how painstakingly and meticulously Eliezer idiot-proofed the Sequences, and how it didn't work because people still manage to be idiots about it. It's because of the Death Spirals and the Cult Attractor sequence that people bring up the stupid "LW is a cult hur hur" meme, which would be great dramatic irony if you were reading a fictional version of the history of Less Wrong, since it's exactly what Eliezer was trying to combat by writing it. Does anyone else see this? Is anyone else bothered by:
Eliezer: Please, learn what turns good ideas into cults, and avoid it!
Barely-aware public: Huh, wah? Cults? Cults! Less Wrong is a cult!
&
Eliezer: Do not worship a hero! Do not trust!
Rationalwiki et al: LW is a personality cult around Eliezer because of so-and-so.
Really, am I the only one seeing the problem with this?
People thinking about this topic just seem to instantaneously fail basic sanity checks. I find it hard to believe that people even know what they're saying when they p...
LW doesn't do as much as I'd like to discourage people from falling into happy death spirals about LW-style rationality, like this. There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative. If I were Eliezer, I would run screaming in the other direction the moment I saw the first such person, but he seems to be okay with that. That's the main reason why I feel LW is becoming more cultish.
How do you distinguish a happy death spiral from a happy life spiral? Wasting one's life on a wild goose chase from spending one's life on a noble cause?
"I take my beliefs seriously, you are falling into a happy death spiral, they are a cult."
I guess you meant to ask, "how do you distinguish ideas that lead to death spirals from ideas that lead to good things?" My answer is that you can't tell by looking only at the idea. Almost any idea can become a subject for a death spiral if you approach it the wrong way (the way Will_Newsome wants you to), or a nice research topic if you approach it right.
(the way Will_Newsome wants you to),
I've recanted; maybe I should say so somewhere. I think my post on the subject was sheer typical mind fallacy. People like Roko and XiXiDu are clearly damaged by the "take things seriously" meme, and what it means in my head is not what it means in the heads of various people who endorse the meme.
There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative. If I were Eliezer, I would run screaming in the other direction the moment I saw the first such person
You mean when he saw himself in the mirror? :)
Seriously, do you think sacrificing one's life to help build FAI is wrong (or not necessarily wrong but not an ethical imperative either), or is it just bad PR for LW/SI to be visibly associated with such people?
I think it's not an ethical imperative unless you're unusually altruistic.
Also, I feel the whole FAI thing is a little questionable from a client-relations point of view. Rationality education should be about helping people achieve their own goals. When we meet someone who is confused about their goals, or just young and impressionable, the right thing for us to do is not to seize the opportunity to rewrite their goals while we're educating them.
I think rationality education should help people achieve the goals they're already trying to achieve, not the goals that the teacher wants them to achieve.
False dichotomy. Humans are not automatically strategic: we often act on urges, not goals, and even our explicitly conceptualized goals can be divorced from reality, perhaps more so than the urges. There are general-purpose skills that have an impact on behavior (and on explicit goals) by correcting errors in reasoning, without being specifically aimed at aligning students' explicit goals with those of their teachers.
Rationality is hard to measure. If LW doesn't make many people more successful in mundane pursuits but makes many people subscribe to the goal of FAI, that's reason to suspect that LW is not really teaching rationality, but rather something else.
(My opinions on this issue seem to become more radical as I write them down. I wonder where I will end up!)
I didn't say anything about "rationality". Whether the lessons help is a separate question from whether they're aimed at correcting errors of reasoning or at shifting one's goals in a specific direction. The posts I linked also respond to the objection about people "giving lip service to altruism" but doing little in practice.
I don't think this calculation works out, actually. If you're purely selfish (don't care about others at all), and the question is whether to devote your whole life to developing FAI, then it's not enough to believe that the risk is high (say, 10%). You also need to believe that you can make a large impact. Most people probably wouldn't agree to surrender all their welfare just to reduce the risk to themselves from 10% to 9.99%, and realistically their sacrifice won't have much more impact than that, because it's hard to influence the whole world.
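To make that trade-off concrete, here is a back-of-the-envelope version of the calculation, using the 10% and 9.99% figures above; the welfare numbers are purely hypothetical:

```python
# Toy expected-welfare comparison for a purely selfish agent.
# Assumption: welfare is 0 if the catastrophe happens, otherwise it is
# whatever life you chose to live. Only the two risk figures come from
# the comment above; everything else is made up for illustration.
P_RISK_NORMAL = 0.10       # risk to you if you live a normal life
P_RISK_SACRIFICE = 0.0999  # risk if you devote your whole life to FAI work

W_NORMAL = 1.0             # welfare of a normal life (normalized)
W_SACRIFICE = 0.3          # welfare after surrendering most personal goals (hypothetical)

ev_normal = (1 - P_RISK_NORMAL) * W_NORMAL           # 0.9
ev_sacrifice = (1 - P_RISK_SACRIFICE) * W_SACRIFICE  # ~0.27: the sacrifice loses badly

# Break-even: the sacrifice pays off selfishly only if it costs you less than
# this fraction of your welfare.
max_tolerable_loss = 1 - (1 - P_RISK_NORMAL) / (1 - P_RISK_SACRIFICE)
print(ev_normal, ev_sacrifice, f"{max_tolerable_loss:.4%}")  # ~0.0111% of your welfare
```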
If developing AGI were an unequivocally good thing, as Eliezer used to think, then I guess he'd be happily developing AGI instead of trying to raise the rationality waterline. I don't know what Luke would do if there were no existential risks, but I don't think his current administrative work is very exciting for him. Here's a list of people who want to save the world and are already changing their lives accordingly. Also there have been many LW posts by people who want to choose careers that maximize the probability of saving the world. Judge the proportion of empty talk however you want, but I think there are quite a few fanatics.
I don't think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks. They are just the kind of people who enjoy doing what they do.
I think at one point Eliezer said that, if not for AGI/FAI/singularity stuff, he would probably be a sci-fi writer. Luke explicitly said that when he found out about x-risks he realized that he had to change his life completely.
I have always been extremely curious about this. Do people really sacrifice their lives or is it largely just empty talk?
I sacrificed some very important relationships and the life that could have gone along with them so I could move to California, and the only reason I really care about humans in the first place is because of those relationships, so...
This is the use of metaness: for liberation—not less of love but expanding of love beyond local optima.
— Nick Tarleton's twist on T.S. Eliot
My post was mostly about how to optimize appearances, with some side speculation on how our current appearances might be filtering potential users. I agree LW rocks in general. I think we're mostly talking past each other; I don't see this discussion post as fitting into the genre of "serious LW criticism" as the other stuff you link to.
In other words, I'm talking about first impressions, not in-depth discussions.
I'd be curious where you got the idea that writing the cult sequence was what touched off the "LW cult" meme. That sounds pretty implausible to me. Keep in mind that no one who is fully familiar with LW is making this accusation (that I know of), but it does look like it might be a reaction that sometimes occurs in newcomers.
Let's keep in mind that LW being bad is a logically distinct proposition, and if it is bad, we want to know it (since we want to know what is true, right?).
And if we can make optimizations to LW culture to broaden participation from intelligent people, that's also something we want to do, right? Although, on reflection, I'm not sure I see an opportunity for improvement where this is concerned, except maybe on the wiki (but I do think ...
My post was mostly about how to optimize appearances, with some side speculation on how our current appearances might be filtering potential users.
Okay.
If we want to win, it might not be enough to have a book length document explaining why we're not a cult. We might have to play the first impressions game as well.
I said stop talking about it and implied that maybe it shouldn't have been talked about so openly in the first place, and here you are talking about it.
I'd be curious where you got the idea that writing the cult sequence was what touched off the "LW cult" meme.
Where else could it have come from? Eliezer's extensive discussion of cultish behavior gets automatically pattern-matched into helpless cries of "LW is not a cult!" (even though that isn't what he's saying and isn't what he's trying to say), and this gets interpreted as, "LW is a cult." Seriously, any time you put two words together like that, people assume they're actually related.
Elsewise, the only thing I can think of is our similar demographics and a horribly mistaken impression that we all agree on everything (I don't know where this comes from).
Criticism rocks, dude.
Okay. (I hope you didn't interpret anything I said as meaning otherwise.)
It's at least plausible that a lot of the people who can be good for SIAI would be put off more by professional marketing than by science fiction-flavored weirdness.
AAAAARRRGH! I am sick to death of this damned topic.
It looks a bit better if you consider the generalization in the intro to be mere padding around a post that is really about several specific changes that need to be made to the landing pages.
A rambling, cursing tirade against a polite discussion of things that might be wrong with the group (or perceptions of the group) doesn't improve my perception of the group. I have to say, I have a significant negative impression from Grognor's response here. In addition to the tone of his response, a few things that added to this negative impression were:
"how painstakingly and meticulously Eliezer idiot-proofed the sequences, and it didn't work because people still manage to be idiots about it"
Again, the name dropping of Our Glorious Leader Eliezer, long may He reign. (I'm joking here for emphasis.)
"LW is a cult hur hur"
People might not be thinking completely rationally, but this kind of characterization of people who have negative opinions of the group doesn't win you any friends.
"since it's exactly what Eliezer was trying to combat by writing it."
There's Eliezer again, highlighting his importance as the group's primary thought leader. This may be true, and probably is, but highlighting it all the time can lead people to think this is cultish.
The top autocompletes for "Less Wrong" are
These are my (logged-in) Google results for searching "Less Wrong_X" for each letter of the alphabet (some duplicates appear):
Google's autocomplete has a problem, which has produced controversy in other contexts: when people want to know whether X is trustworthy, the most informative search they can make is "X scam". Generally speaking, they'll find no results and that will be reassuring. Unfortunately, Google remembers those searches, and presents them later as suggestions - implying that there might be results behind the query. Once the "X scam" link starts showing up in the autocomplete, people who weren't really suspicious of X click on it, so it stays there.
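A toy sketch of that feedback loop, in case the dynamic isn't obvious (this is purely illustrative: the query names, the 10% curiosity-click rate, and the "top remembered queries become suggestions" rule are assumptions, not Google's actual system):

```python
import collections
import random

# Toy model of the suggestion feedback loop described above: searches are
# remembered, the most-remembered queries are shown as suggestions, and
# suggestions attract clicks that count as new searches.
random.seed(0)

search_counts = collections.Counter({
    "less wrong sequences": 50,
    "less wrong hpmor": 30,
    "less wrong cult": 5,    # a handful of genuinely suspicious searchers seed it
    "less wrong meetup": 4,  # never quite makes the suggestion list
})

def suggestions(counts, n=3):
    """The top-n remembered queries are what autocomplete shows."""
    return [query for query, _ in counts.most_common(n)]

for day in range(100):
    for query in suggestions(search_counts):
        # Some users who weren't suspicious click the suggestion out of
        # curiosity (10% of 20 daily viewers here, an arbitrary assumption).
        clicks = sum(random.random() < 0.10 for _ in range(20))
        search_counts[query] += clicks

print(search_counts.most_common(4))
# "less wrong cult" ends up with hundreds of searches while "less wrong meetup"
# stays at 4: once a query makes the suggestion list, the clicks it attracts
# keep it there.
```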
Eliezer addressed this in part with his "Death Spiral" essay, but there are some features of LW/SI that are strongly correlated with cultishness, other than the ones Eliezer mentioned (such as fanaticism and following the leader):
Sorry if this seems over-the-top. I support SI. These points have been mentioned, but has anyone suggested how to deal with them? Simply ignoring the problem does not seem to be the solution; nor does loudly denying the charges; nor changing one's approach just for appearances.
Perhaps consider adding to the list the high fraction of revenue that ultimately goes to paying staff wages.
Oh yes, and the fact that the leader wants to SAVE THE WORLD.
But they're not buying malaria nets, they're doing thought-work. Do you expect to see an invoice for TDT?
Quite apart from the standard complaint about how awful a metric that is.
Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?
I only discovered LW about a week ago, and I got the "cult" impression strongly at first, but decided to stick around anyway because I am having fun talking to you guys, and am learning a lot. The cult impression faded once I carefully read articles and threads on here and realized that they really are rational, well-argued concepts rather than blindly followed dogma. However, it takes time and effort to realize this, and I suspect that the initial appearance of a cult would turn many people off from putting in that time and effort.
For a newcomer expecting discussions about practical ways to overcome bias and think rationally, the focus on things like transhumanism and singularity development seems very weird: those appear to be pseudo-religious ideas with no obvious connection to rationality or daily life.
AI and transhumanism are very interesting, but are distinct concepts from rationality. I suggest moving singularity and AI specific articles to a different site, and removing the singularity institute and FHI links from the navigation bar.
There's also the pro...
Random nitpick: a substantial portion of LW disagrees with Eliezer on various issues. If you find yourself actually agreeing with everything he has ever said, then something is probably wrong.
Slightly less healthy for overall debate is that many people automatically attribute a toxic/weird meme to Eliezer whenever it is encountered on LW, even in instances where he has explicitly argued against it (such as utility maximization in the face of very small probabilities).
Upvoted for sounding a lot like the kinds of complaints I've heard people say about LW and SIAI.
There is a large barrier to entry here, and if we want to win more, we can't just blame people for not understanding the message. I've been discussing with a friend what is wrong with LW pedagogy (though he admits that it is certainly getting better). To paraphrase his three main arguments:
We often use nomenclature without necessary explanation for a general audience. Sure, we make generous use of hyperlinks, but without some effort to bridge the gap in the body of our text, we aren't exactly signalling openness or friendliness.
We have a tendency to preach to the converted. Or as the friend said:
It's that classic mistake of talking in a way where you're convincing or explaining something to yourself or the well-initiated instead of laying out the roadwork for foreigners.
He brought up an example for how material might be introduced to newly exposed folk.
If This American Life explained the financial crisis in an hour so that four million people improved on a written test on the subject, it's clear you can explain complicated material from near-scratch.
The curse of knowledg...
If This American Life explained the financial crisis in an hour so that four million people improved on a written test on the subject, it's clear you can explain complicated material from near-scratch.
That's an inspiring goal, but it might be worth pointing out that the This American Life episode was extraordinary-- when I heard it, it seemed immediately obvious that this was the most impressively clear and efficient hour I'd heard in the course of a lot of years of listening to NPR.
I'm not saying it's so magical that it can't be equaled, I'm saying that it might be worth studying.
Here's what an outsider might see:
"doomsday beliefs" (something "bad" may happen eschatologically, and we must work to prevent this): check
a gospel (The Sequences): check
vigorous assertions of untestable claims (Everett interpretation): check
a charismatic leader extracting a living from his followers: check
is sometimes called a cult: check
This is enough to make up a lot of minds, regardless of any additional distinctions you may want to make, sadly.
I've been here for only a couple of months, and I didn't have any impression of cultishness. I saw only a circle of friends doing a thing together and being very enthusiastic about it.
What I also did see (and still do) is specific people just sometimes being slightly crazy, in a nice way. As in: Eliezer's treatment of MWI. Or the way-too-serious fear of weird acausal dangers that fall out of the current best decision theories.
Note: this impression is not because of craziness of the ideas, but because of taking them too seriously too early. However, the relevant posts always have sane critical comments, heavily upvoted.
I'm slightly more alarmed by posts like How would you stop Moore's Law?. I mean, seriously thinking of AI dangers is good. Seriously considering nuking Intel's fabs in order to stop the dangers is... not good.
Speaking for myself, I know of at least four people who know of Less Wrong/SI but are not enthusiasts, possibly due to atmosphere issues.
An acquaintance of mine attends Less Wrong meetups and describes most of his friends as being Less Wrongers, but doesn't read Less Wrong and privately holds reservations about the entire singularity thing, saying that we can't hope to say much about the future more than 10 years in advance. He told me that one of his coworkers is also skeptical of the singularity.
A math student/coder I met at an entrepreneurship event told me Less Wrong had good ideas but was "too pretentious".
I was interviewing for an internship once, and the interviewer and I realized we had a mutual acquaintance who was a Less Wronger and SI donor. He asked me if I was part of that entire group, and I said yes. His attitude was a bit derisive.
Defending oneself from the cult accusation just makes it worse. Did you write a long explanation of why you are not a cult? Well, that's exactly what a cult would do, isn't it?
To be accused is to be convicted, because the allegation is unfalsifiable.
Trying to explain something draws more attention to the topic, from which people will notice only the keywords. The more complex an explanation you make, especially if it requires reading some of your articles, the worse it gets.
The best way to win is to avoid the topic.
Unfortunately, someone else can bring up this topic and be persistent enough to make it visible. (Did it really happen on a sufficient scale, or are we just imagining it?) Then the best way is to make some short answer (not necessarily rational, but cached-thought convincing) and then avoid the topic. For example: "So, what exactly is that evil thing people on LW did? Downvote someone's forum post? Seriously, guys, you need to get some life."
And now, everybody stop worrying and get some life. ;-)
It could also help to make the site seem a bit less serious. For example, put more emphasis on instrumental rationality on the front page. People discu...
People discussing best diet habits don't seem like a doomsday cult, right?
I'm having trouble thinking up examples of cults, real or fictional, that don't take an interest in what their members eat and drink.
Some things that might be problematic:
We use the latest insights from cognitive science, social psychology, probability theory, and decision theory to improve our understanding of how the world works and what we can do to achieve our goals.
I don't think we actually do that. Insights, sure, but latest insights? Also, it's mostly cognitive science and social psychology. The insights from probability and decision theory are more in the style of the simple math of everything.
Want to know if your doctor's diagnosis is correct? It helps to understand Bayes' Theorem.
This might sound weird to someone who hasn't already read the classic example about doctors not being able to calculate conditional probabilities. Like we believe Bayes' theorem magically grants us medical knowledge or something.
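For anyone who hasn't seen the classic example: with the standard illustrative numbers (1% prevalence, 80% sensitivity, 9.6% false-positive rate), the calculation looks like this. The point is the surprisingly low posterior, not that Bayes supplies any medical knowledge:

```python
# Bayes' theorem on the classic screening example; the numbers are the
# standard illustrative ones, not real data about any particular test.
p_disease = 0.01              # prior: 1% of patients have the condition
p_pos_given_disease = 0.80    # sensitivity: the test catches 80% of real cases
p_pos_given_healthy = 0.096   # false-positive rate: 9.6% of healthy patients test positive

p_positive = (p_disease * p_pos_given_disease
              + (1 - p_disease) * p_pos_given_healthy)
p_disease_given_positive = p_disease * p_pos_given_disease / p_positive

print(f"{p_disease_given_positive:.1%}")  # ~7.8%, not the ~80% most people guess
```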
[the link to rationality boot-camp]
I'm not a native speaker of English so I can't really tell, but I recall people complaining that the name 'boot-camp' is super creepy.
On the about page:
Introduce yourself to the community here.
That's not cultish-sounding, but it's unnecessarily imperative. The introduction thread is optional.
Disclaimer: My partner and I casually refer to LW meetups (which I attend and she does not) as "the cult".
That said, if someone asked me if LW (or SIAI) "was a cult", I think my ideal response might be something like this:
"No, it's not; at least not in the sense I think you mean. What's bad about cults is not that they're weird. It's that they motivate people to do bad things, like lock kids in chain-lockers, shun their friends and families, or kill themselves). The badness of being a cult is not being weird; it's doing harmful things — and, secondarily, in coming up with excuses for why the cult gets to do those harmful things. Less Wrong is weird, but not harmful, so I don't think it is a cult in the sense you mean — at least not at the moment.
"That said, we do recognize that "every cause wants to be a cult", that human group behavior does sometimes tend toward cultish, and that just because a group says 'Rationality' on the label does not mean it contains good thinking. Hoping that we're special and that the normal rules of human behavior don't apply to us, would be a really bad idea. It seems that staying self-critical, understanding how ...
What's bad about cults is not that they're weird. It's that they motivate people to do bad things...
People use "weird" as a heuristic for danger, and personally I don't blame them, because they have good Bayesian reasons for it. Breaking a social norm X is positively correlated with breaking a social norm Y, and the correlation is strong enough for most people to notice.
The right thing to do is to show enough social skill to avoid triggering the weirdness alarm. (Just as publishing in serious media is the right way to avoid the "pseudoscience" label.) You cannot expect outsiders to make an exception for LW, suspend their heuristics, and explore the website deeply; that would be asking them to privilege a hypothesis.
If something is "weird", we should try to make it less weird. No excuses.
Often by the time a cult starts doing harmful things, its members have made both real and emotional investments that turn out to be nothing but sunk costs. To avoid ever getting into such a situation, people come up with a lot of ways to attempt to identify cults based on nothing more than the non-harmful, best-foot-forward appearance that cults first try to project. If you see a group using "love bombing", for instance, the wise response is to be wary - not because making people feel love and self-esteem is inherently a bad thing, but because it's so easily and commonly twisted toward ulterior motives.
Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?
What do you mean, "initially"? I am still getting that impression! For example, just count the number of times Eliezer (who appears to have only a single name, like Prince or Jesus) is mentioned in the other comments on this post. And he's usually mentioned in the context of "As Eliezer says...", as though the mere fact that it is Eliezer who says these things was enough.
The obvious counter-argument to the above is, "I like the things Eliezer says because they make sense, not because I worship him personally", but... well... that's what one would expect a cultist to say, no?
Less Wrongers also seem to have their own vocabulary ("taboo that term or risk becoming mind-killed, which would be un-Bayesian"). We spend a lot of time worrying about doomsday events that most people would consider science-fictional (at best). We also cultivate a vaguely menacing air of superiority, as we talk about uplifting the ignorant masses by spreading our doctrine of rationality. As far as warning signs go, we've got it covered...
Specialized terminology is really irritating to me personally, and off-putting to most new visitors I would think. If you talk to any Objectivists or other cliques with their own internal vocabulary, it can be very bothersome. It also creates a sense that the group is insulated from the rest of the world, which adds to the perception of cultishness.
Agreed. I realize that the words like "litany" and "conspiracy" are used semi-ironically, but a newcomer to the site might not.
There doesn't seem to be anyone arguing seriously that Less Wrong is a cult, but we do give some newcomers that impression.
The LW FAQ says:
Why do you all agree on so much? Am I joining a cult?
We have a general community policy of not pretending to be open-minded on long-settled issues for the sake of not offending people. If we spent our time debating the basics, we would never get to the advanced stuff at all. Yes, some of the results that fall out of these basics sound weird if you haven't seen the reasoning behind them, but there's nothing in the laws of physics that prevents reality from sounding weird.
I suspect that putting a more human face on the front page, rather than just linky text, would help.
Perhaps something like a YouTube video version of the FAQ, featuring two (ideally personable) people talking about what Less Wrong is and is not, and how to get started on it. For some people, seeing is believing. It is one thing to tell them there are lots of different posters here and we're not fanatics; but that doesn't have the same impact as watching an actual human with body cues talking.
I don't believe LW is a cult, but I can see where intelligent, critical thinking people might get that impression. I also think that there may be elitist and clannish tendencies within LW that are detrimental in ways that could stand to be (regularly) examined. Vigilance against irrational bias is the whole point here, right? Shouldn't that be embraced on the group level as much as on an individual one?
Part of the problem as I see it is that LW can't decide if it's a philosophy/science or a cultural movement.
For instance, as already mentioned, there's a great deal of jargon, and there's a general attitude of impatience toward anyone not thoroughly versed in the established concepts and terminology. Philosophies and sciences also have this problem, but the widely accepted and respected philosophical and scientific theories have proven themselves to the world (and weren't taken very seriously until they did). I personally believe there's a lot of substance to the ideas here, but LW hasn't delivered anything dramatic to the world at large. Until it does, it may remain, in the eyes of outsiders, some kind of hybrid of Scientology and Objectivism - an insular group of people with a s...
In general, I think we could stand more community effort being put into optimizing our about page, which you can do now here.
Thank you for this.
(In light of my other comment, I should emphasize that I really mean that. It is not sarcasm or any other kind of irony.)
I have seen this problem afflict other intellectually-driven communities, and believe me, it is a very hard problem to shake. Be grateful we aren't getting media attention. The adage, "All press is good press", has definitely been proven wrong.
The word "cult" never makes discussions like these easier. When people call LW cultish, they are mostly just expressing that they're creeped out by various aspects of the community - some perceived groupthink, say. Rather than trying to decide whether LW satisfies some normative definition of the word "cult," it may be more productive to simply inquire as to why these people are getting creeped out. (As other commenters have already been doing.)
We can fix this stuff easily, since they're both wiki pages, but I thought they were examples worth discussing.
Somebody please do so. Those examples are just obviously bad.
I got a distinct cultish vibe when I joined, but only from the far-out parts of the site, like UFAI, not from the "modern rationality" discussions. When I raised the issue on #lesswrong, the reaction from most regulars was not very reassuring: somewhat negative and more emotional than rational. The same happened when I commented here. That's why I am looking forward to the separate rationality site, without EY's idiosyncrasies that are untestable and useless to me, such as the singularity, UFAI, and MWI.
As I see it, Cult = Clique + Weird Ideas.
I think the weird ideas are an integral part of LessWrong, and any attempt to disguise them with a fluffy introduction would be counterproductive.
What about Cliquishness? I think that the problem here is that any internet forum tends to become a clique. To take part you need to read through lots of posts, so it requires quite a commitment. Then there is always some indication of your status within the group - Karma score in this case.
My advice would be to link to some non-internet things. Why not have the FHI news feed and links to a few relevant books on Amazon in the column on the right?
"Twelve Virtues of Rationality" has always seemed really culty to me. I've never read it, which may be part of the reason. It just comes across as telling people exactly how they should be, and what they should value.
Also, I've never liked that quote about the Sequences. I agree with it: what I've read of the Sequences (and it would be wrong not to count HPMOR) is by far the most important work I've ever read. But that doesn't mean that's what we should advertise to people.
Do we want to scare away people with strong weirdness filters?
The answer to this may very well turn out to be yes.
Ironically, I suspect the "cultlike" problem is that LessWrong/SI's key claims lack falsifiability.
Friendly AI? In the far future.
Self-improvement? Any program of mental self-improvement is suspected of being a cult, unless it trains a skill outsiders are confident they can measure.
If I have a program for teaching people math, outsiders feel they know how they can check my claims - either my graduates are good at math or not.
But if I have a program for "putting you in touch with your inner goddess", how are people going to check my claims? For all...
you'll find that people are searching for "less wrong cult" and "singularity institute cult" with some frequency.
Maybe a substantial number of people are searching for the posts about cultishness.
I think the biggest reason Less Wrong seems like a cult is because there's very little self-skepticism; people seem remarkably confident that their idiosyncratic views must be correct (if the rest of the world disagrees, that's just because they're all dumb). There's very little attempt to provide any "outside" evidence that this confidence is correctly-placed (e.g. by subjecting these idiosyncratic views to serious falsification tests).
Instead, when someone points this out, Eliezer fumes "do you know what pluralistic ignorance is, and Asch...
it appears the physicists who have happened to run across his argument found it severely flawed
The criticisms at those links have nothing to do with the argument for MWI. They are just about a numerical mistake in an article illustrating how QM works.
The actual argument for MWI that is presented is something like this: Physicists believe that the wavefunction is real and that it collapses on observation, because that is the first model that explained all the data, and science holds onto working models unless they are falsified. But we can also explain all the data by saying that the wavefunction is real and doesn't collapse, if we learn to see the wavefunction as containing multiple worlds that are equally real. The wavefunction doesn't collapse; it just naturally spreads out into separate parts, and what we see is one of those separate parts. A no-collapse theory is simpler than a collapse theory because it has one less postulate, so even though there are no new predictions, by Bayes (or is it Occam?) we can favor the no-collapse theory over the collapse theory. Therefore, there are many worlds.
This is informal reasoning about which qualitative picture of the world to favor, so i...
by some polls
The original source for that "58%" poll is Tipler's The Physics of Immortality, where it's cited (chapter V, note 6) as "Raub 1991 (unpublished)". (I know nothing about the pollster, L. David Raub, except that he corresponded with Everett in 1980.) Tipler says that Feynman, Hawking, and Gell-Mann answered "Yes, I think the MWI is true", and he lists Weinberg as another believer. But Gell-Mann's latest paper is a one-history paper, Weinberg's latest paper is about objective collapse, and Feynman somehow never managed to go on record anywhere else about his belief in MWI.
There were plenty of physicists reading those posts when they first came out on OB (the most famous name being Scott Aaronson)
As Scott keeps saying, he's not a physicist! He's a theoretical computer scientist with a focus on quantum computing. He clearly has very relevant expertise, but you should get his field right.
It's like Greeks trying to do physics by pure reasoning. They got atoms right because of salt crystallizing,
Obviously, observing salt is not pure reasoning. Very little philosophy is pure reasoning; the salient distinction is between informal, everyday observation and deliberately arranged experiments.
It's a rather unavoidable side-effect of claiming that you know the optimal way to fulfill one's utility function, especially if that claim sounds highly unusual (unite the human species in making a friendly AI that will create Utopia). There are many groups that make such claims, and either one or none of them can be right. Most people (who haven't already bought into a different philosophy of life) think it's the latter, and thus tend not to take someone seriously when they make extraordinary claims.
Until recognition of the Singularity's imminence and ne...
There doesn't seem to be anyone arguing seriously that Less Wrong is a cult
Well, there's the folks at RationalWiki.
Long-time lurker here. I think LW is not capable enough as a social unit to handle its topic, and I currently think that participating in LW is not a good way to efficiently advance its goals.
In order to reach a (hostile) audience one needs to speak the language. However, ambient ways of carrying out discussion often intermingle status / identity / politics with epistemology. In order to forward the position that biased, faith-based, or economy-based thinking is not an epistemologically efficient tool, one needs to make at least the initial steps in this twisted-up ...
Do you know anyone who might fall into this category, i.e. someone who was exposed to Less Wrong but failed to become an enthusiast, potentially due to atmosphere issues?
Yes. I know a couple of people with whom I share an interest in Artificial Intelligence (this is my primary focus when loading Less Wrong web pages) who communicated to me that they did not like the site's atmosphere. Atmosphere is not exactly the word they used. One person thought the cryonics content was a deal-breaker. (If you read the piece in the New York Times Sunday Magazine about Robin Hanso...
What paper or text should I read to convince me y'all want to get to know reality? That's a sincere question, but I don't know how to say more without being rude (which I know you don't mind).
Put another way: What do you think Harry Potter (of HPMOR) would think of the publications from the Singularity Institute? (I mean, I have my answer).
I got a possible proto-small-religion feeling from SL4 discussions with Eliezer and SI folk back in the day. Any possible cultishness feeling was with a small c, that is, not harmful to the participants except for their bank balance, as in the use of the word "cult" in "cult following". There isn't a good word for this type of organization, which is why it gets lumped in with Cults.
Less Wrong is better than SL4 on this front, anyway.
Well, it's nice to know at least you guys see it. Yes, that was one of my reactions. I started reading some of the sequences (which really aren't pitched at a level that the mass public, or, I'd hazard to say, though not with certainty, even people whose IQs don't fall at least one standard deviation above the mean, can easily understand). I liked them, though I didn't fully understand them, and have referred people to them. However, at the time I was looking into a job and did some kind of search through the website. Anyways, I encountered a post with a perso...
The c-word is too strong for what LW actually is. But "rational" is not a complete descriptor either.
It is neither rational nor irrational to embrace cryonics. It may be rational to conclude that someone who wants to live forever and believes body death is the end of his life will embrace cryonics and life extension technologies.
It is neither rational nor irrational to vaunt current human values over any other. It is most likely that current human values are a snapshot in the evolution of humans, and as such are an approximate optimum in a...
This comment will be heavy with jargon, to convey complex ideas with the minimum required words. That is what jargon is for, after all. The post's long enough even with this shortening.
Less Wrong inspires a feeling of wonder.
To see humans working seriously to advance the robot rebellion is inspiring. To become better, overcome the programming laid in by Azathoth and actually improve our future.
The audacity to challenge death itself, to reach for the stars, is breathtaking. The piercing insight in many of the works here is startling. And the gift of being a...
I have several questions related to this:
If you visit any Less Wrong page for the first time in a cookies-free browsing mode, you'll see this message for new users:
Here are the worst violators I see on that about page:
And on the sequences page:
This seems obviously false to me.
These may not seem like cultish statements to you, but keep in mind that you are one of the ones who decided to stick around. The typical mind fallacy may be at work. Clearly there is some population that thinks Less Wrong seems cultish, as evidenced by Google's autocomplete, and these look like good candidates for things that make them think this.
We can fix this stuff easily, since they're both wiki pages, but I thought they were examples worth discussing.
In general, I think we could stand more community effort being put into improving our about page, which you can do now here. It's not that visible to veteran users, but it is very visible to newcomers. Note that it looks as though you'll have to click the little "Force reload from wiki" button on the about page itself for your changes to be published.