Tenoke comments on Open Thread, October 27 - 31, 2013 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
You are using a very loose definition of a cult. Surely you know that 'cult' carries some different (negative) connotations for other people?
It might not change what we are but it has some negative consequences. People like you who call us a cult while using a different meaning of 'cult' turn new members away because they hear that LessWrong is a cult and they don't hear your different meaning of the word (which excludes most of the negative traits of bloody cults).
Why a "bloody" cult? What image does "cult" summon in your mind? Cthulhu followers? Osho's Cadillac collection? The Kool Aid?
I'm beginning to see where you're going with this. Calling us a cult is like calling Martin Luther King a criminal. Technically correct, but misleading, because of the baggage the word carries.
We would do well, then, to list all the common traits and connotations of a cult, good and bad, and all the ways we are demonstrably different or better than that. That way, we'd have a ready-made response we could release calmly in the face of accusations of culthood, without going through the embarrassing and stammering "yeahbut" kind of argument that gives others the opportunity to act as inquisitors.
Nevertheless, I for one believe we shouldn't reject the word, but reappropriate it, if only to throw our critics off-balance. "Less Wrong is a cult!" "Fracking right it is!" "... Wait, you aren't denying it?" "Nope. But, please, elaborate, what's wrong with us being a cult, precisely?"
In British English "bloody" is a general-purpose intensifier, e.g. "That's just bloody lovely!" :-)
I know, I just thought maybe he meant the more literal kind of bloody, given the context (we're talking about cults, this site is dominated by US Citizens), and wanted him to clarify.
Conversely, we could skip the listing of traits and the construction of ready-made responses that we could release uniformly and the misleading self-descriptions, and move directly to "What would be wrong with that, if we were?"
Anyone who can answer that question has successfully tabooed "cult" and we can now move on to discussing their actual concerns, which might even be legitimate ones.
Engaging on the topic further with anyone who can't answer that question seems unlikely to be productive.
The problem with cults is that they tend to assign the probability of 1 to their priors.
Of course it's a general problem with religions, but quoting Ambrose Bierce's The Devil's Dictionary from memory...
Religion, n. -- a large successful cult.
Cult, n. -- a small unsuccessful religion.
Well, one of LW's dogmas seems to be that 0 and 1 are not probabilities, so...
Ah, but that's just the sort of thing we'd want you to think we believe, to throw you off the scent!
Do we assign the probability of 1 to our priors?
LW regulars are a diverse bunch. Though I have no evidence I am pretty sure some assign the probability of 1 to some priors.
Yes, I agree.
When you said that the problem with religions is that they tend to assign the probability of 1 to their priors, did you mean to include having some members who assign probability 1 to some priors in the category you were identifying as problematic?
Insofar the religion encourages or at least accepts those "some" members and stands behind them, and insofar these priors are important, yes.
OK, cool. Thanks for clarifying.
You make a very good point.
And has this nominally very good point changed your beliefs about anything related to this topic?
It changed my preferred method of approach slightly; I skip the "Yeah we're a cult" and go straight to the "So what?" It's a simple method: answer with a question, dodge the idiocy.
Cool.
Substitute 'fucking' for 'bloody' to get the intended meaning.
It might help, but not calling ourselves a cult will probably lead to better PR.
Somewhat amusingly, there's another tangent somewhere in this discussion (about polyamory, recruiting sex partners, etc.) on which 'fucking cult' could also be over-literally interpreted.
I think there are too many superficial similarities for critics, opponents and trolls not to capitalize on them, leaving us in the awkward position of having to explain the difference to these self-styled inquisitors like we're somehow ashamed of ourselves.
It's not enough to agree not to call ourselves a cult (and not just because people will, willfully or unwittingly, break this agreement, probably frequently enough to make the policy useless for PR effects).
We need to have an actual plan to deal with it. I say proclaiming ourselves "the best cult in the world" functions as a refuge in audacity, causes people to stop their inquiry, listen up, think, because it breaks patterns.
Saying "we're not a cult, really, stop calling us a cult, you meanies" comes off as a suspicious denial, prompting a "that's what they all say" response, simply by pattern-matching to how a guilty party would usually behave. To make ourselves above suspicion, we need to behave differently than a guilty party would.
On a tangential note, I found this approach useful in raising the sanity waterline precisely among the sort of people who'd be suckers for cults, the sort that wouldn't go to a doctor and would prefer to resort to homeopathy or acupuncture. By presenting EY as my "guru" and using his more mystical-style works (the "Twelve Virtues of a Rationalist", for example), I managed to get them in contact with the values we preach rather than with the concrete, science-based notions (these guys are under the impression that Science Is Evil and humanity is Doomed to suffer Gaia's Revenge, etc.). With this, I hope to progressively introduce them to an ideology that values stuff that's actually proven to work, with an idealism and an optimism that is far apart from their Positivist-inspired view of Science as a cold and heartless exploitation machine.
I'm not very confident on that last bit, though, so I suppose I could easily be argued into dropping it.
That's not what I am saying at all. I am not saying that we should stop people from calling us a cult. I am saying that WE (or maybe YOU) should stop starting threads and conversations about how we are a cult, whether we are a cult, and so on. As I said: if people on LessWrong weren't questioning themselves about whether they are in a cult (there is one such thread in this OT, for example), which is ridiculous, and weren't bringing attention to it all the time, the external cult-calling wouldn't be so strong either.
And again - to make it clearer: People, please stop bringing up the damn cult stuff all the time. Sure - respond to outsiders when they ask you whether we are a cult but don't start such a conversation yourself. And don't mention the cult bollocks when you are telling people about Lesswrong for the first time.
If the problem is the folks among us worrying about us being a cult, not talking about it will only make them worry more. Their concerns should be treated seriously ("Supposing we were a cult, what's wrong with that?" is indeed a good approach), no matter how stupid they may turn out to be, and they should be reassured with proper arguments rather than dismissed out of hand. Intimidating outsiders into feeling stupid is, I think, a valid short-term tactic, but when it comes to our folks, we owe each other a clear-eyed examination.
Since the problem seems to pop up spontaneously as well as propagate memetically, I would suggest making an FAQ with all the common concerns, addressing them in a fair and conclusive manner, that will leave their minds at peace. And not in a short-term, fuzzily-reassuring bullshit kind of peace, but a "problem solved, question dissolved, muthahubber" kind of peace.
I assume you haven't read this and related posts and the countless other discussions on the topic? The topic has been discussed to death already. My problem is that people keep bringing it up all the time, and people (and search engines) start associating 'lesswrong' with 'cult'.
Well then why don't you just link people to this every time you see the problem pop up? I certainly will.
Sorry, I'm going to be a freaking pedant here, but this is a bit of a pet peeve of mine. That is a physical impossibility. Please refrain from this kind of hyperbole and use the appropriate adjective; in this case, many. Thank you.
I can't count them = they are subjectively countless for me. Happy now?
Sure you could, you just have other stuff you'd rather do, which is totally okay :)
Tangentially... while encouraging others to provide links to relevant past discussions when a subject comes up is a fine thing, it ought not substitute for encouraging in ourselves the habit of searching for relevant past discussions before bringing a subject up.
Actually, a huge problem I have with LW is the sheer amount of discussions-inside-discussions we have. Especially in the Sequences, there are just too many comments for a human to read. If we could make summaries of the consensus on any specific topic, and keep them updated as discussions progress...
On a site like this, how do we tell the difference?
Accumulated karma is usually a good metric. The jargon, and the ideological equipment and epistemological approach, are also important signs to look out for. So is the degree of mean-spiritedness. Subjective is not the same as meaningless.
Gotcha.
For my own part I endorse intimidating people who demonstrate mean-spirited behavior into silence (whether by making them feel stupid, if that works, or some other mechanism). Depending on what you mean by "ideological equipment and epistemological approach", I might endorse the same tactic there as well.
Neither of those endorsements depends much on how long those people have been contributing, or how much karma they've accumulated, or what jargon they use.
Have to be careful about that -- if you're being trolled there is noticeable potential for an epic fail :-)
I think the word for a thing that started calling itself the best cult in the world is 'religion'.
When people ask me what religion I hail from (as far as I'm concerned, religion or religation is nothing more or less than RED Team VS BLU Team style affiliation, with, in the absence of exterior threats, a tendency to splinter and call heresy on each other), I tell them "secular humanist". As far as I'm concerned, LW is just a particularly interesting denomination of that faith. "We're the only religion whose beliefs are wholly grounded in empirical experience, and which, instead of praying for things to get better, goes out and makes them so".
Are there in fact no ("other") religions which endorse making things better?
I am aware of religious denominations which advocate doing good works as a route to personal salvation, but I honestly can't think of any religious branch I'm aware of which advocates good works on the basis of "For goodness' sake, look at this place, it's seriously in need of fixing up."
So, just to make sure I understand the category you're describing here... if, for example, an organization like the Unitarian Universalist Association of Congregations asserts as one of its guiding principles "The goal of world community with peace, liberty, and justice for all;" and does not make a statement one way or the other about the salvatory nature of those principles, is that an example of the category?
I guess I'd say that it counts if you're willing to treat Unitarian Universalism as an actual religious denomination. Whether it counts or not would probably depend on how you identify such things, since it's missing qualities which one might consider important, such as formal doctrines.
In my experience Unitarian Universalism, at least in its modern form, is mainly a conglomeration of liberal progressive ideals used as an umbrella to unite people with religious beliefs ranging from moralistic therapeutic deism to outright atheism.
All of the Unitarian Universalists I've known well enough to ask have also identified themselves as secular humanists, so I certainly wouldn't regard it as an alternative to secular humanism which carries that value.
As far as I know, they're all about giving up your ego in one way or another and happily waiting for death or the endtimes. The most proactive they get is trying to spread this attitude around (but not too much; they still need other people to actually pay for their contemplative lifestyle). Making things better, improving the standing of humankind, cancelling the apocalypse? A futile, arrogant, doomed effort.
OK.
Do we?
Well, OK.
One place to start planning is by identifying desired outcomes, and then suggesting actions that might lead to those outcomes. So... what do we expect to have achieved, once we've dealt with it?
Another place to start, which is where you seem to be starting, is by arguing the merits of various proposed solutions.
That's usually not an ideal place to start unless our actual goal is to champion a particular set of actions, and the problem is being identified primarily in order to justify those actions. But, OK, if we're going down that path... you've identified two possible solutions:
And you've argued, compellingly, that #2 is a bad plan, with which I agree completely.
I will toss another contender into the mix:
3. Asking "assuming we are, so what?" and going on about our business.
There of course exist other options.
Desired outcome:
And I don't mean "relax", I mean stop worrying, by virtue of knowing for a fact that their concerns are unfounded.
You do understand that these are the desired outcomes of a bona fide cult as well, right?
If you kidnap virgins to sacrifice in jungle hideouts while waiting for the UFOs to arrive, that's what you'd want, too.
Yes, well, the desired outcome of both a criminal and an innocent when facing an investigation is to be found innocent. That they both share this trait is irrelevant to their guilt. An innocent certainly shouldn't start worrying about maybe being guilty just because he doesn't want to be found guilty, that's just stupid.
OK. Given that desired outcome, I'd suggest your next steps would be:
How to best go about step 3 will depend a lot on the results of step 1 and 2.
Do you have any theories about 1 and 2?
Things I worry about:
Some members make large donations
There is secret knowledge that you pay for (ai-box)
Members do some kooky things (cryonics, polyamory)
Members claim "rationality" has helped them lose weight or sleep better - subjective things without controls - rather than something more measurable and where a mechanism is more obvious.
At least one thing is not supposed to be discussed in public (banned memetic hazard). LW members seem significantly kookier when talking about this (and in the original deleted thread) than on more public subjects.
Members have a lot of jargon. It can seem like they're speaking their own language. More, there's a bunch of literature embedded in the organization's worldview; publicly this is treated as simple fiction, but internally it's clearly taken more seriously.
Although there's no explicit advice to sever ties, LW encourages (in practice if not in theory) members to act in ways that reduce their out-of-group friendships
The hierarchy is opaque; it feels like there is a clique of high-level users, but this is not public.
Don't forget that much of the inner circle actually draws a paycheck from the organizations members are encouraged to donate to, and supposedly a fairly large one at that, and that the discussion of how much to donate is framed in terms of averting the destruction of the human race.
That and the polyamory commune are the two sketchiest things IMO, since it shows that the inner circle is materially and directly benefiting from the "altruism" / "rationality" of lower ranking members.
This is a good website, mostly good people on it, but there's also an impression that there are questionable dealings going on behind the scenes.
Would LW be improved if paid employees/consultants of those organizations were barred from membership? (I do realize there are other ways to address this concern, and I don't mean to suggest otherwise... for example, we could re-institute the moratorium on discussing those organizations, or a limited moratorium on fundraising activities here, or various other things. I'm just curious about your opinion about that specific solution.)
I get a kick out of this, because my social circle is largely polyamorous but would mostly consider LW a charming bunch of complete nutjobs on other grounds. Polyamory really isn't all that uncommon in communities anchored around high-tech development/elite tech schools, IME.
Don't forget treating the writings of the charismatic founder as a sacred text and ritually quoting them :-)
Wouldn't you expect that if the cause actually made sense though? (and not only if this is a cult)
Less than 0.01% of the users have played an ai-box game (to my knowledge), and even fewer have played it for money.
Again fairly small subset for the first thing, slightly larger for the second but I guess I will give you that one.
Probably a tiny subset of users claim that - I personally have never seen anyone claim that rationality helped them sleep better, and if you mean that evidence-based reasoning helped them find an intervention designed to increase sleep quality, you are grasping at straws.
We are not supposed to write out the actual basilisk (there is only one) on lesswrong.com. There's no problem with talking about it in public, and again this affects a tiny portion of users.
Giving you this one as well.
Bullshit.
There are just respected users and no clear-cut hierarchy - that's what happens at most places. For a proxy of who is a high-level user look at the 'Top Contributors'.
Given how much LWers seem to care about effective charity, I'd expect more scrutiny, and a stronger insistence on measurable outcomes. I guess you're right though; the money isn't inherently a problem.
It seems like a defining characteristic; it's one place where the site clearly differs from more "mainstream" AI research (though this may be a distorted perception since it was how I first heard of LW)
Shrug. It looks dodgy to me. It pattern-matches with e.g. the unverifiable stories people tell of their personal experience of Jesus.
That's not at all clear. I've never seen any explicit rules. I've seen articles that carefully avoid saying the name.
Even on internet forums there's usually an explicit distinction between mod and not, and often layers to it. (The one exception I know is HN, and even there people know who pg is, who's part of YC and who's not, and stories are presented differently if they're coming from YC members). And it's unusual and suspicious for the high-ups to all be on first name terms with each other. It raises questions over objectivity, oversight, conflict resolution.
This sort of point-by-point refutation is the same sort of thing that would happen in a church that was trying to defend against allegations of cultyness.
I don't think lmm's list of reasons was utterly compelling -- good, but not utterly compelling -- but I don't think it would matter if it were a perfect list, because there will always be a defense for accusations of cultyness that satisfies the church/forum.
It is more interesting watching it happen here vs. the church IMO because LW is all about rationality, where the church can always push the "faith" button when they are backed into a logical corner.
At the end of the day, it is just an online forum. But it does sound to me (based on what I can gather from perusing) like there are a group of people here who take this stuff seriously enough so as to make cultyness possible.
I'm sure the "LW/cryonics/transhumanism/basilisk stuff is so similar to how religion works" bit got old a long time ago, but Dear Lord is it apparent and fascinating to me.
Cool... this sort of thing is far more actionable than "seeming like a cult."
So, next question. Taking you as representative of the group (which is of course not necessarily true, but we start from where we are)... what is your sense of where each of these falls on the spectrum between "this is legitimately worrying; in order to be less at risk for actual bad consequences LW should actually change so as not to do this" on the one hand, and "this is merely superficially worrying; there are probably no real risks here and LW should merely take steps to reassure worriers not to worry about it"?
I'm legitimately worried about the money and the incentives it creates. What would a self-interested agent (LW seems to use "agent" in exactly the opposite sense to what I'd expect it to mean, but I hope I'm clear) in the position of the LW leadership do? My cynical view is: write some papers about how the problems they need to solve are really hard; write enough papers each year to appear to be making progress, and live lives of luxury. So what's stopping them? People in charities that provide far more fuzzies than LW have become disenchanted. People far dumber than Yudkowsky have found rationalizations to live well for themselves on the dime of the charity they run. Corrupt priests of every generation have professed as much faith that performing their actual mission would result in very high future utility, while in fact neglecting those duties for earthly pleasures.
Even if none of the leadership are blowing funds on crack and hookers, if they're all just living ascetically and writing papers, that's actually the same failure mode if they're not being effective at preventing UFAI. When founding the first police force, one of Peel's key principles was that the only way they could be evaluated was the prevalence of crime - not how much work the police were seen to be doing, not how good the public felt about their efforts. It's very hard to find a similar standard with which to hold LW to account.
It occurs to me as I write that I have no idea what the LW funding structure is - whether the site is funded by the CFAR, MIRI, SIAI or something else. Even having all these distinct bodies with mostly the same members smells fishy, seems more likely to be politics than practicalities.
The kookiness... if LW were really more rational than others, I'd expect them to do some weird-but-not-harmful-to-others things. So I suspect this is more a perception than reality thing (Though if there are good answers to "what's the empirical distinction between real and fake cryonics" and "why do you expect polyamory to turn out better for you lot than it did for the '60s hippie communes" it'd be nice to see them). IMO the prime counter would be visible effectiveness. A rich person with some weird habits is an eccentric genius; a poor person with weird habits is just a crank.
It would be really nice to have more verifiable results that say LW-style rationality is good for people (or to know that it isn't and respond accordingly). The failure mode here is that we do a bunch of things that feel good and pat each other on the back and actually it's all placebos. We actually see a fair few articles here claiming that reading LW is bad for you, or that rationality doesn't make people happier. On thinking it through this would be the kind of cult that's basically harmless, so I'm not too concerned. On the perception side, IMO discussing health is not worth the damage it does to the way the community is seen (the first weight-loss thread I saw caused a palpable drop in my confidence in the site). I've no idea how to practically move away from doing so though.
Secrets and bans rub me very strongly the wrong way, and seem likely to damage our efforts in nonobvious ways (to put it another way, secretive organizations tend to become ineffective at their original aims, and I'm worried about this failure mode). I certainly don't think the ban on the basilisk is effective at its purported aim, given that it's still talked about on the internet. And just having this kind of deception around immediately sets off a whole chain of other doubts - what if it's banned for other reasons? What else is banned?
If there really is a need for these bans, there should be a clear set of rules and some kind of review. That would certainly address the perception, and hopefully the actuality too.
I think the use of fictional evidence is actually dangerous. Given the apparently high value of LW-memetic fiction in recruiting, I don't know where the balance is. I think overuse of jargon is just a perceptual problem (though probably worth addressing).
I have... unusual views on diversity, so I don't think setting people against their less-rational friends is an actual problem (in the sense of being damaging to the organization's aims); I file this as a perceptual problem. The most obvious counter I can think of is more politeness about common popular misbeliefs, and less condescension when correcting each other. But I suspect these are problems inherent to internet fora (which doesn't mean they're not real; I would suggest that e.g. reddit has a (minor) cultish aspect to it, one that's offputting to participation. But there may not be any counter).
The hierarchy: in the short term it's merely annoying, but long-term I worry about committee politics. If some of the higher-ups fell out in private (and given that several of them appear to be dating each other that seems likely) and began sniping at each other in the course of their duties, and catching innocent users in the crossfire... I've seen that happen in similar organizations and be very damaging. Actual concern.
So, in summary. Actual concerns: where the money goes, any secrets the organization keeps, clarity of the leadership hierarchy, and overuse of fiction. Superficial issues: overuse of jargon. The rest of my list is, on reflection, probably not worth worrying about.