taw comments on How To Lose 100 Karma In 6 Hours -- What Just Happened - Less Wrong

-31 Post author: waitingforgodel 10 December 2010 08:27AM


Comment author: taw 10 December 2010 12:56:08PM 3 points [-]

I would be surprised if Less Wrong itself ever developed fully into a cult. I'm not so sure about SIAI, but I guess it will probably just collapse at some point. LW doesn't look like a cult now - but what was Scientology like in its earliest stages?

Is there mostly a single way in which groups gradually turn into cults, or does it vary a lot?

My intuition was more about Ayn Rand and the Objectivists than Scientology, but I don't really know much here. Does anybody know what the early Objectivists were like?

I haven't put much thought into this; these are just some impressions.

Comment author: jimrandomh 10 December 2010 01:24:06PM *  9 points [-]

Is there mostly a single way in which groups gradually turn into cults, or does it vary a lot?

Yes, there is. One of the key features of cults is that they make their members sever all social ties to people outside the cult, so that they lose the safeguard of friends and family who can see what's happening and pull them out if necessary. Scientology was doing that from the very beginning, and Less Wrong has never done anything like that.

Comment author: David_Gerard 10 December 2010 02:15:12PM *  10 points [-]

Not all, just enough. Weakening their mental ties so they get their social calibration from the small group is the key point. But that's just detail; you've nailed the biggie. Good one.

and Less Wrong has never done anything like that.

SIAI staff will have learnt to think in ways that are hard to calibrate against the outside world (singularitarian ideas, home-brewed decision theories). Also, they're working on a project they think is really important. Also, they have information they can't tell everyone (e.g. things they consider decision-theoretic basilisks). So there are a few untoward forces there. As I said, I hope they all have their wits about them.

/me makes mental note to reread piles of stuff on Scientology. I wonder who would be a good consulting expert, i.e. someone who knows more than me.

Comment author: Anonymous6004 10 December 2010 03:25:21PM 5 points [-]

Not all, just enough. Weakening their mental ties so they get their social calibration from the small group is the key point.

No, it's much more than that. Scientology makes its members cut off communication with their former friends and families entirely. They also have a ritualized training procedure in which an examiner repeatedly tries to provoke them, and they have to avoid producing a detectable response on an "e-meter" (which measures stress response). After doing this for a while, they learn to remain calm under the most extreme circumstances and not react. And so when Scientology's leaders abuse them in terrible ways and commit horrible crimes, they continue to remain calm and not react.

Cults tear down members' defenses and smash their moral compasses. Less Wrong does the exact opposite.

Comment author: Vaniver 10 December 2010 05:57:14PM *  4 points [-]

Cults tear down members' defenses and smash their moral compasses. Less Wrong does the exact opposite.

What defense against EY does EY strengthen? I ask because I'm somewhat surprised by how often I hear Aumann's Agreement Theorem bandied about with regard to what is clearly a mistake on EY's part.

Comment author: David_Gerard 10 December 2010 03:58:26PM *  4 points [-]

I was talking generally, not about Scientology in particular.

As I noted, Scientology is such a toweringly bad idea that it makes other bad ideas seem relatively benign. There are lots of cultish groups that are nowhere near as bad as Scientology, but that doesn't make them just fine. Beware of this error. (Useful way to avoid it: don't use Scientology as a comparison in your reasoning.)

Comment author: TheOtherDave 10 December 2010 04:07:42PM 3 points [-]

But that error isn't nearly as bad as accidentally violating containment procedures when handling virulent pathogens, so really, what is there to worry about?

(ducks)

Comment author: David_Gerard 10 December 2010 04:08:57PM 1 point [-]

The forbidden topic, obviously.

Comment author: taw 10 December 2010 04:44:53PM 3 points [-]

No, it's much more than that. Scientology makes its members cut off communication with their former friends and families entirely.

I'd like to see some solid evidence for or against the claim that typical developing cults make their members cut off communication with their former friends and families entirely.

If the claim is merely of weakening these ties, then this is definitely happening. I especially mean the commitment of signing up for cryonics. It will definitely increase the mental distance between the affected person and their formerly close friends and family; I'd guess about as much as signing up for a weird religion that is mostly perceived as benign would. I doubt anyone has much evidence about this demographic, though.

Comment author: David_Gerard 10 December 2010 05:06:35PM *  5 points [-]

I'd like to see some solid evidence for or against the claim that typical developing cults make their members cut off communication with their former friends and families entirely.

I don't think they necessarily make them - all that's needed is for the person to loosen the ties in their head, and strengthen them to the group.

An example is terrorist cells, which are small groups with a goal who have gone weird together. They may not cut themselves off from their families, but their bad idea has gripped them enough that their social calibrator goes group-focused. I suspect this is part of why people who decompartmentalise toxic waste go funny. (I haven't worked out precisely how to get from the first to the second.)

There are small Christian churches that also go cultish in the same way. Note that in this case the religious ideas are apparently mainstream - but there's enough weird stuff in the Bible to justify all manner of strangeness.

At some stage cohesion of the group becomes very important, possibly more important than the supposed point of the group. (I'm not sure how to measure that.)

I need to ask some people about this. Unfortunately, the real experts on cult thinking include several of the people currently going wildly idiotic about cryonics on the Rick Ross boards ... an example of overtraining on a bad experience and seeing a pattern where there isn't one.

Comment author: taw 10 December 2010 07:42:57PM 6 points [-]

Regardless of the actual chances of either working, and considering the issue from a purely sociological perspective - signing up for cryonics seems to me to be a lot like "accepting Jesus" / being born again / joining some far-more-religious-than-average subgroup of a mainstream religion.

In both situations there's some underlying, reasonably mainstream meme soup that is more or less accepted (Christianity / strict mind-brain correspondence) but which most people who accept it compartmentalize away. Then some groups decide not to compartmentalize it, but to accept the consequences of their beliefs. It really doesn't take much more than that.

Disclaimers:

I'm probably among the top 25 posters by karma, but I tend to feel like an outsider here a lot.

The only "rationalist" idea from LW canon I take more or less seriously is the outside view, and the outside view says taking ideas too seriously tends to have horrible consequences most of the time. So I cannot even take outside view too seriously, by outside view - and indeed I have totally violated outside view's conclusions on several occasions, after careful consideration and fully aware of what I'm doing. Maybe I should write about it someday.

In my estimate, all the FAI / AI-foom / nonstandard-decision-theory stuff is nothing but severe compartmentalization failure.

In my estimate cryonics will probably be feasible in some remote future, but right now the costs of cryonics (very rarely honestly stated by proponents, or backed by serious economic simulations rather than wishful thinking) are far too high, and the chances of it working now are far too slim to bother. I wouldn't even take it for free, as it would interfere with my being an organ donor, and that has non-negligible value for me. And even without that, the personal cost of added weirdness would probably be too high relative to my estimate of it working.

I can imagine alternative universes where cryonics makes sense, and I don't think people who take cryonics seriously are insane; I just think wishful thinking biases them. In the non-zero but, as far as I can tell, very tiny portion of possible future universes where cryonics turned out to work - well, enjoy your second life.

By the way, is there any reason for me to write articles expanding my points, or not really?

Comment author: multifoliaterose 10 December 2010 09:03:30PM *  3 points [-]

I'm probably among the top 25 posters by karma, but I tend to feel like an outsider here a lot.

My own situation is not so different, although:

(a) I have lower karma than you, and

(b) there are some LW posters with whom I feel strong affinity.

By the way, is there any reason for me to write articles expanding my points, or not really?

I myself am curious and would read what you had to say with interest, and this is a weak indication that others would too; but of course it's for you to say whether it would be worth the opportunity cost. The community would probably be more receptive to such pieces if they were cautious and carefully argued than if not, but that takes still more time and effort.

Comment author: taw 11 December 2010 12:09:28AM 9 points [-]

(a) I have lower karma than you

You get karma mostly for contributing more, not for higher quality. Posts and comments both have positive expected karma.

Also you get more karma for more alignment with groupthink. I even recall how, in the early days of Less Wrong, I stated - based on very solid outside-view evidence from every single subreddit I've been to - that karma and reception would come to correlate not only with quality but also with alignment with groupthink: that in a reddit-style karma system, downvoting-as-disagreement and upvoting-as-agreement become very significant at some point. People disagreed, but the outside view prevailed.

This unfortunately means that one needs to put a lot more effort into writing something that disagrees with groupthink than something that agrees with it - and such trivial inconveniences matter.

(b) There are some LW posters with whom I feel strong affinity

I don't think I feel a particular "affinity" with anyone here, but I find many posters highly enjoyable to read and/or full of insightful ideas.

I mostly write when I disagree with someone, so for a change (I don't hate everyone all the time, honestly :-p) here are two of the best writings by Less Wrong posters I've ever read:

Twilight fanfiction by Alicorn - it is ridiculously good, I guess a lot of people will avoid it because it's Twilight, but it would be a horrible mistake.

Contrarian excuses by Robin Hanson

Comment author: David_Gerard 11 December 2010 01:51:15AM *  4 points [-]

I think it's a plus point that a contrarian comment will get upvotes for effort and showing its work (links, etc.) - that is, the moderation method still seems to be "More like this please" rather than "Like". Being right and obnoxious gets downvotes.

(I think "Vote up" and "Vote down" might be profitably be replaced with "More like this" and "Less like this", but I don't think that's needed now and I doubt it'd work if it was needed.)

Comment author: multifoliaterose 11 December 2010 01:19:12AM *  2 points [-]

You get karma mostly for contributing more, not for higher quality. Posts and comments both have positive expected karma.

Yes, I've noticed this; it seems like there's a danger of an illusion that one is actually getting something done by posting or commenting on LW, on account of collecting karma by default.

On the upside, I think the net value of LW is positive, so that (taking the outside view and ignoring the quality of particular posts/comments, which is highly variable) the expected value of posts and comments is positive, though probably less than one subjectively feels.

Also you get more karma for more alignment with groupthink [...]

Yes; I've noticed this too. A few months ago I came across Robin Hanson's Most Rationalists Are Elsewhere, which is in a similar spirit.

This unfortunately means that one needs to put a lot more effort into writing something that disagrees with groupthink than something that agrees with it - and such trivial inconveniences matter.

Agreed. In defense of LW, I would say that this seems like a pretty generic feature of groups in general. I myself try to be careful about charitably interpreting statements made by those whose views clash with my own, but I don't know how well I succeed.

I mostly write when I disagree with someone, so for a change (I don't hate everyone all the time, honestly :-p)

Good to know :-)

Twilight fanfiction by Alicorn - it is ridiculously good, I guess a lot of people will avoid it because it's Twilight, but it would be a horrible mistake.

Fascinating; I had avoided it for this very reason but will plan on checking it out.

Contrarian excuses by Robin Hanson

Great article! I hadn't seen it before.

Comment author: Alicorn 11 December 2010 12:22:57AM 2 points [-]

it is ridiculously good

Thank you! :D

Comment author: [deleted] 10 December 2010 10:21:40PM 1 point [-]

By the way, is there any reason for me to write articles expanding my points, or not really?

I'm just some random lurker, but I'd be very interested in these articles. I share your view on cryonics and would like to read some more clarification on what you mean by "compartmentalization failure" and some examples of a rejection of the outside view.

Comment author: taw 10 December 2010 11:41:09PM 8 points [-]

Here's my view of the current Less Wrong situation.

On compartmentalization failure and related issues there are two schools present on Less Wrong: roughly, outside-viewers / pro-compartmentalization / firm-rooting-in-experience / caution on one side, and weak-inside-viewers / anti-compartmentalization / pure-reason / taking-ideas-seriously on the other.

Right now there doesn't seem to be any hope of reaching Aumann agreement between these points of view, and at least some members of each camp view many of the other camp's ideas with contempt. The primary reason seems to be that the kinds of arguments that people on one end of the spectrum find convincing, people on the other end see as total nonsense - with full reciprocity.

Of course there are plenty of issues on which both views agree as well - like religion, evolution, akrasia, the proper approach to statistics, and various biases (outside-viewers seem to demand more evidence that these are also a problem outside the laboratory than inside-viewers do, but it's not a huge disagreement). And many other disagreements seem to be unrelated to this.

Is this outside-viewers/pro-compartmentalization/firm-rooting-in-experience/caution vs weak-inside-viewers/anti-compartmentalization/pure-reason/taking-ideas-seriously spectrum only my impression, or do other people see it this way as well?

I might very well be biased, as I feel very strongly about this issue, and the most prominent poster, Eliezer, seems to feel very strongly about it in exactly the opposite way. It seems to me that most people here have a reasonably well-defined position on this issue - but I know better than to trust my impressions of people on an internet forum.

And second question - can you think of any good way for people holding these two positions to reach Aumann agreement?

As for cryonics, it's a lot of number crunching, textbook economics, outside-view arguments, etc. - all leading to very, very low numbers. I might do that someday if I'm really bored.

Comment author: Will_Newsome 12 December 2010 01:04:50AM 2 points [-]

Phrasing it as pro-compartmentalization might cause unnecessary negative affect for a lot of aspiring rationalists here at LW, though I'm too exhausted to imagine a good alternative. (Just in case you were planning on writing a post about this or the like. Also, Anna Salamon's posts on compartmentalization were significantly better than my own.)

Comment author: [deleted] 11 December 2010 05:16:31AM 2 points [-]

Thanks, I now understand what you mean. I'll have to think further about this.

Personally, I find myself strongly drawn to the anti-compartmentalization position. However, I have had bad enough problems with it (let's just say I'm exactly the kind of person that becomes a fundamentalist, given the right environment) that I appreciate the outside view and want to adopt it a lot more. Making my underlying assumptions and motivations explicit, and demanding of them the same level of proof and consistency that I demand of any belief, has served me well - so far, anyway.

Also, I'd have to admit that I enjoy reading disagreements most, even if just for disagreement's sake, so I'm not sure I actually want to see Aumann agreement. "Someone is wrong on the internet" syndrome has, on average, motivated me more than reasonable arguments, I'm afraid.

Comment author: multifoliaterose 11 December 2010 04:09:27AM *  2 points [-]

I know what you're talking about here.

Comment author: David_Gerard 11 December 2010 01:20:49AM *  1 point [-]

And second question - can you think of any good way for people holding these two positions to reach Aumann agreement?

Sure: compartmentalisation is clearly an intellectual sin - reality is all one piece - but we're running on corrupt hardware so due caution applies.

That's my view after a couple of months' thought. Does that work for you?

(And that sums up about 2000 semi-readable words of inchoate notes on the subject. (ctrl-C ctrl-V))

In the present Headless Chicken Mode, by the way, Eliezer is specifically suggesting compartmentalising the very bad idea, having seen people burnt by it. There's nothing quite like experience to help one appreciate the plus points of compartmentalisation. It's still an intellectual sin, though.

Comment author: taw 10 December 2010 04:36:57PM *  1 point [-]

Scientology was doing that from the very beginning

Quick reading suggests that Hubbard founded "Dianetics" in late 1949/early 1950, and that it became "Scientology" only in late 1953/early 1954. As far as I can tell it took them many years after that to become the Scientology we know. There's some evidence of evaporative cooling at that stage.

And just as David Gerard says, modern Scientology is an extreme case. By "cult" I meant something more like the Objectivists.

Comment author: David_Gerard 10 December 2010 04:55:27PM *  6 points [-]

The Wikipedia articles on Scientology are pretty good, by the way. (If I say so myself - I started WikiProject Scientology. :-)) They were mostly started by critics, but with lots of input from Scientologists, and the Neutral Point Of View turns out to be a fantastically effective way of writing about the stuff - before Wikipedia, there were CoS sites which were friendly and pleasant but rather glaringly incomplete in important ways, and critics' sites which were highly informative but frequently so bitter as to be all but unreadable.

(Despite the key rule of NPOV - write for your opponent - I doubt the CoS is a fan of WP's Scientology articles. Ah well!)

Comment author: David_Gerard 10 December 2010 01:18:25PM *  15 points [-]

I don't have a quick comment-length intro to how cults work. Every Cause Wants To Be A Cult will give you some idea.

Humans have a natural tendency to form close-knit ingroups. This can turn into the cult attractor. If the group starts going a bit weird, evaporative cooling makes it weirder. Edit: jimrandomh nailed it - it's isolation from outside social calibration that lets a group go weird.

Predatory infectious memes are mostly not constructed, they evolve. Hence the cult attractor.

Scientology was actually constructed - Hubbard had a keen understanding of human psychology (and no moral compass and no concern as to the difference between truth and falsity, but anyway) and stitched it together entirely from existing components. He started with Dianetics and then he bolted more stuff onto it as he went.

But talking about Scientology is actually not helpful for the question you're asking, because Scientology is the Godwin example of bad infectious memes - it's so bad (one of the most damaging, in terms of how long it takes ex-members to recover - I couldn't quickly find the cite) that it makes lesser nasty cults look really quite benign by comparison. It is as if your only examples of authoritarianism were Hitler or Pol Pot, and casual authoritarianism didn't look at all damaging compared to that.

Ayn Rand's group turned cultish by evaporative cooling. These days, it's in practice more a case of individual sufferers of memetic infection - someone reads Atlas Shrugged and turns into an annoying crank. It's an example of how impossible it is to talk someone out of a memetic infection that turns them into a crank - they have to get themselves out of it.

Is this helpful?