Less Wrong is a community blog devoted to refining the art of human rationality.

Notes on Brainwashing & 'Cults'

Post author: gwern, 13 September 2013 08:49PM (35 points)

“Brainwashing”, as popularly understood, does not exist or is of almost zero effectiveness. The belief stems from American panic over Communism post-Korean War combined with fear of new religions and sensationalized incidents; in practice, “cults” have retention rates in the single percentage point range and ceased to be an issue decades ago. Typically, a conversion sticks because an organization provides value to its members.

Some old SIAI work of mine. Researching this was very difficult because the relevant religious studies field, while apparently completely repudiating most public beliefs about the subject (eg. the effectiveness of brainwashing, how damaging cults are, how large they are, whether that’s even a meaningful category which can be distinguished from mainstream religions rather than a hidden inference - a claim, I will note, which is much more plausible when you consider how abusive Scientology is to its members as compared to how abusive the Catholic Church has been etc), prefers to publish its research in book form, which makes it very hard to review any of it. Some of the key citations were papers - but the cult panic was so long ago that most of them are not online and have never been digitized! I recently added some cites and realized I had not touched the draft in a year; so while this collection of notes is not really up to my preferred standards, I’m simply posting it for what it’s worth. (One lesson to take away from this is that controlling uploaded human brains will not be nearly as simple & easy as applying classic ‘brainwashing’ strategies - because those don’t actually work.)

Reading through the literature and especially the law review articles (courts flirted disconcertingly much with licensing kidnapping and abandoning free speech), I was reminded very heavily - and not in a good way - of the War on Terror.

Old American POW studies:

  • Clark et al 1981 Destructive Cult Conversion: Theory, Research and Practice
  • Lifton 1961 Thought Reform and the Psychology of Totalism
  • Ross & Langone 1988 Cults: What Parents Should Know
  • Schein, Schneier & Barker 1961 Coercive Persuasion
  • Singer 1978, 1979 “Therapy with Ex-cult Members” Journal of the National Association of Private Psychiatric Hospitals; “Coming Out of the Cults”, Psychology Today

These started the myth of effective brainwashing. But in practice, cult attrition rates are very high! (As makes sense: if cults did not have high attrition rates, they would long ago have dominated the world due to exponential growth.) This attrition claim is made all over the literature, with some example citations being:

  • Barker 1984, 1987 The Making of a Moonie: Choice of Brainwashing?; “Quo Vadis? The Unification Church”, pg141-152, The Future of New Religious Movements
  • Beckford 1981 “Conversion and Apostasy: Antithesis or Complementarity?”
  • Bird & Reimer 1982 “Participation rates in new religious movements and para-religious movements”
  • Robbins 1988 Cults, Converts and Charisma
  • Shupe & Bromley 1980 The New Vigilantes: Deprogrammers, Anticultists and the New Religions
  • Wright & Piper 1986 “Families and Cults: Familial Factors Related to Youth Leaving or Remaining in Deviant Religious Groups”
  • Wright 1983, 1987, 1988 “Defection from New Religious Movements: A Test of Some Theoretical Propositions” pg106-121 The Brainwashing/Deprogramming Controversy; Leaving Cults: The Dynamics of Defection; “Leaving New Religious Movements: Issues, Theory and Research”, pg143-165 Falling from the Faith: Causes and Consequences of Religious Apostasy
  • Wikipedia cites The Handbook of Cults and Sects in America, Hadden, J and Bromley, D eds. (1993)
  • a back of the envelope estimate for Scientology by Steve Plakos in 2000:

    In absolute numbers, that is from 8 million exposed to 150k active current, it means they’ve lost 7,850,000 bodies in the shop. That equates to a Retention Rate of 1.875%. Now, to be fair, over the course of 50 years “X” number of Scientologists have dropped their bodies and gone off to Mars, etc., who might still be members today if they weren’t dead. We do not know what the mortality rate is for Scientologists. To significantly impact the RR, there would have to have been a 100% turnover in active membership due to generational shifting. There is no evidence that 150,000 active members of the CofS have died over the past 50 years. Beyond that, we would also need to apply the RR to deceased members to see what number would have continued beyond 15 years. Therefore, using the most favorable membership numbers and not discounting for loss of membership beyond the 15th year, we see a RR of 1.875%+“X”. If we assume that generational shifting accounts for a 10% turnover amongst current membership, that is, that the current membership would be 10% greater had members survived, X would equal 15,000 dead members, or, a total Retained Membership of 165,000. That would give the CofS a 50-year Retention Rate of 2.0625%.
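Plakos’s arithmetic is easy to verify; a quick sketch of the calculation (bearing in mind that his 8-million-exposed and 150k-active figures are his own estimates, not audited membership numbers):

```python
# Back-of-envelope check of Plakos's Scientology retention figures.
exposed = 8_000_000   # people ever exposed/recruited (Plakos's estimate)
active = 150_000      # claimed current active members

retention = active / exposed
print(f"Retention rate: {retention:.4%}")  # 1.8750%

# Generational-turnover adjustment: assume 10% of current membership
# (15,000 people) died while still members, and count them as retained.
retained_with_deceased = active + 0.10 * active  # 165,000
adjusted = retained_with_deceased / exposed
print(f"Adjusted retention rate: {adjusted:.4%}")  # 2.0625%
```

Even under the adjustment most favorable to Scientology, the implied 50-year retention rate stays in the low single percentage points.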

Iannaccone 2003, “The Market for Martyrs” (quasi-review)

From the late-1960s through the mid-1980s, sociologists devoted immense energy to the study of New Religious Movements. [For overviews of the literature, see Bromley (1987), Robbins (1988), and Stark (1985).] They did so in part because NRM growth directly contradicted their traditional theories of secularization, not to mention the sensational mid-sixties claims God was “dead” (Cox 1966; Murchland 1967). NRM’s also were ideal subjects for case studies, on account of their small size, brief histories, distinctive practices, charismatic leaders, devoted members, and rapid evolution. But above all, the NRM’s attracted attention because they scared people.

…We have trouble recalling the fear provoked by groups like the Krishnas, Moonies, and Rajneeshees. Their years of explosive growth are long past, and many of their “strange” ideas have become staples of popular culture. [We see this influence not only in today’s New Age and Neo-Pagan movements, but also in novels, music, movies, TV shows, video games, university courses, environmentalism, respect for “cultural diversity,” and the intellectual elite’s broad critique of Christian culture.] But they looked far more threatening in the seventies and eighties, especially after November 18, 1978. On that day, the Reverend Jim Jones, founder of the People’s Temple, ordered the murder of a U.S. Congressman followed by the mass murder/suicide of 913 members of his cult, including nearly 300 children.

The “cults” aggressively proselytized and solicited on sidewalks, airports, and shopping centers all over America. They recruited young adults to the dismay of their parents. Their leaders promoted bizarre beliefs, dress, and diet. Their members often lived communally, devoted their time and money to the group, and adopted highly deviant lifestyles. Cults were accused of gaining converts via deception and coercion; funding themselves through illegal activities; preying upon the young, alienated, or mentally unstable; luring members into strange sexual liaisons; and using force, drugs, or threats to deter the exit of disillusioned members. The accusations were elaborated in books, magazine articles, newspaper accounts, and TV drama. By the late-1970s, public concern and media hype had given birth to anti-cult organizations, anti-cult legislation, and anti-cult judicial rulings. The public, the media, many psychologists, and the courts largely accepted the claim that cults could “brainwash” their members, thereby rendering them incapable of rational choice, including the choice to leave. [Parents hired private investigators to literally kidnap their adult children and subject them to days of highly-coercive “deprogramming.” Courts often agreed that these violations of normal constitutional rights were justified, given the victim’s presumed inability to think and act rationally (Anthony 1990; Anthony and Robbins 1992; Bromley 1983; Richardson 1991; Robbins 1985).]

We now know that nearly all the anti-cult claims were overblown, mistaken, or outright lies. Americans no longer obsess about Scientology, Transcendental Meditation, or the Children of God. But a large body of research remains. It witnesses to the ease with which the public, media, policy-makers, and even academics accept irrationality as an explanation for behavior that is new, strange, and (apparently or actually) dangerous.

…As the case studies piled up, it became apparent that both the media stereotypes (of sleep-deprived, sugar-hyped, brainwashed automatons) and academic theories (of alienated, authoritarian, neurotics) were far off the mark. Most cult converts were children of privilege raised by educated parents in suburban homes. Young, healthy, intelligent, and college educated, they could look forward to solid careers and comfortable incomes. [Rodney Stark (2002) has recently shown that an analogous result holds for Medieval saints - arguably the most dedicated “cult converts” of their day.]

Psychologists searched in vain for a prevalence of “authoritarian personalities,” neurotic fears, repressed anger, high anxiety, religious obsession, personality disorders, deviant needs, and other mental pathologies. They likewise failed to find alienation, strained relationships, and poor social skills. In nearly all respects - economically, socially, psychologically - the typical cult converts tested out normal. Moreover, nearly all those who left cults after weeks, months, or even years of membership showed no sign of physical, mental, or social harm. Normal background and circumstances, normal personalities and relationships, and a normal subsequent life - this was the “profile” of the typical cultist.

…Numerous studies of cult recruitment, conversion, and retention found no evidence of “brainwashing.” The Moonies and other new religious movements did indeed devote tremendous energy to outreach and persuasion, but they employed conventional methods and enjoyed very limited success. In the most comprehensive study to date, Eileen Barker (1984) could find no evidence that Moonie recruits were ever kidnapped, confined, or coerced (though it was true that some anti-cult “deprogrammers” kidnapped and restrained converts so as to “rescue” them from the movement). Seminar participants were not deprived of sleep; the food was “no worse than that in most college residences;” the lectures were “no more trance-inducing than those given every day” at many colleges; and there was very little chanting, no drugs or alcohol, and little that could be termed “frenzy” or “ecstatic” experience (Barker 1984). People were free to leave, and leave they did - in droves.

Barker’s comprehensive enumeration showed that among the relatively modest number of recruits who went so far as to attend two-day retreats (claimed to be Moonies’ most effective means of “brainwashing”), fewer than 25% joined the group for more than a week, and only 5% remained full-time members 1 year later. Among the larger numbers who visited a Moonie centre, not 1 in 200 remained in the movement 2 years later. With failure rates exceeding 99.5%, it comes as no surprise that full-time Moonie membership in the U.S. never exceeded a few thousand. And this was one of the most successful cults of the era! Once researchers began checking, rather than simply repeating the numbers claimed by the groups, defectors, or journalists, they discovered dismal retention rates in nearly all groups. [For more on the prevalence and process of cult defection, see Wright (1987) and Bromley (1988).] By the mid-1980s, researchers had so thoroughly discredited “brainwashing” theories that both the Society for the Scientific Study of Religion and the American Sociological Association agreed to add their names to an amicus brief denouncing the theory in court (Richardson 1985).
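The attrition funnel Barker reports can be laid out explicitly; the fractions below are the ones quoted above (note the two figures use different denominators - retreat attendees vs. all centre visitors):

```python
# Barker's (1984) Moonie attrition figures, as quoted above.
joined_over_a_week = 0.25       # <25% of two-day-retreat attendees joined for >1 week
full_time_after_1yr = 0.05      # only 5% of retreat attendees were full-time a year later

remaining_after_2yr = 1 / 200   # of all centre visitors, fewer than 1 in 200 at 2 years
failure_rate = 1 - remaining_after_2yr
print(f"Two-year failure rate among centre visitors: {failure_rate:.1%}")  # 99.5%
```

A "brainwashing" technology that loses over 99.5% of its targets within two years is indistinguishable from ordinary, unsuccessful persuasion.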

Singer in particular has been heavily criticized; “Cult/Brainwashing Cases and Freedom of Religion”, Richardson 1991:

Dr. Singer is a clinical psychologist in private practice who earns a considerable portion of her income from cult cases. She has been an adjunct professor at the University of California at Berkeley, but has never held a paid or tenured-track position there. See H. Newton Malony, “Anticultism: The Ethics of Psychologists’ Reactions to New Religions,” presented at annual meeting of the American Psychological Association (New York, 1987) and Anthony, “Evaluating Key Testimony” for more details on Singer’s career.

…The [amicus curiae] brief further claimed that Singer misrepresents the tradition of research out of which terms like “thought reform” and “coercive persuasion” come. She ignores the fact that these earlier studies focus on physical coercion and fear as motivators, and that even when using such tactics the earlier efforts were not very successful. With great facility, Singer moves quickly from situations of physical force to those where none is applied, claiming that these “‘second generation’” thought reform techniques using affection are actually more effective than the use of force in brainwashing people to become members. Thus, Singer is criticized for claiming to stand squarely on the tradition of research developed by scholars such as Edgar Schein and Robert Lifton, while she shifts the entire focus to non-coercive situations quite unlike those encountered in Communist China or Korean prisoner of war camps. The brief points out, as well, that Singer ignores a vast amount of research supporting the conclusion that virtually all who participate in the new religions do so voluntarily, and for easily understandable reasons. No magical “black box” of brainwashing is needed to explain why significant numbers of young people chose, in the 1960s and 1970s, to abandon their place in society and experiment with alternative life styles and beliefs. Many youth were leaving lifestyles that they felt were hypocritical, and experimenting with other ways of life that they found to be more fulfilling, at least temporarily. Particularly noteworthy, but ignored by Singer, are the extremely high attrition rates of all the new religions. These groups are actually very small in numbers (the Hare Krishna and the Unification Church each have no more than two to three thousand members nationwide), which puts the lie to brainwashing claims. If “brainwashing” practiced by new religions is so powerful, why are the groups experiencing so much voluntary attrition, and why are they so small?

…Considerable research reported in refereed scholarly journals and other sources supports the idea that the new religions may be serving an important ameliorative function for American society. The groups may be functioning as “half-way houses” for many youth who have withdrawn from society, but still need a place to be until they decide to “return home.” Participation in some new religions has been shown to have demonstrable positive effects on the psychological functioning of individuals, a finding that Singer refuses to acknowledge.

“Overcoming The Bondage Of Victimization: A Critical Evaluation of Cult Mind Control Theories”, Bob and Gretchen Passantino Cornerstone Magazine 1994:

Neither brainwashing, mind control’s supposed precursor, nor mind control itself, have any appreciable demonstrated effectiveness. Singer and other mind control model proponents are not always candid about this fact: The early brainwashing attempts were largely unsuccessful. Even though the Koreans and Chinese used extreme forms of physical coercion as well as persuasive coercion, very few individuals subjected to their techniques changed their basic world views or commitments. The CIA also experimented with brainwashing. Though not using Korean or Chinese techniques of torture, beatings, and group dynamics, the CIA did experiment with drugs (including LSD) and medical therapies such as electroshock in their research on mind control. Their experiments failed to produce even one potential Manchurian Candidate, and the program was finally abandoned.

Although some mind control model advocates bring up studies that appear to provide objective data in support of their theories, such is not the case. These studies are generally flawed in several areas: (1) Frequently the respondents are not from a wide cross-section of ex-members but disproportionately are those who have been exit-counseled by mind control model advocates who tell them they were under mind control; (2) Frequently the sample group is so small its results cannot be fairly representative of cult membership in general; (3) It is almost impossible to gather data from the same individuals before cult affiliation, during cult affiliation, and after cult disaffection, so respondents are sometimes asked to answer as though they were not yet members, or as though they were still members, etc. Each of these flaws introduces unpredictability and subjectivity that make such study results unreliable… The evidence against the effectiveness of mind control techniques is even more overwhelming. Studies show that the vast majority of young people approached by new religious movements (NRMs) never join despite heavy recruitment tactics. This low rate of recruitment provides ample evidence that whatever techniques of purported mind control are used as cult recruiting tools, they do not work on most people. Even of those interested enough to attend a recruitment seminar or weekend, the majority do not join the group. Eileen Barker documents [Barker, Eileen. New Religious Movements: A Practical Introduction. London: Her Majesty’s Stationery Office, 1989.] that out of 1000 people persuaded by the Moonies to attend one of their overnight programs in 1979, 90% had no further involvement. Only 8% joined for more than one week, and less than 4% remained members in 1981, two years later:

. . . and, with the passage of time, the number of continuing members who joined in 1979 has continued to fall. If the calculation were to start from those who, for one reason or another, had visited one of the movement’s centres in 1979, at least 999 out of every 1,000 of those people had, by the mid-1980s, succeeded in resisting the persuasive techniques of the Unification Church.

Of particular importance is that this extremely low rate of conversion is known even to Hassan, the best-known mind control model advocate whose book [Hassan, Steven. Combatting Cult Mind Control. Rochester, VT: Park Street Press, 1990?] is the standard text for introducing concerned parents to mind control/exit counseling. In his personal testimony of his own involvement with the Unification Church, he notes that he was the first convert to join at the center in Queens; that during the first three months of his membership he only recruited two more people; and that pressure to recruit new members was only to reach the goal of one new person per member per month, a surprisingly low figure if we are to accept the inevitable success of cult mind control techniques.

Objection: High Attrition Rates

Additionally, natural attrition (people leaving the group without specific intervention) was much higher than the self-claimed 65% deprogramming success figure! It is far more likely a new convert would leave the cult within the first year of his membership than it is that he would become a long-term member.

Gomes, Unmasking the Cults (Wikipedia quote):

While advocates of the deprogramming position have claimed high rates of success, studies show that natural attrition rates actually are higher than the success rate achieved through deprogramming

“Psychological Manipulation and Society”, book review of Spying in Guruland: Inside Britain’s Cults, Shaw 1994

Eventually Shaw quit the Emin group. Two months later he checked in with some Emin members at the Healing Arts Festival, a psychic fair. He avoided many Emin phone invitations for him to attend another meeting. He discovered that most, if not all, of the people who joined with him had dropped out. This is consistent with what Shaw has noted about most cults and recruits: the dropout rate is high.

Anthony & Robbins 1992, “Law, Social Science and the ‘Brainwashing’ Exception to the First Amendment”:

Lifton and Schein are also characterized in Molko (54) as attesting to the effectiveness of brainwashing, although Schein, an expert on Chinese coercive persuasion of Korean War POWs, actually thought, as do a number of scholars, that the Chinese program was relatively ineffective (Schein, 1959, p. 332; see also Anthony, 1990a; Scheflin & Opton, 1978)…Schein appears to actually have considered the communist Chinese program to be a relative “failure” at least, “considering the effort devoted to it” (Schein, 1959, p. 332; Anthony, 1990a, p. 302)…Various clinical and psychometric studies of devotees of well-known “cults” (Ross, 1983; Ungerleider & Wellisch, 1979) have found little or no personality disorder or cognitive impairment.

  • Ross 1983. “Clinical profile of Hare Krishna devotees”, American Journal of Psychiatry
  • Schein, E. (1959). “Brainwashing and totalitarianization in modern society”. World Politics, 2, 430-441.
  • Ungerleider, T., & Wellisch, D. K. (1979). “Coercive persuasion (brainwashing), religious cults, and deprogramming”. American Journal of Psychiatry, 136(3), 279-282.

“Brainwashed! Scholars of cults accuse each other of bad faith”, by Charlotte Allen, Lingua Franca Dec/Jan 1998:

Zablocki’s conversion to brainwashing theory may sound like common sense to a public brought up on TV images of zombielike cultists committing fiendish crimes or on the Chinese mind control experiments dramatized in the 1962 film The Manchurian Candidate. But among social scientists, brainwashing has been a bitterly contested theory for some time. No one doubts that a person can be made to behave in particular ways when he is threatened with physical force (what wouldn’t you do with a gun pressed to your head?), but in the absence of weapons or torture, can a person be manipulated against his will?

Most sociologists and psychologists who study cults think not. For starters, brainwashing isn’t, as Zablocki himself admits, “a process that is directly observable.” And even if brainwashing could be isolated and measured in a clinical trial, ethical objections make conducting such a test almost unthinkable. (What sort of waivers would you have to sign before allowing yourself to be brainwashed?) In the last decade, while brainwashing has enjoyed a high profile in the media - invoked to explain sensational cult disasters from the mass suicide of Heaven’s Gate members to the twelve sarin deaths on the Tokyo subway attributed to the Aum Shinrikyo cult - social scientists have shunned the term as a symptom of Cold War paranoia and anticult hysteria. Instead, they favor more benign explanations of cult membership. Alternatives include “labeling” theory, which argues there is simply nothing sinister about alternative religions, that the problem is one of prejudicial labeling on the part of a mainstream culture that sees cult members as brainwashed dupes, and “preexisting condition” theory, which posits that cult members are people who are mentally ill or otherwise maladjusted before they join. (A couple of scholars have even proposed malnutrition as a preexisting condition, arguing that calcium deficiency may make people prone to charismatic susceptibility.)

Thus, when Zablocki published an indignant 2-part, 60-page defense of brainwashing theory in the October 1997 and April 1998 issues of Nova Religio, a scholarly journal devoted to alternative belief systems, he ignited a furor in the field. Pointing to the “high exit costs” that some cults exacted from those who tried to defect - shunning, forfeiture of parental rights and property, and veiled threats - Zablocki argued that these were indications of brainwashing, signs that some groups were using psychological coercion to maintain total control over their members. Although he admitted he could not prove brainwashing empirically, he argued that at the very least brainwashing should not be dismissed out of hand.

…Zablocki’s colleagues were unimpressed. In a response also published in Nova Religio, David Bromley, a sociologist at Virginia Commonwealth University who has studied the Reverend Sun Myung Moon’s Unification Church, complained that in Zablocki’s formulation brainwashing remained a vague, slippery, limiting, and ultimately untestable concept. Moreover, he pointed out, cults typically have low recruitment success, high turnover rates (recruits typically leave after a few months, and hardly anyone lasts longer than two years), and short life spans, all grounds for serious skepticism about the brainwashing hypothesis. Even if you overlook these facts, Bromley added, “the extraordinarily varied cultural origins, patterns of organizational development, and leadership styles of such groups pose a problem in explaining how they seem to have discovered the same ‘brainwashing’ psycho-technology at almost precisely the same historical moment.” A quick survey of the field reveals that Bromley is far from being the only doubter. Eileen Barker, a sociologist at the London School of Economics who has also studied the Unification Church, says, “People regularly leave the Moonies of their own free will. The cults are actually less efficient at retaining their members than other social groups. They put a lot of pressure on them to stay in - love-bombing, guilt trips - but it doesn’t work. They’d like to brainwash them, but they can’t.”

…To further complicate matters, researchers often bring very different, even conflicting approaches to their work. Psychologists, for example, tend to emphasize how a repeated environmental stimulus can elicit a conditioned response - depriving subjects of their autonomy. Sociologists, by contrast, typically endorse a voluntarist conversion model for religion, which posits that people join cults for generally rational reasons connected to the group’s ability to satisfy their needs: for a transcendent theology; for strong bonds of kinship and solidarity; for enough social support to enable them to quit drugs or otherwise turn their personal lives around. (For example, one study has shown that schizophrenics who joined cults functioned better than those who tried drugs or conventional psychotherapy.)

…In 1980 the New York state legislature, over objections from the American Civil Liberties Union, passed a bill that would have legalized deprogramming (it was vetoed by Governor Hugh Carey). “With deprogramming - with parents having their children abducted and held captive - the whole thing became intensely emotional,” says Thomas Robbins. “Who were the kidnappers: the parents, the cults, or the police? There were hard feelings on both sides.” Among the most outraged were social scientists who had never believed that people could be brainwashed into joining cults and who, as good civil libertarians, were appalled by deprogramming. Ofshe and Singer’s scholarly testimony (and fat fees) distressed a number of these scholars, whose credentials were equally respectable and whose own research had led them to conclude that coercive persuasion was impossible in the absence of some sort of physical coercion such as prison or torture.

…Zablocki made another, potentially more damning charge, however - one that Robbins did not take up. A significant amount of cult money, he wrote, has gone to scholars - in support of research, publication, conference participation, and other services. Zablocki did not name names. But a number of professors freely admit that nontraditional religions (in most cases, the Unificationists and Scientologists) have cut them checks. The list includes some of the most prominent scholars in the discipline: Bromley, Barker, Rodney Stark of the University of Washington, Jeffrey Hadden of the University of Virginia, and James Richardson, a sociologist of religion at the University of Nevada at Reno. All five have attended cult-subsidized conferences, and Bromley, Hadden, and Richardson have occasionally testified in court on behalf of cults or offered their services as expert witnesses against brainwashing theory. “This is an issue,” Zablocki wrote sternly, “of a whole different ethical magnitude from that of taking research funding from the Methodists to find out why the collection baskets are not coming back as heavy as they used to.”

Comments (102)

Comment author: Viliam_Bur 14 September 2013 10:06:42PM, 10 points

My model of a cult is a mechanism that exploits various flaws in human thinking. For example, peer pressure turned up to eleven: if you leave a cult, you lose all your friends at the same moment. (In real life, one's friends are usually not this coordinated.) The cult keeps you busy with all the cultish stuff, which makes the natural procrastination about important decisions (such as leaving the cult) even stronger. There is the initial "love bombing", which prevents you from estimating how happy you would be if you joined. Etc.

Typically, a conversion sticks because an organization provides value to its members.

Disagree connotationally. (Also I am not sure what "conversion sticks" means precisely. If a person spends 5 or 10 years in a cult and then leaves, do we consider those initial years as a success, because the person did not run away immediately?) Yes, technically, the organization provides something, but if we want to get more specific, it usually provides promises that something very good will happen in the unspecified - but very close - future. It also provides something immediately, for example a social group, but I think those promises are very important for many people. So if we define "value" as promises that sound good but are never fulfilled, then yes, the organization provides the value to its members. But if we define "value" as the thing that was promised, then it does not really provide the value.

Comment author: shminux 13 September 2013 11:52:45PM, 12 points

An interesting question is, given the general failure of brainwashing, how do new religions manage to take hold, like Christianity, Islam, Mormonism, Sikhism, etc.? How come Christian and Islamic proselytism has been so consistently successful in many parts of the world?

Comment author: gwern 14 September 2013 12:22:40AM, 20 points

These are good questions, and as you can imagine, much debated in the literature, with explanations ranging from acts of God (the paradigmatic example being the Jew quoted in Acts of the Apostles arguing that Christianity didn't need to be suppressed because if it flourished, it must be favored by God, and it would fail if it was disfavored by him) to enabling effective society coordination (particularly attractive for Islam: the horsebacked nomads managed to coordinate under Mohammed rather than feud, and did as well as the Mongols, with conversion then following from personal advantage and to escape dhimmitude) to arguments that it's just random drift (Carrier points out that the best estimates of the sizes of early Christianity as tiny even centuries after Jesus then necessarily imply that the annual growth rate must have been far tinier than commonly assumed).

Comment author: jimmy 14 September 2013 06:32:23PM, 5 points

Typically, a conversion sticks because an organization provides value to its members.

People do get value from religion. The big two seem to be social conformity and fear of death, but there are others. The only atheist that I personally know who converted to Christianity got a wife out of the deal.

Comment author: Ishaan 15 September 2013 04:58:12PM, 0 points

My uneducated guess is that it is because Christianity, Islam, Sikhism, Buddhism, and Judaism were all backed by governments and military forces during the initial stages of expansion. I don't believe there are any large religions for which this is not true - Hinduism is too old for us to say much about its origins, but there was a time when Buddhism was becoming extremely popular, and power was involved in re-establishing Hinduism.

If I'm right, then the thing that causes small memeplexes to become big memeplexes is the successful conversion of a few powerful and influential people (and that process happens through random drift in the case of religion).

Also, I think Christianity, Islam, and Judaism are the only religions which care about whether or not you believe them. (As in, members think that belief itself has consequences and so they ought to care what others believe.) It's harder to leave these religions, with shadows of hell hanging over you. I think that in most other religions, people can sort of vaguely redirect worship from one set of symbols to another without really rejecting old beliefs and accepting new ones in a way that is consistent with "brainwashing" - it's more or less immaterial which religion they are following. I've got relatives who pray to little pictures of Jesus along with other Hindu idols, and I don't think they realize how odd this would seem to a Christian. The notion that deviation from a religious orthodoxy is bad tends to be absent, and I imagine that this makes conversion easier.

Comment author: Brillyant 16 September 2013 04:10:13AM 4 points [-]

Taboo "brainwashing".

What does Christianity, for instance, succeed in doing, if not brainwashing?

It seems to me that its (sincere) adherents have been persuaded to believe Christianity is the most rational choice. They've been convinced it is the best wager available.

Is it? Is Christianity (or any religion) the best wager? Is it rational?

If not, then what can we say about the mechanism(s) used to get humans to be wholly convinced otherwise? What shall we name it? How does it work?

And how is this yet-nameless, magical process different from brainwashing?

Comment author: Viliam_Bur 16 September 2013 09:20:08AM *  4 points [-]

Religions succeed in making people believe in them. But how specifically? I propose three mechanisms:

First, by providing them some advantages. There may be advantages of the belief (belief in afterlife alleviates fear of death), advantages of explicitly commanded behavior (forbidding theft reduces costs of protecting property), other advantages (meeting every Sunday in church helps one meet their neighbors).

Second, by blackmailing them. If you leave the religion, you will be tortured forever in hell; and in some situations your former friends will murder you.

Third, by modifying their thoughts or behavior in ways that make leaving less likely. For example: teaching people that things can be "true" even if they are completely invisible and statistically undetectable (which makes them less likely to realize that the beliefs are wrong); making them too busy to do any long-term thinking (such as whether to leave the religion); or removing information sources or friends that could provide information or encouragement against the religion.

If this reflects reality well enough, I would suggest that the word "brainwashing" means using mainly the third and second kinds of mechanism (as opposed to mostly the first one, which feels legitimate). One religious group can give people friends and pleasant social activities, so the members are happy to be there. Another religious group can make them busy 16 hours a day (praying, meditating, converting new people, etc.) and destroy all out-group contacts, so the members' agency is crippled, and they stay even if they are unhappy.

Comment author: gwern 29 September 2014 05:58:17PM 3 points [-]

Has the rehabilitation of 'cults' begun? "The Cult Deficit", Ross Douthat:

LIKE most children of the Reagan era, I grew up with a steady diet of media warnings about the perils of religious cults — the gurus who lurked in wait for the unwary and confused, offering absolute certainty with the aftertaste of poisoned Kool-Aid. From the 1970s through the 1990s, from Jonestown to Heaven’s Gate, frightening fringe groups and their charismatic leaders seemed like an essential element of the American religious landscape. Yet we don’t hear nearly as much about them anymore, and it isn’t just that the media have moved on. Some strange experiments have aged into respectability, some sinister ones still flourish, but over all the cult phenomenon feels increasingly antique, like lava lamps and bell bottoms. Spiritual gurus still flourish in our era, of course, but they are generally comforting, vapid, safe — a Joel Osteen rather than a Jim Jones, a Deepak Chopra rather than a David Koresh.

...The decline of cults, while good news for anxious parents of potential devotees, might actually be a worrying sign for Western culture, an indicator not only of religious stagnation but of declining creativity writ large. The first writer is Philip Jenkins, a prolific religious historian, who argues that the decline in “the number and scale of controversial fringe sects” is both “genuine and epochal,” and something that should worry more mainstream religious believers rather than comfort them. A wild fringe, he suggests, is often a sign of a healthy, vital center, and a religious culture that lacks for charismatic weirdos may lack “a solid core of spiritual activism and inquiry” as well. The second writer is Peter Thiel, the PayPal co-founder, venture capitalist and controversialist, who includes an interesting aside about the decline of cults in his new book, Zero to One

...From the Franciscans to the Jesuits, groups that looked cultlike to their critics have repeatedly revitalized the Catholic Church, and a similar story can be told about the role of charismatic visionaries in the American experience. (The enduring influence of one of the 19th century’s most despised and feared religious movements, for instance, is the reason the state of Utah now leads the United States on many social indicators.)...When “people were more open to the idea that not all knowledge was widely known,” Thiel writes, there was more interest in groups that claimed access to some secret knowledge, or offered some revolutionary vision. But today, many fewer Americans “take unorthodox ideas seriously,” and while this has clear upsides — “fewer crazy cults” — it may also be a sign that “we have given up our sense of wonder at secrets left to be discovered.”

Comment author: gwern 14 January 2014 08:54:14PM 2 points [-]
Comment author: gwern 14 November 2013 06:56:31PM 2 points [-]

In Jin, Moon and his wife’s fourth child, seemed suited for the task. She had a modern American upbringing and a master’s degree from Harvard. In 2009, she took over the Unification Church of America and introduced a bold modernization program. Her aim, she said, was to transform the church into one that people—especially young people—were “dying to join.” She renamed the church Lovin’ Life Ministries, shelved the old hymn books, and launched a rock band, an offshoot of which played New York clubs under the moniker Sonic Cult. She also discarded the old Korean-inspired traditions: bows and chanting gave way to “Guitar Hero” parties, open mics, concerts, and ping-pong tournaments. What’s more, In Jin broke some long-standing taboos. Rather than adhering to the church line on arranged marriage, for example, she encouraged young people to play a role in choosing their own spouses. Her reforms were met with heated resistance. Across the country, Moon’s disciples took to the Internet to denounce In Jin’s “bling-bling” style and her “ridiculous accent.” One online critic dubbed her ministry the “mushroom church,” because “all you do is sit passively in the dark and are fed bovine excrement.” Within two years, nationwide monthly attendance plunged from roughly 26,000 to less than 7,500, according to internal church documents.

http://www.newrepublic.com/article/115512/unification-church-profile-fall-house-moon

In other words, some popularizing reforms which reduced apparent coercion and cultishness cut membership by 75% - more strikingly, despite being one of the most famous, notorious, politically influential 'cults', they were down to just 25k total in the USA in 2009.

Comment author: gjm 10 January 2017 04:04:20PM 0 points [-]

some popularizing reforms which reduced apparent coercion and cultishness cut membership by 75%

This seems like evidence against (a perhaps overstrong version of) the thesis of the OP, namely that cult "techniques" are ineffective. But note that

  • it's perfectly consistent with them not being scarily effective; and
  • it's also possible that these changes made no difference to (or even increased) the Moonies' ability to acquire new members and keep them in the short term, and that it cut their membership because longstanding members who were used to the old way of doing things hated the reforms.

Comment author: buybuydandavis 13 September 2013 09:22:56PM 5 points [-]

Cultist, n. One who is obstinately and zealously attached to an opinion that you do not entertain.

(That's actually "bigot" in the Devil's Dictionary, but cultist is a better fit to me.)

Comment author: wedrifid 14 September 2013 01:55:59AM *  0 points [-]

In the translation from 'bigot' to 'cultist' we could perhaps add "or group you do not approve of".

Comment author: JQuinton 17 September 2013 01:53:01PM *  3 points [-]

This is an excerpt from Valerie Tarico's web series "Christian Belief Through The Lens of Cognitive Science"

In revival meetings or retreats, semi-hypnotic processes draw a potential convert closer to the toggle point. These include repetition of words, repetition of rhythms, evocative music, and Barnum statements (messages that seem personal but apply to almost everyone, like horoscopes). Because of the positive energy created by the group, potential converts become unwitting participants in the influence process, actively seeking to make the group's ideas fit with their own life history and knowledge. Factors that can strengthen the effect include sleep deprivation or isolation from a person's normal social environment. An example would be a late-night campfire gathering with an inspirational story-teller and altar call at Child Evangelism's "Camp Good News."

These powerful social experiences culminate in conversion, a peak experience in which the new converts experience a flood of relief. Until that moment they have been consciously or unconsciously at odds with the group center of gravity. Now, they may feel that their darkest secrets are known and forgiven. They may experience the kind of joy or transcendence normally reserved for mystics. And they are likely to be bathed in love and approval from the surrounding group, which mirrors their experience of God.

Also, military basic training seems to employ some of these methods too:

To do this, however, we need a form of psychological training that is able to forge individuals who can do this. That is why boot camp has evolved to become such a potent tool in today's military machine.

The most important single thing to know about boot camp is that it is 100 percent designed to reprogram children and civilians into warriors. It places within them a sense that they are expected to do important things, far more important things than could be expected from other 18-year-olds. This is all happening during one of the most intensely stressful periods of your life, when you are kept isolated from contact from your family and friends and taught that everything you were before entering the Marines was weak and lacking any real value until you too are a Marine. Cults are made this way too. I'm just saying. But in all seriousness, the psychological transformation of boot camp is a very intense and intentional effort by the Marine Corps to make warriors able to fight and kill out of kids who have just barely left high school. From the point that you graduate boot camp, you will be different and have parts of the Marine Corps culture as part of your psyche.

[...]

Now we move on to something else very important and why I say that it is "psychological" retraining. You go through the next few days running from place to place, doing this, that, this, that and you won't even realize ... you haven't slept in three days. Yeah, you will go about three days without sleep upon arrival. The whole time you are completely exhausted while running on adrenaline and hearing over and over, that you are inferior. Inferior to real Marines, which you aren't yet. You aren't thinking about it, but it is sinking in. You are completely tired and these things build up. Without realizing it, you start to believe that that which is being told to you is true, that there is a weakness in you and that you are less than perfect. In your current state, you believe them and that you must change to be good enough.

(Caveat: I've been through bootcamp)

I'm not sure you could call this brainwashing, though. Not any more than you can call singing and dancing in synchrony brainwashing or doing extreme rituals. Like someone else said, taboo the word "brainwashing"; the word itself has a bunch of negative connotations. Brainwashing in the popular sense also assumes a sort of permanence, which is probably a strawman of what's actually going on.

Comment author: BrotherNihil 15 September 2013 12:57:16AM *  3 points [-]

My observation about cults, from personal experience leading them, is that they are a totally normal mode of human operation. People are always looking for strong leaders with vision, passion and charisma who can organize them for a larger purpose. What distinguishes a cult from a non-cult is that they are outside the norms of the mainstream society (as established by the dominant cults -- i.e. "the culture"). "Cult", "brainwashing", "deprogramming", etc. are terms of propaganda used by the dominant culture to combat competing memeplexes.

I think of cults as testbeds for new civilizations and new ways of life. In times of change, when the old ways are failing and the civilization is falling, cults may be well-positioned to expand and become the new normal. I suppose this is the memetic equivalent of marginal species that exploit mass extinctions to become genetically dominant -- cults provide memetic diversity. This is apparently what was going on in the declining years of Rome, and I see indications that something similar is happening today.

Comment author: Dahlen 15 September 2013 06:49:09PM *  11 points [-]

My observation about cults, from personal experience leading them

* raises eyebrow *

Comment author: niceguyanon 16 September 2013 08:57:54PM 5 points [-]

From BrotherNihil's website:

Lately I've been thinking a lot about how one could go about becoming an online Mohammed or Genghis Khan – a great leader who sends forth an army of trolls to conquer web sites for the Religion and the Empire. I don’t think it has been tried, but think it may be possible.

He wasn't kidding about the personal experience.

I say this because I find it quite easy to go to a web site and to begin to control the debate, stir up dissent, refute ideologies, recruit people, or otherwise manipulate the site as I see fit.

Heh... good luck with that here on LW.

Depending on the level of moderation, this may have to be done subtly, but there is always a way to counter whatever propaganda is being spread at a given site and to inject some of your own counter-propaganda. If one is clever and persistent, one should in this way be able to alter, destroy or co-opt any site according to one's agenda.

I take the crackpottery of his site as evidence to not take much of what he says seriously.

Comment author: Viliam_Bur 15 September 2013 06:40:53PM 2 points [-]

"Cult", "brainwashing", "deprogramming", etc. are terms of propaganda used by the dominant culture to combat competing memeplexes.

There is something like manipulation. To make this a discussion about anticipated experience, here is an experiment proposal:

Kidnap a few new members from different religious organizations. (It's just an imaginary experiment.) Keep them for one week isolated from their religious groups: no personal contact, no phone, no books. If they start to do some rituals they were told to do, for example repeating a mantra or singing a song, prevent them from doing so. Otherwise, don't do them any harm, and keep them in a nice environment. -- When the week is over, just let them go. Observe how many of them return to the original group. Compare with a control group of randomly selected members you didn't kidnap: how many of them remained in the group after the week? Are there statistically significant differences between religious groups?

My prediction is that there would be observable differences between religious groups. I believe there is some pressure involved in the process of recruitment in some religious (or not just religious) groups; some algorithm which increases the chances of membership when done properly, and fails when interrupted. Perhaps "brainwashing" is too strong a word, but it is a kind of manipulation. It consists of pushing the person towards more expressions of commitment, without giving them time to reflect on whether they really want it (whether it is okay with their other values).
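The proposed experiment reduces to comparing retention proportions across groups, a standard contingency-table test. A minimal sketch, where every count below is a made-up illustration rather than real data:

```python
# Pearson chi-squared test of homogeneity for retention across groups.
# All counts below are made-up illustrative data, not survey results.

def chi_squared_stat(table):
    """Chi-squared statistic for a table of (retained, left) counts per group."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# (retained, left) after the week of isolation, per hypothetical group:
counts = [(18, 2),   # group A: high-pressure recruitment
          (12, 8),   # group B
          (10, 10)]  # group C: low-pressure recruitment
stat = chi_squared_stat(counts)
# df = (rows-1)*(cols-1) = 2; the 5% critical value is about 5.99.
print(stat, stat > 5.99)  # about 7.8 here, so these hypothetical rates would differ significantly
```

With real data one would also want the p-value (e.g. from `scipy.stats.chi2_contingency`) and larger samples, since kidnapping a handful of members per group gives little statistical power.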

Comment author: gwern 15 September 2013 06:49:15PM 2 points [-]

Kidnap a few new members from different religious organizations. (It's just an imaginary experiment.)

This is pretty similar to what the deprogrammers did. Their success rates were not very high.

Comment author: [deleted] 15 September 2013 07:30:50PM 2 points [-]

People like to resist coercion. Reactions to being kidnapped in order to be forced to abandon the cult could be different than reactions to being kidnapped and held for a week by a mad psychologist with a mysterious agenda. Though for the agenda to be mysterious, the idea of preventing them from engaging in rituals would have to be abandoned.

Comment author: shminux 13 September 2013 10:04:15PM 3 points [-]

Very interesting and surprising. A priori I would have expected the most successful NRMs to be at least 10-20% effective in one-year new member retention. I wonder how non-religious non-mainstream organizations that demand some amount of sacrifice from their members measure up? E.g. what are the retention rates in online forums, gaming communities, fitness centers, etc...?

Comment author: James_Miller 13 September 2013 10:34:33PM 1 point [-]

I wonder what percentage of adult North Koreans have been successfully brainwashed by their government to the extent that, say, they believe that their country's founding dictator was one of the greatest forces for good the world has ever known. What's your estimate?

Comment author: [deleted] 14 September 2013 03:56:34PM 6 points [-]

No “I'm not going to vote; just show me the results” option?

Comment author: gwern 14 September 2013 12:04:21AM 14 points [-]

In the Korean context, surveys have been done of defectors (for the obvious reasons) to try to gauge the current level of support for the regime. The result is sadly predictable for anyone who's seen Russians nostalgic for Stalin or Chinese wistfully thinking back to Mao: Il-Sung is still venerated by many North Koreans, even if they don't like his son or despise the pig-grandson.

Some survey data is summarized in The Hidden People of North Korea: Everyday Life in the Hermit Kingdom and "An Assessment of the North Korean System's Durability" is an extensive discussion of defector surveys. (Apparently in the 2002 defector survey, 67% of them believed their countrymen venerated Il-Sung as the "greatest mind of humanity". Many interesting bits, like "Few North Koreans seem aware that the United States has been one of North Korea's principal food donors.")

Comment author: gwern 28 September 2013 11:49:37PM 5 points [-]

From a new paper, "Preparing for the Possibility of a North Korean Collapse", Bennett 2013 (RAND):

...Since the end of the Korean War, the North Korean government has indoctrinated its population, only allowing them access to state-generated information. But information on the outside is spreading in North Korea, debunking at least some of the North Korean propaganda, and generating the potential for instability: “There is mounting evidence that Kim Jong Il is losing the propaganda war inside North Korea, with more than half the population now listening to foreign news, grass-roots cynicism undercutting state myths and discontent rising even among elites.”53 Analyzing the results of their survey of North Korean refugees in China and South Korea, Marcus Noland and Stephen Haggard have identified a number of significant shifts in information and resulting North Korean attitudes:

  • The survey found that roughly half of North Koreans have access to foreign news or entertainment, a sharp rise from the 1990s, eroding faith in the regime’s statements that the United States is causing its woes.54
  • “Not only is foreign media becoming more widely available, inhibitions on its consumption are declining as well,” the report said, referring to broadcasts from South Korea, China and the United States. “The availability of alternative sources of information undermines the heroic image of a workers’ paradise and threatens to unleash the information cascade that can be so destabilizing to authoritarian rule.”55
  • A survey of refugees has found that “everyday forms of resistance” in the North are taking root as large swaths of the population believe that pervasive corruption, rising inequity and chronic food shortages are the fault of the government in Pyongyang—and not of the United States, South Korea or other foreign forces. . . .
  • “Evaluations of the regime appear to be getting more negative over time,” the report said. “Although those who departed earlier were more willing to entertain the view that the country’s problems were due to foreigners, respondents who left later were more likely to hold the government accountable.” . . .
  • The survey found that cynicism about the government—and willingness to crack jokes about its failures—was higher among refugees who come from elite backgrounds in the government or military. It also found that distaste for the government was strongest among those deeply involved in the markets.56

...With much more outside information penetrating into the North Korean society, a significant number of citizens likely believe at least parts of that information:

The regime has made desperate and increasingly futile efforts to maintain a stranglehold on information, such as periodic crackdowns by the authorities on mobile phones brought in from China and seizures of widely popular and avidly watched South Korean soap operas recorded on video and DVD.57

Even the North Korean military is not exempt:

An increasing number of North Korean military officers and soldiers are caught watching South Korean films or soap operas in barracks, sources say. A Beijing-based source who visits the North often said Monday, “Several Army officers and soldiers have been caught watching South Korean movies or TV dramas since last year, and the military has been providing extensive indoctrination for all officers and soldiers with a view to preventing the cultural infiltration of imperialism.”58

Corruption in the army has become so widespread that the government authorized the civilian police (the People’s Safety Agency) to investigate cases of corrupt military personnel. Previously, the military police handled such investigations, but the government believes the military police have become corrupted, and can no longer be trusted to find and punish soldiers involved in criminal acts (stealing, or aiding smugglers to get across the border). All this reflects poorly on the National Security Agency (secret police), who are also seen as corrupted.59

...Some early defectors in 1987 said, “[w]hen we lived in the North, we were told that South Korea was a living hell.”5 A defector in 2006 said, “When I came to the South and saw how rich it was, I was very angry at the Pyongyang regime.”6 The influx into the North of information about South Korea has weakened this propaganda line. While it is still repeated on occasion, now North Koreans are told that the South

has lost its true national identity, so its inhabitants are full of admiration toward the spiritual purity of their Northern brethren. The southerners, the propaganda claims, also badly want to purify themselves under the wise guidance of the Dear Leader Kim Jong-il (allegedly a cult figure in both the South and the North).7

Brian Myers, another remarkable specialist on North Korean culture and propaganda (not quite distinguishable areas, actually), recently wrote at length about a change of tune in Pyongyang propaganda: South Korea ceased to be depicted as the living hell, the land of depravation. The new image of the South is that of the country whose population secretly (or even not so secretly) longs to join its Northern brethren in their happiness under the wise care of the Beloved General.8

Apparently, DVDs and other information from the ROK have penetrated so much into North Korea that the argument of ROK impoverishment is not credible with many in the North and undermines overall North Korean propaganda. So an alternative approach is being taken to keep the multidimensional propaganda approach viable, claiming that the ROK is now poor in wise guidance and leadership.

Comment author: DanArmak 14 September 2013 08:47:01AM 2 points [-]

And that's just for defectors, a group presumably subject to a selection effect in favour of being against Il-Sung.

Comment author: DavidAgain 14 September 2013 02:49:06PM 7 points [-]

Note that the survey says that they believe that their *countrymen* venerated Il-Sung. Defectors may be likely to dislike Il-Sung themselves, but my (low-certainty) expectation would be that they'd be more likely to see the population at large as slavishly devoted. People who take an unusual stance in a society are quite likely to caricature everyone else's position and increase the contrast with their own. Mind you, they sometimes take the 'silent majority' line of believing everyone secretly agrees with them: I don't know which would be more likely here.

But I'd guess that defectors would both be more likely to think everyone else is zealously loyal, AND more likely to believe that everyone wishes they could overthrow the government. I'd imagine them to be more likely to end up at the extremes, in short.

Comment author: shminux 13 September 2013 11:47:57PM 6 points [-]

Not sure what the purpose of this poll is. Brainwashing from birth with little or no exposure to alternative views is a quite different environment from the one NRMs operate in. How many Americans or Greeks (or pre-war Germans) believe that their country is the greatest? How many Russians believed in Communism in 1950s? The numbers are clearly a lot higher than any cult can hope to achieve.

Comment author: gwern 14 September 2013 12:11:38AM 13 points [-]

In particular, North Korea clamps heavily down on unauthorized information and makes up a lot of stuff. When your data is bad, it's not too surprising if your conclusions are bad.

Even people who are cynical about the regime probably aren't cynical enough. I forget the book I read this in (The Cleanest Race?) but I recall reading one story about a high-level NK official who was aware of the many abuses, but it wasn't until he learned from the Russian archives that the Korean War had actually been started by Kim Il-Sung after Stalin gave his permission (the official NK version is that the bloodthirsty capitalist SK dictator Syngman Rhee invaded NK unprovoked) that he realized just how far down the rabbit hole he had to go.

Comment author: Protagoras 14 September 2013 02:54:20AM *  3 points [-]

Admittedly, from what I recall of Rhee, it's likely that the only reason he didn't invade the North is because he knew how badly he'd lose; it's totally something he would have done if he'd had a better military.

Comment author: ikrase 14 September 2013 07:56:22AM 0 points [-]

Yeah, it's actually enough to make me wonder if just forcing information into the country would trigger a rebellion...

Comment author: DavidAgain 14 September 2013 02:45:10PM 3 points [-]

I don't think 'brainwashing' is a helpful or accurate term here, in the sense that I think most people mean it (deliberate, intensive, psychological pressure of various kinds). Presumably most North Koreans who believe such a thing do so because lots of different authority sources say so and dissenting voices are blocked out. I'm not sure it's helpful to call this 'brainwashing', unless we're going to say that people in the middle ages were 'brainwashed' to believe in monarchy, or to be racist, or to favour their country over their neighbours etc.

Even outside of repressive regimes, there are probably a whole host of things that most Americans believe that most Brits don't and vice versa, and that's in a case with shared language and culture. I'm not sure 'brainwashing' can be used just because lots of people in one place believe something that hardly anyone from outside does.

Comment author: TheAncientGeek 13 May 2014 06:51:56PM 1 point [-]

There are two theories here. One is that brainwashing is a rare and ineffective thing. The other is that acculturation, or whatever it is, is pervasive, effective, and largely unnoticed, and that the reason the NRMs aren't too effective is that the standard societal indoctrination is hard to budge.

Comment author: private_messaging 14 September 2013 01:38:40PM *  0 points [-]

I would estimate 66% or so, on the basis that a multitude of experiments found that about 2/3 of people are considerably more susceptible to authority than the rest, but I am not sure to what extent they managed to kill off the 1/3, or to what extent the 1/3's conditional compliance counts towards "successfully brainwashed". edit: ahh, you say founding dictator. Well, then it could easily be higher, because it's a much less practical thing to think rebellious thoughts about right now.

Comment author: wedrifid 14 September 2013 01:47:16PM 8 points [-]

about 2/3 of people are more susceptible to authority than the rest

It would seem that one could replace "2/3" with any other proper fraction and that finding would remain true.

Comment author: 4hodmt 14 September 2013 04:41:09PM 2 points [-]

Editing the quote to remove the "considerably" changes the meaning. The original is not a tautology because the "considerably" suggests a visible step in the curve.

Comment author: wedrifid 15 September 2013 12:54:10AM 4 points [-]

Editing the quote to remove the "considerably" changes the meaning. The original is not a tautology because the "considerably" suggests a visible step in the curve.

I didn't remove a word. The original was edited to change the meaning.

Comment author: private_messaging 16 September 2013 12:01:51PM *  1 point [-]

Yea, you merely interpreted it in a ridiculous way that was not intended, thus requiring an extra word where none would have been needed if the maxim of relevance held at all.

Comment author: wedrifid 16 September 2013 12:20:05PM 1 point [-]

Yea, you merely interpreted it in a ridiculous way that was not intended, thus requiring an extra word where none would have been needed if the maxim of relevance held at all.

Your edited version is far more useful. Thankyou.

Comment author: 4hodmt 15 September 2013 02:51:17AM 0 points [-]

My apologies then. It would be useful if LessWrong marked edited posts as edited.

Comment author: Douglas_Knight 15 September 2013 03:24:40AM *  4 points [-]

It does mark edited comments, by an * after the date. It does not mark edits to top-level posts or edits by admins (even self-edits by admins, which is clearly a bug).

Comment author: 4hodmt 15 September 2013 03:45:19AM 2 points [-]

Thanks, I didn't notice the '*'s.

Comment author: Mestroyer 14 September 2013 06:12:48PM *  0 points [-]

private_messaging's post is edited. I bet wedrifid quoted it as it originally was, and private_messaging edited it later to change the meaning. Edit2: (To change my post's meaning, heh) or to clarify the original intended meaning.

Edit: fixed formatting error caused by not escaping the underscore in private_messaging's name.

Comment author: ChristianKl 14 September 2013 08:11:24PM 0 points [-]

If there were a visible step in the curve, that would be interesting. If anyone has a source that makes such a claim, please provide it.

Comment author: private_messaging 14 September 2013 02:34:17PM *  1 point [-]

Well, it still seems odd that with different setups of e.g. the Milgram experiment, various conformity experiments, and such, around 2/3 is the number rather than some dramatically different fraction (which suggests that in practice the change in susceptibility is greater around that percentile, which is of course what I meant). There really is no data to use to get any sort of specific number for North Korea, at all, but if you have to guess you have to name something. I'd be cautious of over-estimating the power of brainwashing over there. Especially considering how many people they did have to put through prison camps and such.

Comment author: ChristianKl 14 September 2013 08:13:03PM 1 point [-]

Depending on the specifics used during the Milgram experiment you get different results. It matters whether the person being shocked is in the same room. Whether or not you use a setting that gives you 2/3 of the people is arbitrary.

Comment author: private_messaging 14 September 2013 03:05:21PM *  0 points [-]

It always seemed obvious to me that cults have rather low conversion rates.

Cults do not optimize for having many members. They optimize for the dedication of the members. This may be because the typical cult leader would rather have 10 people believe that he is the saviour and the messenger of God, than have 1000 people believe that he's merely a good guy.

(I tend to delineate cults/non-cults on the basis of how they resolve this trade-off between extremism and popularity)

Comment author: gwern 14 September 2013 03:34:41PM 7 points [-]

Cults do not optimize for having many members. They optimize for the dedication of the members. This may be because the typical cult leader would rather have 10 people believe that he is the saviour and the messenger of God, than have 1000 people believe that he's merely a good guy.

No one in the literature suggests this, and cults (just like mainstream religions such as Mormonism) invest enormous efforts into proselytization, rather than strenuous filtering of existing converts. The efforts just don't succeed, and like the Red Queen, minority religions need to run as fast as they can just to stay in place.

Comment author: private_messaging 14 September 2013 04:44:19PM *  2 points [-]

The low rate of retention is extreme filtering. The cults try to get members to sever ties with the family and friends, for example - and this is a filter, most people get creeped out and a few go through with it. edit: and of course, with such extreme filtering, one needs a lot of proselytism to draw just a hundred very dedicated supporters.

Comment author: gwern 14 September 2013 06:35:55PM 4 points [-]

The low rate of retention is extreme filtering.

You are arguing by definition here; please consider what could falsify your mental model of cults. If my local gym discovers only 1% of the people joining after New Years will stick around for more than a year, does that necessarily imply that the gym is ruled by a charismatic leader driving people away so as to maximize the proportion of unthinkingly loyal subordinates?

Low rate of retention is simply low rate of retention. This can be for a great many reasons, such as persecution, more attractive rival organizations, members solving their problems and leaving, or (way down the list) extreme filtering for loyalty which drives away otherwise acceptable members. How often do you see a cult leader going 'well, sure, we could have thousands more members if we wanted (people are pounding down the doors to convert), and majorly increase our donations and financial holdings, but gosh, we wouldn't want to sell out like that!'

Of course, like any organization, there's concerns about freeriding and wasting club goods and it'll seek to strike a balance between inclusiveness and parasite load; but a cult which has 'successfully' shed all but a few fanatics is a cult which is about to become history.

The cults try to get members to sever ties with the family and friends, for example

Recruiting through family and friends is a major strategy of cults - indeed, perhaps the only strategy which does not have abysmally low success rates.

Comment author: private_messaging 14 September 2013 06:59:26PM 1 point [-]

Low rate of retention is a product of many reasons simultaneously, including the extreme weird stuff creeping people out. If your local gym is creepy, it will have a lower retention rate than the same gym that is not creepy.

My mental model of failed retention includes the general low retention rate, in combination with the weird things that cult does creeping people out, on top of that.

How often do you see a cult leader going 'well, sure, we could have thousands more members if we wanted (people are pounding down the doors to convert), and majorly increase our donations and financial holdings, but gosh, we wouldn't want to sell out like that!'

I rarely see people reflect on their motives or goal structure. You often see a cult leader abusing a cultist, which leads insufficiently dedicated cultists to leave. Such actions sacrifice quantity for "quality".

Recruiting through family and friends is a major strategy of cults - indeed, perhaps the only strategy which does not have abysmally low success rates.

Yes, and a lot of the time that fails, and the family members start actively denouncing the cult, and the member has to choose between the family and friends, and the cult, at which point, well, few choose the cult.

Comment author: gwern 14 September 2013 10:31:36PM 2 points [-]

Low rate of retention is a product of many reasons simultaneously, including the extreme weird stuff creeping people out.

As pointed out in the OP by one author, the cults in question have in many ways been assimilated by the mainstream and so are far less 'weird' than ever before. Has that helped their retention rates? Environmentalism and meditation are completely mainstream now, have the Hare Krishnas staged a comeback?

If your local gym is creepy, it will have a lower retention rate than the same gym that is not creepy.

The counterfactual is not available or producible, and so this is meaningless to point out. If the Hare Krishnas did not hold 'creepy' beliefs, in what sense is this counterfactual organization similar to the Hare Krishnas? If Transcendental Meditators did not do as weird a thing as meditate, how are they Transcendental Meditators? Defining away all the unique characteristics does not add any insight.

You often see a cult leader abusing a cultist, which leads insufficiently dedicated cultists to leave.

"You often see a boss abusing a subordinate, which leads insufficiently dedicated employees to leave. This is because bosses wish to sacrifice quantity and being able to handle work for 'quality' of subordinates."

No, there is nothing unique about cults in this respect. Monkeys gonna monkey. And for the exact same reason businesses do not casually seek to alienate 99% of their employees in order to retain a fanatical 1%, you don't see cults systematically organization-wide try to alienate everyone. You see a few people in close proximity to elites being abused. Just like countless other organizations.

the member has to choose between the family and friends, and the cult, at which point, well, few choose the cult.

Which explains the success of deprogrammers, amirite?

Comment author: private_messaging 14 September 2013 11:25:17PM *  -1 points [-]

I don't see how environmentalism or for that matter meditation itself is creepy.

What's creepy about Hare Krishnas is the zoned-out, sleep-deprived look on the faces (edit: I am speaking of the local ones, from experience), and the whole obsession with the writings of the leader thing, and the weirdly specific rituals. Now that environmentalism and meditation are fairly mainstream, you don't have to put up with the creepy stuff if you want to be around people who share your interests in environmentalism and meditation. You have less creepy alternatives. You can go to a local yoga class that manages to have the same number of people attending as the local Krishna hangout, despite not trying nearly as hard to find new recruits. You can join a normal environmentalist group.

No, there is nothing unique about cults in this respect. Monkeys gonna monkey. And for the exact same reason businesses do not casually seek to alienate 99% of their employees in order to retain a fanatical 1%, you don't see cults systematically organization-wide try to alienate everyone. You see a few people in close proximity to elites being abused. Just like countless other organizations.

The difference is, of course, in extent. For example, putting up a portrait of the founder at every workplace (or perhaps in a handbook, or the like) would be something that a cult leader would do in a cult, but what a corporation would seldom ever do because doing so would be counter-productive.

edit: actually. What do you think makes joining a cult worse than joining a club, getting a job, and so on? Now, whatever that is, it makes it harder to get new recruits, and requires more dedication.

Comment author: gwern 15 September 2013 06:35:22PM *  3 points [-]

I don't see how environmentalism or for that matter meditation itself is creepy.

Which goes to show how far into the zeitgeist they've penetrated. Go back to the 1960s when the cult panic and popular image of cults was being set, and things were quite different. One of the papers discusses a major lawsuit accusing the Hare Krishnas of 'brainwashing' a teen girl when she ran away from home and stayed with some Krishnas; the precipitating event was her parents getting angry about her meditating in front of a little shrine, and ripping it out and burning it (and then chaining her to the toilet for a while). To people back then, 'tune in, turn on, drop out' sounds less like a life choice than a threat...

What's creepy about Hare Krishnas is the zoned out sleep deprived look on the faces (edit: I am speaking of the local ones, from experience)

Well, I can hardly argue against your anecdotal experiences.

the whole obsession with the writings of the leader thing,

Supreme Court - jurists or cultists? Film at 11. We report, you decide.

and weirdly specific rituals.

I don't even know what 'weirdly specific' would mean. Rituals are generally followed in precise detail, right down to the exact repetitive wording and special garments like Mormon underpants; that's pretty much what distinguishes rituals from normal activities. Accepting Eucharist at mass? Ritual. Filling out a form at the DMV? Not ritual.

You can go to a local yoga class that manages to have the same number of people attending as the local Krishna hangout, despite not trying nearly as hard to find new recruits.

Hmm, where was one to find yoga back then... Ah yes, also in cults. Ashrams in particular did a lot of yoga. Interesting that you no longer have to go to an ashram or fly to India if you want to do yoga. It's almost like... these cult activities have been somehow normalized or assimilated into the mainstream...

You can join a normal environmentalist group.

And where did these environmentalist groups come from?

For example, putting up a portrait of the founder at every workplace (or perhaps in a handbook, or the like) would be something that a cult leader would do in a cult, but what a corporation would seldom ever do because doing so would be counter-productive.

Really? That seems incredibly common. Aside from the obvious examples of many (all?) government offices like post offices including portraits of their supreme leader - I mean, President - you can also go into places like Walmart and see the manager's portrait up on the wall.

What do you think makes joining a cult worse than joining a club, getting a job, and so on?

Personally? I think it's mostly competition from the bigger cults. Just like it's hard to start up a business or nonprofit.

Comment author: private_messaging 15 September 2013 06:45:08PM *  -2 points [-]

What do you think makes joining a cult worse than joining a club, getting a job, and so on?

Personally? I think it's mostly competition from the bigger cults. Just like it's hard to start up a business or nonprofit.

That doesn't even make sense as an answer. The rest likewise doesn't seem in any way contradictory to the point I am making, but is posed as such.

Comment author: gwern 15 September 2013 06:47:41PM *  3 points [-]

That doesn't even make sense as an answer.

Of course it makes sense. As I've already claimed, cults are not engaged in some sort of predatory 'brainwashing' where they exploit cognitive flaws to just moneypump people with their ultra-advanced psychological techniques: they offer value in return for value received, just like businesses need to offer value to their customers, and nonprofits need to offer some sort of value to their funders. And these cults have plenty of established competition, so it makes sense that they'd usually fail. Just like businesses and nonprofits have huge mortality rates.

The rest likewise doesn't seem in any way contradictory to the point I am making, but is posed as such.

I've given counter-examples and criticized your claims. Seems contradictory to me.

Comment author: Luke_A_Somers 16 September 2013 03:39:30PM 0 points [-]

I wasn't around in the 60s and wasn't aware for any of the 70s, but... Environmentalism seems qualitatively different from everything else here. Is there some baggage to this beyond, say, conservation, or assigning plants and animals some moral weight, that is intended here?

Something may have seemed weirder in the past because it was weirder back then.

I suspect few modern Christians would sign up for AD 200 Christianity.

Comment author: gwern 16 September 2013 04:33:59PM 3 points [-]

Environmentalism seems qualitatively different from everything else here. Is there some baggage to this beyond, say, conservation, or assigning plants and animals some moral weight, that is intended here?

Not really, aside from the standard observation that you can just as easily play the 'find cult markers' game with environmental groups like Greenpeace or ELF. Cleansing rituals like recycling, intense devotion to charismatic leaders, studies of founding texts like Silent Spring, self-abnegating life choices, donating funds to the movement, sacralization of unusual objects like owls or bugs, food taboos ('GMOs'), and so on and so forth.

Comment author: Jiro 15 September 2013 05:05:34PM -1 points [-]

Environmentalism and meditation are completely mainstream now, have the Hare Krishnas staged a comeback?

I would suggest that if beliefs held by cults become mainstream, that certainly decreases one barrier to such a cult's expansion, but because there are additional factors (such as creepiness) that alone is not enough to lead the cult to expand much. It may be that people's resistance to joining a group drastically increases if the group fails any one of several criteria. Just decrementing the number of criteria that the group fails isn't going to be enough, if even one such criterion is left.

"You often see a boss abusing a subordinate, which leads insufficiently dedicated employees to leave. This is because bosses wish to sacrifice quantity and being able to handle work for 'quality' of subordinates."

The level of abuse done by bosses and cult leaders is different, so although the statement is literally true for both bosses and cult leaders, it really doesn't imply that the two situations are similar.

Comment author: gwern 15 September 2013 06:45:06PM 1 point [-]

It may be that people's resistance to joining a group drastically increases if the group fails any one of several criteria.

Maybe, but I don't know how we'd know the difference.

The level of abuse done by bosses and cult leaders is different, so although the statement is literally true for both bosses and cult leaders, it really doesn't imply that the two situations are similar.

Is it really? Remember how many thousands of NRMs there have been over the decades, and how people tend to repeatedly discuss a few salient examples like Scientology. Can we really compare regular bosses that favorably with religious figures? Aside from the Catholic Church scandal (with its counterparts among other closemouthed groups like Jewish and Amish communities), we see plenty of sexual scandals in other places like the military (the Tailhook scandal as the classic example, but there's plenty of recent statistics on sexual assault in the military, often enabled by the hierarchy).

Comment author: ChristianKl 14 September 2013 08:26:46PM 2 points [-]

The cults try to get members to sever ties with the family and friends, for example - and this is a filter, most people get creeped out and a few go through with it.

I'm not sure whether that's true. You have people on LessWrong talking about cutting family ties with nonrational family members and nobody gets creeped out.

I don't think I have ever witnessed people getting creeped out by such discussions in the self help area and I think I have frequently heard people encouraging others to cut ties with someone that "holds them back".

Comment author: yli 15 September 2013 01:42:40AM *  5 points [-]

Really? Links? A lot of stuff here is a bit too culty for my tastes, or just embarrassing, but "cutting family ties with nonrational family members"?? I haven't been following LW closely for a while now so I may have missed it, but that doesn't sound accurate.

Comment author: Douglas_Knight 15 September 2013 03:38:52AM 3 points [-]

Here's an example.

Comment author: Mestroyer 15 September 2013 02:56:56PM 3 points [-]

diegocaleiro didn't say they were just irrational:

(1) Stupid (2) Religious (3) Non-rationalists (4) Absolutely clueless about reality (5) Pushy about inserting their ideas/ideals/Weltanschauung/motifs into you?

I strongly suspect that this isn't a case of "My family members don't believe as I do, therefore fuck those guys." but rather "These family members know that I am nonreligious and aggressively proselytize because of it." This probably isn't even about rationality or LessWrong, rather atheism.

Note also that it is diegocaleiro who initiated the conversation, and note the level of enthusiasm about the idea received from other posters (only ChristianKl's and Benito's responses seem wholly in favor, ViliamBur's and drethelin's responses are against, shminux's and BenLandauTaylor's responses are neutral).

Comment author: Eugine_Nier 15 September 2013 03:37:49PM 4 points [-]

"These family members know that I am nonreligious and aggressively proselytize because of it."

Outside view: These family members know that [diegocaleiro joined a group with weird non-mainstream religious beliefs] and [are trying to deconvert him].

Comment author: yli 15 September 2013 04:01:21AM *  2 points [-]

Thanks for the link. I don't really see creepy cult isolation in that discussion, and I think most people wouldn't, but that's just my intuitive judgment.

Comment author: ChristianKl 15 September 2013 09:56:07AM 6 points [-]

That's the point. It doesn't look that way from the inside.

If someone told those family members that the OP cut family ties with them because he made a rational analysis with help from his LessWrong friends, those family members might see it as an example of the evil influence that LessWrong has on people.

Comment author: Costanza 14 September 2013 08:48:23PM 6 points [-]

I'm at least mildly creeped out by occasional cultish behavior on LessWrong. But every cause wants to be a cult.

Eliezer said so, so therefore it is Truth.

Comment author: wedrifid 15 September 2013 02:10:17AM *  -1 points [-]

I'm not sure whether that's true. You have people on LessWrong talking about cutting family ties with nonrational family members and nobody gets creeped out.

I do not believe you. If it is the case that people talk about cutting family ties with 'nonrational family members' then there will be people creeped out by it.

Note that if the 'nonrational' family members also happen to be emotionally abusive family members this would not match the criteria as I interpret it. (Even then I expect some people to be creeped out by the ties cutting and would expect myself to aggressively oppose such expressions so as to suppress a toxic influence.)

Comment author: Eugine_Nier 15 September 2013 12:55:11PM 5 points [-]

Note that if the 'nonrational' family members also happen to be emotionally abusive family members this would not match the criteria as I interpret it.

You do realize that a lot of cults tend to classify normal family reactions, e.g., attempting to get the person out of the cult, as emotional abuse.

Comment author: wedrifid 15 September 2013 01:33:36PM *  3 points [-]

You do realize that a lot of cults tend to classify normal family reactions, e.g., attempting to get the person out of the cult, as emotional abuse.

I don't care and I'm somewhat outraged at this distortion of reasoning. It is so obviously bad and yet remains common and is all too seldom refuted. Emotional abuse is a sufficiently well defined thing. It is an undesirable thing. Various strategies for dealing with it are possible. In severe cases and in relationships where the gains do not offset the damage then severing ties is an appropriate strategy to consider. This doesn't stop being the case if someone else also misuses the phrase 'emotional abuse'.

Enduring emotional abuse rather than severing ties with the abuser because sometimes cultists sever ties while using that phrase is idiotic. Calling people 'creepy' for advocating sane, mainstream interpersonal strategies is absurd and evil.

Comment author: Kaj_Sotala 18 September 2013 05:44:00AM *  4 points [-]

I don't care and I'm somewhat outraged at this distortion of reasoning. It is so obviously bad and yet remains common and is all too seldom refuted.

Sorry, exactly what is it that you're outraged about? Eugine seemed to merely be pointing out that people inside particular social groups might see things differently than people outside them, with the outsiders being creeped out and insiders not being that. More specifically, that things that we deem okay might come off as creepy to outsiders. That seems correct to me.

Comment author: wedrifid 18 September 2013 06:44:59AM 0 points [-]

Sorry, exactly what is it that you're outraged about?

As a general policy:

  • All cases where non-sequitur but technically true claims are made where the actual implied rhetorical meaning is fallacious. Human social instincts are such that most otherwise intelligent humans seem to be particularly vulnerable to this form of persuasion.
  • All arguments or insinuations of the form "Hitler, Osama Bin Laden and/or cultists do <something superficially similar to X>. Therefore, if you say that <X> is ok then you are Bad."
  • Additional outrage, disdain or contempt applies when:
    • The non-sequiturs are, through either high social skill or (as in this case) plain luck, well calibrated to persuade the audience despite being bullshit.
    • Actual negative consequences can be expected to result from the epistemic damage perpetrated.

Comment author: Kaj_Sotala 20 September 2013 11:38:45AM *  5 points [-]

Thanks, that sounds reasonable. I didn't interpret Eugine's comments as being guilty of any of those, though.

Comment author: Eugine_Nier 19 September 2013 07:28:43AM -1 points [-]

All cases where non-sequitur but technically true claims are made where the actual implied rhetorical meaning is fallacious. Human social instincts are such that most otherwise intelligent humans seem to be particularly vulnerable to this form of persuasion.

In my experience nearly all accusations that someone is being "emotionally abusive" are of this type.

Comment author: Eugine_Nier 15 September 2013 03:29:46PM 6 points [-]

Emotional abuse is a sufficiently well defined thing. It is an undesirable thing.

So could you provide a definition? The article you linked to begins by saying:

As of 1996, there were "no consensus views about the definition of emotional abuse."

And then proceeds to list three categories that are sufficiently vague to include a lot of legitimate behavior.

Enduring emotional abuse rather than severing ties with the abuser because sometimes cultists sever ties while using that phrase is idiotic.

You don't seem to be getting the concept of "outside view". Think about it this way: as the example of cults shows, humans have a bias that makes them interpret Bob attempting to persuade Alice away from one's meme set as emotional abuse. Consider the possibility that you're also suffering from this bias.

Comment author: Phenoca 06 September 2016 05:37:06PM 1 point [-]

I would say demonization and ostracism count as coercion. Religions use sexual identity shaming, existential fears, 'universal morality', and promises of eternal happiness in an 'afterlife' to fallaciously bring followers under bit & bridle. As soon as a religious authority stoops to the "you're being controlled by evil spirits" argument, it counts as brainwashing. Cult authorities will use this to demonize any and all forms of skepticism, sexual relationships, skipping worship sessions, or interaction with ex-members. Essentially, if you disagree with the head priest, you are going to have some sort of livestock factory farm-esque afterlife full of eternal torment! If that doesn't count as "systematic and often forcible pressure" then I don't know what does. Perhaps enforced chastity and demonization of orgasms..? Psychological coercion is extremely easy, as humans are extremely manipulable, controlled by emotions, and irrational. Add in a few existential fears, some comforting fallacy, and perhaps some sex appeal, and you've got yourself a recruitment platform for your religion.