“Brainwashing”, as popularly understood, does not exist or is of almost zero effectiveness. The belief stems from American panic over Communism post-Korean War combined with fear of new religions and sensationalized incidents; in practice, “cults” have retention rates in the single percentage point range and ceased to be an issue decades ago. Typically, a conversion sticks because an organization provides value to its members.

Some old SIAI work of mine. Researching this was very difficult because the relevant religious studies area, while apparently completely repudiating most public beliefs about the subject (eg. the effectiveness of brainwashing, how damaging cults are, how large they are, whether that’s even a meaningful category which can be distinguished from mainstream religions rather than a hidden inference - a claim, I will note, which is much more plausible when you consider how abusive Scientology is to its members as compared to how abusive the Catholic Church has been, etc), prefers to publish its research in book form, which makes it very hard to review any of it. Some of the key citations were papers - but the cult panic was so long ago that most of them are not online and have never been digitized! I recently added some cites and realized I had not touched the draft in a year; so while this collection of notes is not really up to my preferred standards, I’m simply posting it for what it’s worth. (One lesson to take away from this is that controlling uploaded human brains will not be nearly as simple & easy as applying classic ‘brainwashing’ strategies - because those don’t actually work.)

Reading through the literature and especially the law review articles (courts flirted disconcertingly much with licensing kidnapping and abandoning free speech), I was reminded very heavily - and not in a good way - of the War on Terror.

Old American POW studies:

  • Clark et al 1981 Destructive Cult Conversion: Theory, Research and Practice
  • Lifton 1961 Thought Reform and the Psychology of Totalism
  • Ross & Langone 1988 Cults: What Parents Should Know
  • Schein, Schneier & Barker 1961 Coercive Persuasion
  • Singer 1978, 1979 “Therapy with Ex-cult Members” Journal of the National Association of Private Psychiatric Hospitals; “Coming Out of the Cults”, Psychology Today

Started the myth of effective brainwashing. But in practice, cult attrition rates are very high! (As makes sense: if cults did not have high attrition rates, they would long ago have dominated the world due to exponential growth; see the sketch after the citation list below.) This attrition claim is made all over the literature, with some example citations being:

  • Barker 1984, 1987 The Making of a Moonie: Choice of Brainwashing?; “Quo Vadis? The Unification Church”, pg141-152, The Future of New Religious Movements
  • Beckford 1981 “Conversion and Apostasy: Antithesis or Complementarity?”
  • Bird & Reimer 1982 “Participation rates in new religious movements and para-religious movements”
  • Robbins 1988 Cults, Converts and Charisma
  • Shupe & Bromley 1980 The New Vigilantes: Deprogrammers, Anticultists and the New Religions
  • Wright & Piper 1986 “Families and Cults: Familial Factors Related to Youth Leaving or Remaining in Deviant Religious Groups”
  • Wright 1983, 1987, 1988 “Defection from New Religious Movements: A Test of Some Theoretical Propositions” pg106-121 The Brainwashing/Deprogramming Controversy; Leaving Cults: The Dynamics of Defection; “Leaving New Religious Movements: Issues, Theory and Research”, pg143-165 Falling from the Faith: Causes and Consequences of Religious Apostasy
  • Wikipedia cites The Handbook of Cults and Sects in America, Hadden, J and Bromley, D eds. (1993)
  • a back-of-the-envelope estimate for Scientology by Steve Plakos in 2000:

    In absolute numbers, that is from 8 million exposed to 150k active current, it means they’ve lost 7,850,000 bodies in the shop. That equates to a Retention Rate of 1.875%. Now, to be fair, over the course of 50 years “X” number of scientologists have dropped their bodies and gone off to Mars, etc., who might still be members today if they weren’t dead. We do not know what the mortality rate is for Scientologists. To significantly impact the RR, there would have to have been a 100% turnover in active membership due to generational shifting. There is no evidence that 150,000 active members of the CofS have died over the past 50 years. Beyond that, we would also need to apply the RR to deceased members to see what number would have continued beyond 15 years. Therefore, using the most favorable membership numbers and not discounting for loss of membership beyond the 15th year, we see a RR of 1.875%+“X”. If we assume that generational shifting accounts for a 10% turnover amongst current membership, that is, that the current membership would be 10% greater had members survived, X would equal 15,000 dead members, or, a total Retained Membership of 165,000. That would give the CofS a 50 year Retention Rate of 2.0625%.
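
Since the quoted estimate is pure arithmetic, here is a minimal Python sketch reproducing it, together with a toy compound-growth model of the parenthetical point made before the citation list; all figures are Plakos’s own or illustrative assumptions, not measurements:

```python
# 1. Plakos's back-of-the-envelope Scientology retention arithmetic
#    (his assumed figures, quoted above, not independently verified).
exposed = 8_000_000   # people ever "in the shop" over ~50 years
active = 150_000      # claimed current active members

print(f"raw retention rate:      {active / exposed:.4%}")             # 1.8750%
print(f"mortality-adjusted rate: {(active + 15_000) / exposed:.4%}")  # 2.0625%, with his 10% fudge

# 2. Why high attrition must be the norm: if conversions stuck, recruitment
#    would compound. Parameters below are illustrative assumptions only.
def project(members, prospects_per_member, one_year_retention, annual_attrition, years=30):
    """Project membership under simple annual recruitment and attrition."""
    for _ in range(years):
        joiners = members * prospects_per_member * one_year_retention
        members = members * (1 - annual_attrition) + joiners
    return members

# Hypothetical "sticky" cult: 2 serious prospects per member per year,
# half of them stick, nobody ever leaves -> ~a trillion members in 30 years.
print(f"sticky conversions: {project(1000, 2, 0.50, 0.00):,.0f}")
# Observed-style numbers: 5% one-year retention, 30%/year attrition -> dwindles to ~1.
print(f"observed-style:     {project(1000, 2, 0.05, 0.30):,.0f}")
```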

Iannaccone 2003, “The Market for Martyrs” (quasi-review)

From the late-1960s through the mid-1980s, sociologists devoted immense energy to the study of New Religious Movements. [For overviews of the literature, see Bromley (1987), Robbins (1988), and Stark (1985).] They did so in part because NRM growth directly contradicted their traditional theories of secularization, not to mention the sensational mid-sixties claims that God was “dead” (Cox 1966; Murchland 1967). NRMs also were ideal subjects for case studies, on account of their small size, brief histories, distinctive practices, charismatic leaders, devoted members, and rapid evolution. But above all, the NRMs attracted attention because they scared people.

…We have trouble recalling the fear provoked by groups like the Krishnas, Moonies, and Rajneeshees. Their years of explosive growth are long past, and many of their “strange” ideas have become staples of popular culture. [We see this influence not only in today’s New Age and Neo-Pagan movements, but also in novels, music, movies, TV shows, video games, university courses, environmentalism, respect for “cultural diversity,” and the intellectual elite’s broad critique of Christian culture.] But they looked far more threatening in the seventies and eighties, especially after November 18, 1978. On that day, the Reverend Jim Jones, founder of the People’s Temple, ordered the murder of a U.S. Congressman followed by the mass murder/suicide of 913 members of his cult, including nearly 300 children.

The “cults” aggressively proselytized and solicited on sidewalks, airports, and shopping centers all over America. They recruited young adults to the dismay of their parents. Their leaders promoted bizarre beliefs, dress, and diet. Their members often lived communally, devoted their time and money to the group, and adopted highly deviant lifestyles. Cults were accused of gaining converts via deception and coercion; funding themselves through illegal activities; preying upon people - the young, alienated, or mentally unstable; luring members into strange sexual liaisons; and using force, drugs, or threats to deter the exit of disillusioned members. The accusations were elaborated in books, magazine articles, newspaper accounts, and TV drama. By the late-1970s, public concern and media hype had given birth to anti-cult organizations, anti-cult legislation, and anti-cult judicial rulings. The public, the media, many psychologists, and the courts largely accepted the claim that cults could “brainwash” their members, thereby rendering them incapable of rational choice, including the choice to leave. [Parents hired private investigators to literally kidnap their adult children and subject them to days of highly-coercive “deprogramming.” Courts often agreed that these violations of normal constitutional rights were justified, given the victim’s presumed inability to think and act rationally (Anthony 1990; Anthony and Robbins 1992; Bromley 1983; Richardson 1991; Robbins 1985).]

We now know that nearly all the anti-cult claims were overblown, mistaken, or outright lies. Americans no longer obsess about Scientology, Transcendental Meditation, or the Children of God. But a large body of research remains. It witnesses to the ease with which the public, media, policy-makers, and even academics accept irrationality as an explanation for behavior that is new, strange, and (apparently or actually) dangerous.

…As the case studies piled up, it became apparent that both the media stereotypes (of sleep-deprived, sugar-hyped, brainwashed automatons) and academic theories (of alienated, authoritarian neurotics) were far off the mark. Most cult converts were children of privilege raised by educated parents in suburban homes. Young, healthy, intelligent, and college educated, they could look forward to solid careers and comfortable incomes. [Rodney Stark (2002) has recently shown that an analogous result holds for Medieval saints - arguably the most dedicated “cult converts” of their day.]

Psychologists searched in vain for a prevalence of “authoritarian personalities,” neurotic fears, repressed anger, high anxiety, religious obsession, personality disorders, deviant needs, and other mental pathologies. They likewise failed to find alienation, strained relationships, and poor social skills. In nearly all respects - economically, socially, psychologically - the typical cult converts tested out normal. Moreover, nearly all those who left cults after weeks, months, or even years of membership showed no sign of physical, mental, or social harm. Normal background and circumstances, normal personalities and relationships, and a normal subsequent life - this was the “profile” of the typical cultist.

…Numerous studies of cult recruitment, conversion, and retention found no evidence of “brainwashing.” The Moonies and other new religious movements did indeed devote tremendous energy to outreach and persuasion, but they employed conventional methods and enjoyed very limited success. In the most comprehensive study to date, Eileen Barker (1984) could find no evidence that Moonie recruits were ever kidnapped, confined, or coerced (though it was true that some anti-cult “deprogrammers” kidnapped and restrained converts so as to “rescue” them from the movement). Seminar participants were not deprived of sleep; the food was “no worse than that in most college residences;” the lectures were “no more trance-inducing than those given every day” at many colleges; and there was very little chanting, no drugs or alcohol, and little that could be termed “frenzy” or “ecstatic” experience (Barker 1984). People were free to leave, and leave they did - in droves.

Barker’s comprehensive enumeration showed that among the relatively modest number of recruits who went so far as to attend two-day retreats (claimed to be the Moonies’ most effective means of “brainwashing”), fewer than 25% joined the group for more than a week, and only 5% remained full-time members 1 year later. Among the larger numbers who visited a Moonie centre, not 1 in 200 remained in the movement 2 years later. With failure rates exceeding 99.5%, it comes as no surprise that full-time Moonie membership in the U.S. never exceeded a few thousand. And this was one of the most successful cults of the era! Once researchers began checking, rather than simply repeating the numbers claimed by the groups, defectors, or journalists, they discovered dismal retention rates in nearly all groups. [For more on the prevalence and process of cult defection, see Wright (1987) and Bromley (1988).] By the mid-1980s, researchers had so thoroughly discredited “brainwashing” theories that both the Society for the Scientific Study of Religion and the American Sociological Association agreed to add their names to an amicus brief denouncing the theory in court (Richardson 1985).
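
Barker’s percentages can be restated as an attrition funnel; a minimal sketch, where the cohort sizes are arbitrary assumptions since only the quoted percentages matter:

```python
# Barker (1984): attrition funnel for Moonie recruitment, as quoted above.
retreat_attendees = 1_000                      # assumed cohort size (arbitrary)
joined_over_week = retreat_attendees * 0.25    # "fewer than 25% joined ... for more than a week"
full_time_1yr = retreat_attendees * 0.05       # "only 5% remained full-time members 1 year later"
print(f"joined > 1 week:     <{joined_over_week:.0f}")
print(f"full-time at 1 year:  {full_time_1yr:.0f}")

# Among the larger pool who merely visited a centre:
# "not 1 in 200 remained in the movement 2 years later"
centre_visitors = 10_000                       # assumed cohort size (arbitrary)
print(f"members at 2 years:  <{centre_visitors / 200:.0f}")
print(f"failure rate:        >{1 - 1/200:.1%}")   # > 99.5%, matching the text
```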

Singer in particular has been heavily criticized; “Cult/Brainwashing Cases and Freedom of Religion”, Richardson 1991:

Dr. Singer is a clinical psychologist in private practice who earns a considerable portion of her income from cult cases. She has been an adjunct professor at the University of California at Berkeley, but has never held a paid or tenure-track position there. See H. Newton Malony, “Anticultism: The Ethics of Psychologists’ Reactions to New Religions,” presented at annual meeting of the American Psychological Association (New York, 1987) and Anthony, “Evaluating Key Testimony” for more details on Singer’s career.

…The [amicus curiae] brief further claimed that Singer misrepresents the tradition of research out of which terms like “thought reform” and “coercive persuasion” come. She ignores the fact that these earlier studies focus on physical coercion and fear as motivators, and that even when using such tactics the earlier efforts were not very successful. With great facility, Singer moves quickly from situations of physical force to those where none is applied, claiming that these “‘second generation’” thought reform techniques using affection are actually more effective than the use of force in brainwashing people to become members. Thus, Singer is criticized for claiming to stand squarely on the tradition of research developed by scholars such as Edgar Schein and Robert Lifton, while she shifts the entire focus to non-coercive situations quite unlike those encountered in Communist China or Korean prisoner of war camps. The brief points out, as well, that Singer ignores a vast amount of research supporting the conclusion that virtually all who participate in the new religions do so voluntarily, and for easily understandable reasons. No magical “black box” of brainwashing is needed to explain why significant numbers of young people chose, in the 1960s and 1970s, to abandon their place in society and experiment with alternative life styles and beliefs. Many youth were leaving lifestyles that they felt were hypocritical, and experimenting with other ways of life that they found to be more fulfilling, at least temporarily. Particularly noteworthy, but ignored by Singer, are the extremely high attrition rates of all the new religions. These groups are actually very small in numbers (the Hare Krishna and the Unification Church each have no more than two to three thousand members nationwide), which puts the lie to brainwashing claims. If “brainwashing” practiced by new religions is so powerful, why are the groups experiencing so much voluntary attrition, and why are they so small?

…Considerable research reported in refereed scholarly journals and other sources supports the idea that the new religions may be serving an important ameliorative function for American society. The groups may be functioning as “half-way houses” for many youth who have withdrawn from society, but still need a place to be until they decide to “return home.” Participation in some new religions has been shown to have demonstrable positive effects on the psychological functioning of individuals, a finding that Singer refuses to acknowledge.

“Overcoming The Bondage Of Victimization: A Critical Evaluation of Cult Mind Control Theories”, Bob and Gretchen Passantino, Cornerstone Magazine, 1994:

Neither brainwashing, mind control’s supposed precursor, nor mind control itself, has any appreciable demonstrated effectiveness. Singer and other mind control model proponents are not always candid about this fact: The early brainwashing attempts were largely unsuccessful. Even though the Koreans and Chinese used extreme forms of physical coercion as well as persuasive coercion, very few individuals subjected to their techniques changed their basic world views or commitments. The CIA also experimented with brainwashing. Though not using Korean or Chinese techniques of torture, beatings, and group dynamics, the CIA did experiment with drugs (including LSD) and medical therapies such as electroshock in their research on mind control. Their experiments failed to produce even one potential Manchurian Candidate, and the program was finally abandoned.

Although some mind control model advocates bring up studies that appear to provide objective data in support of their theories, such is not the case. These studies are generally flawed in several areas: (1) Frequently the respondents are not from a wide cross-section of ex-members but disproportionately are those who have been exit-counseled by mind control model advocates who tell them they were under mind control; (2) Frequently the sample group is so small its results cannot be fairly representative of cult membership in general; (3) It is almost impossible to gather data from the same individuals before cult affiliation, during cult affiliation, and after cult disaffection, so respondents are sometimes asked to answer as though they were not yet members, or as though they were still members, etc. Each of these flaws introduces unpredictability and subjectivity that make such study results unreliable…The evidence against the effectiveness of mind control techniques is even more overwhelming. Studies show that the vast majority of young people approached by new religious movements (NRMs) never join despite heavy recruitment tactics. This low rate of recruitment provides ample evidence that whatever techniques of purported mind control are used as cult recruiting tools, they do not work on most people. Even of those interested enough to attend a recruitment seminar or weekend, the majority do not join the group. Eileen Barker documents [Barker, Eileen. New Religious Movements: A Practical Introduction. London: Her Majesty’s Stationery Office, 1989.] that out of 1000 people persuaded by the Moonies to attend one of their overnight programs in 1979, 90% had no further involvement. Only 8% joined for more than one week, and less than 4% remained members in 1981, two years later:

…and, with the passage of time, the number of continuing members who joined in 1979 has continued to fall. If the calculation were to start from those who, for one reason or another, had visited one of the movement’s centres in 1979, at least 999 out of every 1,000 of those people had, by the mid-1980s, succeeded in resisting the persuasive techniques of the Unification Church.

Of particular importance is that this extremely low rate of conversion is known even to Hassan, the best-known mind control model advocate whose book [Hassan, Steven. Combatting Cult Mind Control. Rochester, VT: Park Street Press, 1990?] is the standard text for introducing concerned parents to mind control/exit counseling. In his personal testimony of his own involvement with the Unification Church, he notes that he was the first convert to join at the center in Queens; that during the first three months of his membership he only recruited two more people; and that pressure to recruit new members was only to reach the goal of one new person per member per month, a surprisingly low figure if we are to accept the inevitable success of cult mind control techniques.

Objection: High Attrition Rates

Additionally, natural attrition (people leaving the group without specific intervention) was much higher than the self-claimed 65% deprogramming success figure! It is far more likely that a new convert would leave the cult within the first year of his membership than that he would become a long-term member.

Gomes, Unmasking the Cults (Wikipedia quote):

While advocates of the deprogramming position have claimed high rates of success, studies show that natural attrition rates actually are higher than the success rate achieved through deprogramming.

“Psychological Manipulation and Society”, book review of Spying in Guruland: Inside Britain’s Cults, Shaw 1994:

Eventually Shaw quit the Emin group. Two months later he checked in with some Emin members at the Healing Arts Festival, a psychic fair. He avoided many Emin phone invitations for him to attend another meeting. He discovered that most, if not all, of the people who joined with him had dropped out. This is consistent with what Shaw has noted about most cults and recruits: the dropout rate is high.

Anthony & Robbins 1992, “Law, Social Science and the ‘Brainwashing’ Exception to the First Amendment”:

Lifton and Schein are also characterized in Molko (54) as attesting to the effectiveness of brainwashing, although Schein, an expert on Chinese coercive persuasion of Korean War POWs, actually thought, as do a number of scholars, that the Chinese program was relatively ineffective (Schein, 1959, p. 332; see also Anthony, 1990a; Scheflin & Opton, 1978)…Schein appears to actually have considered the communist Chinese program to be a relative “failure” at least, “considering the effort devoted to it” (Schein, 1959, p. 332; Anthony, 1990a, p. 302)…Various clinical and psychometric studies of devotees of well-known “cults” (Ross, 1983; Ungerleider & Wellisch, 1979) have found little or no personality disorder or cognitive impairment.

  • Ross 1983, “Clinical profile of Hare Krishna devotees”, American Journal of Psychiatry
  • Schein, E. (1959). “Brainwashing and totalitarianization in modern society”. World Politics, 2, 430-441.
  • Ungerleider, T., & Wellisch, D. K. (1979). “Coercive persuasion (brainwashing), religious cults, and deprogramming”. American Journal of Psychiatry, 136(3), 279-282.

“Brainwashed! Scholars of cults accuse each other of bad faith”, by Charlotte Allen, Lingua Franca Dec/Jan 1998:

Zablocki’s conversion to brainwashing theory may sound like common sense to a public brought up on TV images of zombielike cultists committing fiendish crimes or on the Chinese mind control experiments dramatized in the 1962 film The Manchurian Candidate. But among social scientists, brainwashing has been a bitterly contested theory for some time. No one doubts that a person can be made to behave in particular ways when he is threatened with physical force (what wouldn’t you do with a gun pressed to your head?), but in the absence of weapons or torture, can a person be manipulated against his will?

Most sociologists and psychologists who study cults think not. For starters, brainwashing isn’t, as Zablocki himself admits, “a process that is directly observable.” And even if brainwashing could be isolated and measured in a clinical trial, ethical objections make conducting such a test almost unthinkable. (What sort of waivers would you have to sign before allowing yourself to be brainwashed?) In the last decade, while brainwashing has enjoyed a high profile in the media - invoked to explain sensational cult disasters from the mass suicide of Heaven’s Gate members to the twelve sarin deaths on the Tokyo subway attributed to the Aum Shinrikyo cult - social scientists have shunned the term as a symptom of Cold War paranoia and anticult hysteria. Instead, they favor more benign explanations of cult membership. Alternatives include “labeling” theory, which argues there is simply nothing sinister about alternative religions, that the problem is one of prejudicial labeling on the part of a mainstream culture that sees cult members as brainwashed dupes, and “preexisting condition” theory, which posits that cult members are people who are mentally ill or otherwise maladjusted before they join. (A couple of scholars have even proposed malnutrition as a preexisting condition, arguing that calcium deficiency may make people prone to charismatic susceptibility.)

Thus, when Zablocki published an indignant 2-part, 60-page defense of brainwashing theory in the October 1997 and April 1998 issues of Nova Religio, a scholarly journal devoted to alternative belief systems, he ignited a furor in the field. Pointing to the “high exit costs” that some cults exacted from those who tried to defect - shunning, forfeiture of parental rights and property, and veiled threats - Zablocki argued that these were indications of brainwashing, signs that some groups were using psychological coercion to maintain total control over their members. Although he admitted he could not prove brainwashing empirically, he argued that at the very least brainwashing should not be dismissed out of hand.

…Zablocki’s colleagues were unimpressed. In a response also published in Nova Religio, David Bromley, a sociologist at Virginia Commonwealth University who has studied the Reverend Sun Myung Moon’s Unification Church, complained that in Zablocki’s formulation brainwashing remained a vague, slippery, limiting, and ultimately untestable concept. Moreover, he pointed out, cults typically have low recruitment success, high turnover rates (recruits typically leave after a few months, and hardly anyone lasts longer than two years), and short life spans, all grounds for serious skepticism about the brainwashing hypothesis. Even if you overlook these facts, Bromley added, “the extraordinarily varied cultural origins, patterns of organizational development, and leadership styles of such groups pose a problem in explaining how they seem to have discovered the same ‘brainwashing’ psycho-technology at almost precisely the same historical moment.” A quick survey of the field reveals that Bromley is far from being the only doubter. Eileen Barker, a sociologist at the London School of Economics who has also studied the Unification Church, says, “People regularly leave the Moonies of their own free will. The cults are actually less efficient at retaining their members than other social groups. They put a lot of pressure on them to stay in - love-bombing, guilt trips - but it doesn’t work. They’d like to brainwash them, but they can’t.”

…To further complicate matters, researchers often bring very different, even conflicting approaches to their work. Psychologists, for example, tend to emphasize how a repeated environmental stimulus can elicit a conditioned response - depriving subjects of their autonomy. Sociologists, by contrast, typically endorse a voluntarist conversion model for religion, which posits that people join cults for generally rational reasons connected to the group’s ability to satisfy their needs: for a transcendent theology; for strong bonds of kinship and solidarity; for enough social support to enable them to quit drugs or otherwise turn their personal lives around. (For example, one study has shown that schizophrenics who joined cults functioned better than those who tried drugs or conventional psychotherapy.)

…In 1980 the New York state legislature, over objections from the American Civil Liberties Union, passed a bill that would have legalized deprogramming (it was vetoed by Governor Hugh Carey). “With deprogramming - with parents having their children abducted and held captive - the whole thing became intensely emotional,” says Thomas Robbins. “Who were the kidnappers: the parents, the cults, or the police? There were hard feelings on both sides.” Among the most outraged were social scientists who had never believed that people could be brainwashed into joining cults and who, as good civil libertarians, were appalled by deprogramming. Ofshe and Singer’s scholarly testimony (and fat fees) distressed a number of these scholars, whose credentials were equally respectable and whose own research had led them to conclude that coercive persuasion was impossible in the absence of some sort of physical coercion such as prison or torture.

…Zablocki made another, potentially more damning charge, however - one that Robbins did not take up. A significant amount of cult money, he wrote, has gone to scholars - in support of research, publication, conference participation, and other services. Zablocki did not name names. But a number of professors freely admit that nontraditional religions (in most cases, the Unificationists and Scientologists) have cut them checks. The list includes some of the most prominent scholars in the discipline: Bromley, Barker, Rodney Stark of the University of Washington, Jeffrey Hadden of the University of Virginia, and James Richardson, a sociologist of religion at the University of Nevada at Reno. All five have attended cult-subsidized conferences, and Bromley, Hadden, and Richardson have occasionally testified in court on behalf of cults or offered their services as expert witnesses against brainwashing theory. “This is an issue,” Zablocki wrote sternly, “of a whole different ethical magnitude from that of taking research funding from the Methodists to find out why the collection baskets are not coming back as heavy as they used to.”

Comments

An interesting question is, given the general failure of brainwashing, how do new religions manage to take hold, like Christianity, Islam, Mormonism, Sikhism, etc.? How come Christian and Islamic proselytism has been so consistently successful in many parts of the world?

These are good questions, and as you can imagine, much debated in the literature, with explanations ranging from acts of God (the paradigmatic example being the Jew quoted in Acts of the Apostles arguing that Christianity didn't need to be suppressed because if it flourished, it must be favored by God, and it would fail if it was disfavored by him) to enabling effective social coordination (particularly attractive for Islam: the horsebacked nomads managed to coordinate under Mohammed rather than feud, and did as well as the Mongols, with conversion then following from personal advantage and to escape dhimmitude) to arguments that it's just random drift (Carrier points out that the best estimates of the size of early Christianity - tiny even centuries after Jesus - necessarily imply that the annual growth rate must have been far smaller than commonly assumed).
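
To make the random-drift point concrete, a minimal sketch of the implied compound growth rate; the endpoint sizes and dates below are illustrative assumptions (the comment cites no specific figures), with Rodney Stark's well-known 40%-per-decade estimate as a comparison:

```python
# Implied annual growth rate of early Christianity under assumed endpoints.
n0, year0 = 1_000, 40       # assumed size of the movement a decade or so after Jesus
n1, year1 = 200_000, 300    # assumed size circa 300 AD (both numbers illustrative)

annual_growth = (n1 / n0) ** (1 / (year1 - year0)) - 1
print(f"implied annual growth rate: {annual_growth:.2%}")       # about 2% per year

# Stark's famous estimate of 40% growth per decade is similarly modest annually:
print(f"40%/decade, annualized:     {1.40 ** (1/10) - 1:.2%}")  # about 3.4% per year
```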

My uneducated guess is that it is because Christianity, Islam, Sikhism, Buddhism, and Judaism were all backed by governments and military forces during the initial stages of expansion. I don't believe there are any large religions for which this is not true - Hinduism is too old for us to say much about its origins, but there was a time when Buddhism was becoming extremely popular, and power was involved in re-establishing Hinduism.

If I'm right, then the thing that causes small memeplexes to become big memeplexes is the successful conversion of a few powerful and influential people (and that process happens through random drift in the case of religion).

Also, I think Christianity, Islam, and Judaism are the only religions which care about whether or not you believe them. (As in, members think that belief itself has consequences and so they ought to care what others believe). It's harder to leave these religions, with shadows of hell hanging over you. I think that in most other religions, people can sort of vaguely redirect worship from one set of symbols to another without really rejecting old beliefs and accepting new ones in a way that is consistent with "brainwashing" - …

Typically, a conversion sticks because an organization provides value to its members.

People do get value from religion. The big two seem to be social conformity and fear of death, but there are others. The only atheist that I personally know who converted to Christianity got a wife out of the deal.

name99:
The general answer seems to be that religions (just like "total" political parties) provide value for money, in particular a social environment. Friends, baby sitters, group activities, help when you lose a job or someone dies. I think academics, in particular, tend to be such loners, and to be content with such social support as is provided by the government, that they radically underestimate how hungry people are for this sort of social interaction.

My model of a cult is a mechanism that exploits various flaws in human thinking. For example, peer pressure turned up to eleven: if you leave a cult, you lose all your friends at the same moment. (In real life, one's friends are usually not this coordinated.) The cult keeps you busy with all the cultish stuff, which makes the natural procrastination about important decisions (such as leaving the cult) even stronger. There is the initial "love bombing", which prevents you from estimating how happy you would be if you joined. Etc.

Typically, a conversion sticks because an organization provides value to its members.

Disagree connotationally. (Also I am not sure what "conversion sticks" means precisely. If a person spends 5 or 10 years in a cult and then leaves, do we consider those initial years as a success, because the person did not run away immediately?) Yes, technically, the organization provides something, but if we want to get more specific, it usually provides promises that something very good will happen in the unspecified - but very close - future. It also provides something immediately, for example a social group, but I think those promises are very important for many people. So if we define "value" as promises that sound good but are never fulfilled, then yes, the organization provides the value to its members. But if we define "value" as the thing that was promised, then it does not really provide the value.

Until I read this, I didn't realize there are different possible claims about the dangers of cults. One claim - the one gwern is debunking - is that cults are a large-scale danger, and practically anyone can be taken over by a cult.

The other less hyperbolic claim is that cults can seriously screw up people's lives, even if it's a smallish proportion of people. I still think that's true.

This is an excerpt from Valerie Tarico's web series "Christian Belief Through The Lens of Cognitive Science"

In revival meetings or retreats, semi-hypnotic processes draw a potential convert closer to the toggle point. These include repetition of words, repetition of rhythms, evocative music, and Barnum statements (messages that seem personal but apply to almost everyone - like horoscopes). Because of the positive energy created by the group, potential converts become unwitting participants in the influence process, actively seeking to make the group's ideas fit with their own life history and knowledge. Factors that can strengthen the effect include sleep deprivation or isolation from a person's normal social environment. An example would be a late night campfire gathering with an inspirational story-teller and altar call at Child Evangelism's "Camp Good News."

These powerful social experiences culminate in conversion, a peak experience in which the new converts experience a flood of relief. Until that moment they have been consciously or unconsciously at odds with the group center of gravity. Now, they may feel that their darkest secrets are known and fo…
Nuremberg rallies - Wikipedia
[Cartoon: one officer says to the other: "Mind control doesn't exist. You should check out the psychology research."]

Very interesting and surprising. A priori I would have expected the most successful NRMs to be at least 10-20% effective in one-year new member retention. I wonder how non-religious non-mainstream organizations that demand some amount of sacrifice from their members measure up? E.g. what are the retention rates in online forums, gaming communities, fitness centers, etc.?

Cultist, n. One who is obstinately and zealously attached to an opinion that you do not entertain.

(That's actually "bigot" in the Devil's Dictionary, but cultist is a better fit to me.)

wedrifid:
In the translation from 'bigot' to 'cultist' we could perhaps add "or group you do not approve of".

Has the rehabilitation of 'cults' begun? "The Cult Deficit", Ross Douthat:

Like most children of the Reagan era, I grew up with a steady diet of media warnings about the perils of religious cults - the gurus who lurked in wait for the unwary and confused, offering absolute certainty with the aftertaste of poisoned Kool-Aid. From the 1970s through the 1990s, from Jonestown to Heaven's Gate, frightening fringe groups and their charismatic leaders seemed like an essential element of the American religious landscape. Yet we don't hear nearly as much…

My observation about cults, from personal experience leading them, is that they are a totally normal mode of human operation. People are always looking for strong leaders with vision, passion and charisma who can organize them for a larger purpose. What distinguishes a cult from a non-cult is that they are outside the norms of the mainstream society (as established by the dominant cults - i.e. "the culture"). "Cult", "brainwashing", "deprogramming", etc. are terms of propaganda used by the dominant culture to combat…

My observation about cults, from personal experience leading them

* raises eyebrow *

niceguyanon:
From BrotherNihil's website: He wasn't kidding about the personal experience. Heh... good luck with that here on LW. I take the crackpottery of his site as evidence to not take much of what he says seriously.
Flaglandbase:
J.K. Rowling could probably manipulate LessWrong as she sees fit by buying the site, shadowbanning all commenters, and putting up new comments using their names (but preventing the real users from seeing these), where they would slowly become convinced witchcraft is real.
Viliam_Bur:
There is something like manipulation. To make this a discussion about anticipated experience, here is an experiment proposal: Kidnap a few new members from different religious organizations. (It's just an imaginary experiment.) Keep them for one week isolated from their religious groups: no personal contact, no phone, no books. If they start to do some rituals they were told to do, for example repeat a mantra or sing a song, prevent them from doing so. Otherwise, don't do them any harm, and keep them in a nice environment. When the week is over, just let them go. Observe how many of them return to the original group. Compare with a control group of randomly selected people you didn't kidnap: how many of them remained in the group after the week. Are there statistically significant differences for different religious groups?

My prediction is that there would be observable differences for different religious groups. I believe there is some pressure involved in the process of recruitment in some religious (or not just religious) groups; some algorithm which increases the chances of membership when done properly, and fails when interrupted. Perhaps "brainwashing" is too strong a word, but it is a kind of manipulation. It consists of pushing the person towards more expressions of commitment, without giving them time to reflect on whether they really want it (whether it is okay with their other values).
gwern:
This is pretty similar to what the deprogrammers did. They didn't have very high success rates.
[anonymous]:
People like to resist coercion. Reactions to being kidnapped in order to be forced to abandon the cult could be different than reactions to being kidnapped and held for a week by a mad psychologist with a mysterious agenda. Though for the agenda to be mysterious, the idea of preventing them from engaging in rituals would have to be abandoned.

Taboo "brainwashing".

What does Christianity, for instance, succeed in doing, if not brainwashing?

It seems to me that its (sincere) adherents have been persuaded to believe Christianity is the most rational choice. They've been convinced it is the best wager available.

Is it? Is Christianity (or any religion) the best wager? Is it rational?

If not, then what can we say about the mechanism(s) used to get humans to be wholly convinced otherwise? What shall we name it? How does it work?

And how is this yet-nameless, magical process different from brainwashing?

Viliam_Bur:
Religions succeed in making people believe in them. But how specifically? I propose three mechanisms:

First, by providing them some advantages. There may be advantages of the belief (belief in afterlife alleviates fear of death), advantages of explicitly commanded behavior (forbidding theft reduces costs of protecting property), other advantages (meeting every Sunday in church helps one meet their neighbors).

Second, by blackmailing them. If you leave the religion, you will be forever tortured in hell; and in some situations your former friends will murder you.

Third, by modifying their thoughts or behavior into ones that make leaving less likely. For example: teaching people that things can be "true" even if they are completely invisible and statistically undetectable (which makes them less likely to realize that the beliefs are wrong), making them too busy to do any long-term thinking (such as whether to leave the religion), or removing information sources or friends that could provide information or encouragement against the religion.

If this reflects the reality well enough, I would suggest that the word "brainwashing" means using mainly the third and second kinds of mechanisms (as opposed to mostly the first one, which feels legitimate). One religious group can give people friends and pleasant social activities, so the members are happy to be there. Another religious group can make them busy 16 hours a day (praying, meditating, converting new people, etc.) and destroy all out-group contacts, so the members' agency is crippled, and they stay even if they are unhappy.

In Jin, Moon and his wife’s fourth child, seemed suited for the task. She had a modern American upbringing and a master’s degree from Harvard. In 2009, she took over the Unification Church of America and introduced a bold modernization program. Her aim, she said, was to transform the church into one that people - especially young people - were “dying to join.” She renamed the church Lovin’ Life Ministries, shelved the old hymn books, and launched a rock band, an offshoot of which played New York clubs under the moniker Sonic Cult. She also discarded the old K…
gjm:
This seems like evidence against (a perhaps overstrong version of) the thesis of the OP, namely that cult "techniques" are ineffective. But note that:

  • it's perfectly consistent with them not being scarily effective; and
  • it's also possible that these changes made no difference to (or even increased) the Moonies' ability to acquire new members and keep them in the short term, and that it cut their membership because longstanding members who were used to the old way of doing things hated the reforms.

It always seemed obvious to me that cults have rather low conversion rates.

Cults do not optimize for having many members. They optimize for the dedication of the members. This may be because the typical cult leader would rather have 10 people believe that he is the saviour and the messenger of God, than have 1000 people believe that he's merely a good guy.

(I tend to delineate cults/non-cults on the basis of how they resolve this trade-off between extremism and popularity)

Cults do not optimize for having many members. They optimize for the dedication of the members. This may be because the typical cult leader would rather have 10 people believe that he is the saviour and the messenger of God, than have 1000 people believe that he's merely a good guy.

No one in the literature suggests this, and cults (just like mainstream religions such as Mormonism) invest enormous efforts into proselytization, rather than strenuous filtering of existing converts. The efforts just don't succeed, and like the Red Queen, minority religions need to run as fast as they can just to stay in place.

private_messaging:
The low rate of retention is extreme filtering. The cults try to get members to sever ties with family and friends, for example - and this is a filter: most people get creeped out and a few go through with it. edit: and of course, with such extreme filtering, one needs a lot of proselytism to draw just a hundred very dedicated supporters.
gwern:
You are arguing by definition here; please consider what could falsify your mental model of cults. If my local gym discovers only 1% of the people joining after New Year's will stick around for more than a year, does that necessarily imply that the gym is ruled by a charismatic leader driving people away so as to maximize the proportion of unthinkingly loyal subordinates? Low rate of retention is simply low rate of retention. This can be for a great many reasons, such as persecution, more attractive rival organizations, members solving their problems and leaving, or (way down the list) extreme filtering for loyalty which drives away otherwise acceptable members.

How often do you see a cult leader going 'well, sure, we could have thousands more members if we wanted (people are pounding down the doors to convert), and majorly increase our donations and financial holdings, but gosh, we wouldn't want to sell out like that!' Of course, like any organization, there's concerns about freeriding and wasting club goods and it'll seek to strike a balance between inclusiveness and parasite load; but a cult which has 'successfully' shed all but a few fanatics is a cult which is about to become history.

Recruiting through family and friends is a major strategy of cults - indeed, perhaps the only strategy which does not have abysmally low success rates.
private_messaging:
Low rate of retention is a product of many reasons simultaneously, including the extremely weird stuff creeping people out. If your local gym is creepy, it will have a lower retention rate than the same gym that is not creepy. My mental model of failed retention includes the general low retention rate, in combination with the weird things that the cult does creeping people out, on top of that.

I rarely see people reflect on their motives or goal structure. You often see a cult leader abusing a cultist, which leads insufficiently dedicated cultists to leave. Such actions sacrifice quantity for "quality".

Yes, and a lot of the time that fails, and the family members start actively denouncing the cult, and the member has to choose between the family and friends, and the cult, at which point, well, few choose the cult.
gwern:
As pointed out in the OP by one author, the cults in question have in many ways been assimilated by the mainstream and so are far less 'weird' than ever before. Has that helped their retention rates? Environmentalism and meditation are completely mainstream now; have the Hare Krishnas staged a comeback?

The counterfactual is not available or producible, and so this is meaningless to point out. If the Hare Krishnas did not hold 'creepy' beliefs, in what sense is this counterfactual organization similar to the Hare Krishnas? If Transcendental Meditators did not do as weird a thing as meditate, how are they Transcendental Meditators? Defining away all the unique characteristics does not add any insight.

"You often see a boss abusing a subordinate, which leads insufficiently dedicated employees to leave. This is because bosses wish to sacrifice quantity and being able to handle work for 'quality' of subordinates." No, there is nothing unique about cults in this respect. Monkeys gonna monkey. And for the exact same reason businesses do not casually seek to alienate 99% of their employees in order to retain a fanatical 1%, you don't see cults systematically organization-wide try to alienate everyone. You see a few people in close proximity to elites being abused. Just like countless other organizations.

Which explains the success of deprogrammers, amirite?
Jiro:
I would suggest that if beliefs held by cults become mainstream, that certainly decreases one barrier to such a cult's expansion, but because there are additional factors (such as creepiness) that alone is not enough to lead the cult to expand much. It may be that people's resistance to joining a group drastically increases if the group fails any one of several criteria. Just decrementing the number of criteria that the group fails isn't going to be enough, if even one such criterion is left.

The level of abuse done by bosses and cult leaders is different, so although the statement is literally true for both bosses and cult leaders, it really doesn't imply that the two situations are similar.
gwern:
Maybe, but I don't know how we'd know the difference.

Is it really? Remember how many thousands of NRMs there are over the decades, and how people tend to discuss repeatedly a few salient examples like Scientology. Can we really compare regular bosses that favorably with religious figures? Aside from the Catholic Church scandal (with its counterparts among other closemouthed groups like Jewish and Amish communities), we see plenty of sexual scandals in other places like the military (the Tailhook scandal as the classic example, but there's plenty of recent statistics on sexual assault in the military, often enabled by the hierarchy).
private_messaging:
I don't see how environmentalism or for that matter meditation itself is creepy. What's creepy about Hare Krishnas is the zoned-out, sleep-deprived look on the faces (edit: I am speaking of the local ones, from experience), and the whole obsession with the writings of the leader thing, and weirdly specific rituals. Now that environmentalism and meditation are fairly mainstream, you don't have to put up with the creepy stuff if you want to be around people who share your interests in environmentalism and meditation. You have less creepy alternatives. You can go to a local yoga class that manages to have the same number of people attending as the local Krishna hangout, despite not trying nearly as hard to find new recruits. You can join a normal environmentalist group.

The difference is, of course, in extent. For example, putting up a portrait of the founder at every workplace (or perhaps in a handbook, or the like) would be something that a cult leader would do in a cult, but what a corporation would seldom ever do because doing so would be counter-productive.

edit: actually, what do you think makes joining a cult worse than joining a club, getting a job, and so on? Now, whatever that is, it makes it harder to get new recruits, and requires more dedication.
gwern:
Which goes to show how far into the zeitgeist they've penetrated. Go back to the 1960s, when the cult panic and popular image of cults was being set, and things were quite different. One of the papers discusses a major lawsuit accusing the Hare Krishnas of 'brainwashing' a teen girl when she ran away from home and stayed with some Krishnas; the precipitating event was her parents getting angry about her meditating in front of a little shrine, and ripping it out and burning it (and then chaining her to the toilet for a while). To people back then, 'tune in, turn on, drop out' sounds less like a life choice than a threat...

Well, I can hardly argue against your anecdotal experiences. Supreme Court - jurists or cultists? Film at 11. We report, you decide. I don't even know what 'weirdly specific' would mean. Rituals are generally followed in precise detail, right down to the exact repetitive wording and special garments like Mormon underpants; that's pretty much what distinguishes rituals from normal activities. Accepting the Eucharist at mass? Ritual. Filling out a form at the DMV? Not ritual.

Hmm, where was one to find yoga back then... Ah yes, also in cults. Ashrams in particular did a lot of yoga. Interesting that you no longer have to go to an ashram or fly to India if you want to do yoga. It's almost like... these cult activities have been somehow normalized or assimilated into the mainstream... And where did these environmentalist groups come from?

Really? That seems incredibly common. Aside from the obvious examples of many (all?) government offices like post offices including portraits of their supreme leader - I mean, President - you can also go into places like Walmart and see the manager's portrait up on the wall.

Personally? I think it's mostly competition from the bigger cults. Just like it's hard to start up a business or nonprofit.
Luke_A_Somers:
I wasn't around in the 60s and wasn't aware for any of the 70s, but... Environmentalism seems qualitatively different from everything else here. Is there some baggage to this beyond, say, conservation, or assigning plants and animals some moral weight, that is intended here? Something may have seemed weirder in the past because it was weirder back then. I suspect few modern Christians would sign up for AD 200 Christianity.
gwern:
Not really, aside from the standard observation that you can just as easily play the 'find cult markers' game with environmental groups like Greenpeace or ELF. Cleansing rituals like recycling, intense devotion to charismatic leaders, studies of founding texts like Silent Spring, self-abnegating life choices, donating funds to the movement, sacralization of unusual objects like owls or bugs, food taboos ('GMOs'), and so on and so forth.
ChristianKl:
I'm not sure whether that's true. You have people on LessWrong talking about cutting family ties with nonrational family members and nobody gets creeped out. I don't think I have ever witnessed people getting creeped out by such discussions in the self-help area, and I think I have frequently heard people encouraging others to cut ties with someone that "holds them back".

Really? Links? A lot of stuff here is a bit too culty for my tastes, or just embarrassing, but "cutting family ties with nonrational family members"?? I haven't been following LW closely for a while now so I may have missed it, but that doesn't sound accurate.

Douglas_Knight:
Here's an example.
Mestroyer:
diegocaleiro didn't just say they were irrational: I strongly suspect that this isn't a case of "My family members don't believe as I do, therefore fuck those guys." but rather "These family members know that I am nonreligious and aggressively proselytize because of it." This probably isn't even about rationality or LessWrong, rather atheism. Note also that it is diegocaleiro who initiated the conversation, and note the level of enthusiasm about the idea received from other posters (only ChristianKl and Benito's responses seem wholly in favor, Viliam_Bur and drethelin's responses are against, shminux and Ben_LandauTaylor's responses are neutral).
Eugine_Nier:
Outside view: These family members know that [diegocaleiro joined a group with weird non-mainstream religious beliefs] and [are trying to deconvert him].
yli:
Thanks for the link. I don't really see creepy cult isolation in that discussion, and I think most people wouldn't, but that's just my intuitive judgment.
ChristianKl:
That's the point. It doesn't look that way from the inside. If someone told those family members that the OP cut family ties with them because he made a rational analysis with help from his LessWrong friends, those family members might see it as an example of the evil influence that LessWrong has on people.
Costanza:
I'm at least mildly creeped out by occasional cultish behavior on LessWrong. But every cause wants to be a cult - Eliezer said so, so therefore it is Truth.
wedrifid:
I do not believe you. If it is the case that people talk about cutting family ties with 'nonrational family members' then there will be people creeped out by it. Note that if the 'nonrational' family members also happen to be emotionally abusive family members this would not match the criteria as I interpret it. (Even then I expect some people to be creeped out by the ties cutting and would expect myself to aggressively oppose such expressions so as to suppress a toxic influence.)
Eugine_Nier:
You do realize that a lot of cults tend to classify normal family reactions, e.g., attempting to get the person out of the cult, as emotional abuse.
wedrifid:
I don't care, and I'm somewhat outraged at this distortion of reasoning. It is so obviously bad and yet remains common and is all too seldom refuted. Emotional abuse is a sufficiently well defined thing. It is an undesirable thing. Various strategies for dealing with it are possible. In severe cases, and in relationships where the gains do not offset the damage, severing ties is an appropriate strategy to consider. This doesn't stop being the case if someone else also misuses the phrase 'emotional abuse'. Enduring emotional abuse rather than severing ties with the abuser because sometimes cultists sever ties while using that phrase is idiotic. Calling people 'creepy' for advocating sane, mainstream interpersonal strategies is absurd and evil.
Kaj_Sotala:
Sorry, exactly what is it that you're outraged about? Eugine seemed merely to be pointing out that people inside particular social groups might see things differently than people outside them, with the outsiders being creeped out and the insiders not. More specifically, that things that we deem okay might come off as creepy to outsiders. That seems correct to me.
1wedrifid11y
As a general policy:

* All cases where non-sequitur but technically true claims are made, where the actual implied rhetorical meaning is fallacious. Human social instincts are such that most otherwise intelligent humans seem to be particularly vulnerable to this form of persuasion.
* All arguments or insinuations of the form "Hitler, Osama Bin Laden and/or cultists do X. Therefore, if you say that X is ok then you are Bad."
* Additional outrage, disdain or contempt applies when:
  * The non-sequiturs are, through either high social skill or (as in this case) plain luck, well calibrated to persuade the audience despite being bullshit.
  * Actual negative consequences can be expected to result from the epistemic damage perpetrated.
8Kaj_Sotala11y
Thanks, that sounds reasonable. I didn't interpret Eugine's comments as being guilty of any of those, though.
-4Eugine_Nier11y
In my experience nearly all accusations that someone is being "emotionally abusive" are of this type.
4wedrifid11y
If that is true, then you are fortunate to have lived such a sheltered existence. If it is not true (and to some extent even if it is), then I expect being exposed to this kind of denial and accusation of dishonesty to be rather damaging to those who are actual victims of the phenomenon you claim is 'nearly all' fallacious accusation.
-3Eugine_Nier11y
I could say the same thing about you, if you've never encountered people willing to make false accusations of abuse (frequently on behalf of children) with the force of the law, or at least child services, behind them. This is as good a summary of the "how dare you urge restraint" position as any I've heard.
4Eugine_Nier11y
So could you provide a definition? The article you linked to begins with a definition, and then proceeds to list three categories that are sufficiently vague to include a lot of legitimate behavior. You don't seem to be getting the concept of "outside view". Think about it this way: as the example of cults shows, humans have a bias that makes them interpret Bob attempting to persuade Alice away from their meme set as emotional abuse. Consider the possibility that you're also suffering from this bias.
-3wedrifid11y
Yes, but I do not believe this to be necessary or appropriate at this time. The sincere reader is invited to simply use their own definition in good faith. The precise details do not matter or, rather, are something that could be discussed elsewhere by interested parties or on a case by case basis. For now I will say this is an example of emotional abuse which would in most situations call for the severing of ties. Other cases are less clear but, again, can be argued about when they crop up. Don't be absurd. Conversation over. Be advised that future comments of yours on any of the subjects of emotional abuse, cults or creepiness will be voted on without reply unless I perceive them to be a danger to others. The reasoning you are using is both non-sequitur and toxic. I don't have the patience for it. I don't care about evangelism. I care about gaslighting, various forms of emotional blackmail and verbal abuse. Again, the fact that the phrase "emotional abuse" can be misused by someone in a cult does not make refusal to respond to actual emotional abuse appropriate or sane. To whatever extent your 'outside view' cannot account for that, your outside view is broken.

I wonder what percentage of adult North Koreans have been successfully brainwashed by their government to the extent that, say, they believe that their country's founding dictator was one of the greatest forces for good the world has ever known. What's your estimate?

[pollid:553]

In the Korean context, surveys have been done of defectors (for the obvious reasons) to try to gauge the current level of support for the regime. The result is sadly predictable for anyone who's seen Russians nostalgic for Stalin or Chinese wistfully thinking back to Mao: Il-Sung is still venerated by many North Koreans, even if they don't like his son or despise the pig-grandson.

Some survey data is summarized in The Hidden People of North Korea: Everyday Life in the Hermit Kingdom and "An Assessment of the North Korean System's Durability" is an extensive discussion of defector surveys. (Apparently in the 2002 defector survey, 67% of them believed their countrymen venerated Il-Sung as the "greatest mind of humanity". Many interesting bits, like "Few North Koreans seem aware that the United States has been one of North Korea's principal food donors.")

8gwern10y
From a new paper, "Preparing for the Possibility of a North Korean Collapse", Bennett 2013 (RAND):
5DanArmak11y
And that's just for defectors, who must be selected in favour of being against Il-Sung.

Note that the survey says that they believe that their *countrymen* venerated Il-Sung. Defectors may be likely to dislike Il-Sung themselves, but my (low certainty) expectation would be that they'd be more likely to see the population at large as slavishly devoted. People who take an unusual stance in a society are quite likely to caricature everyone else's position and increase the contrast with their own. Mind you, they sometimes take the 'silent majority' view of believing everyone secretly agrees with them: I don't know which would be more likely here.

But I'd guess that defectors would both be more likely to think everyone else is zealously loyal, AND be more likely to believe that everyone wishes they could overthrow the government. I'd imagine them to be more likely to end up at the extremes, in short.

Not sure what the purpose of this poll is. Brainwashing from birth, with little or no exposure to alternative views, is a quite different environment from the one NRMs operate in. How many Americans or Greeks (or pre-war Germans) believe that their country is the greatest? How many Russians believed in Communism in the 1950s? The numbers are clearly a lot higher than any cult can hope to achieve.

In particular, North Korea clamps down heavily on unauthorized information and makes up a lot of stuff. When your data is bad, it's not too surprising if your conclusions are bad.

Even people who are cynical about the regime probably aren't cynical enough. I forget the book I read this in (The Cleanest Race?) but I recall reading one story about a high-level NK official who was aware of the many abuses, but it wasn't until he learned from the Russian archives that the Korean War had actually been started by Kim Il-Sung after Stalin gave his permission (the official NK version is that the bloodthirsty capitalist SK dictator Syngman Rhee invaded NK unprovoked) that he realized just how far down the rabbit hole he had to go.

5Protagoras11y
Admittedly, from what I recall of Rhee, it's likely that the only reason he didn't invade the North is that he knew how badly he'd lose; it's totally something he would have done if he'd had a better military.
2ikrase11y
Yeah, it's actually enough to make me wonder if just forcing information into the country would trigger a rebellion...
8A1987dM11y
No “I'm not going to vote; just show me the results” option?
4DavidAgain11y
I don't think 'brainwashing' is a helpful or accurate term here, in the sense that I think most people mean it (deliberate, intensive, psychological pressure of various kinds). Presumably most North Koreans who believe such a thing do so because lots of different authority sources say so and dissenting voices are blocked out. I'm not sure it's helpful to call this 'brainwashing', unless we're going to say that people in the middle ages were 'brainwashed' to believe in monarchy, or to be racist, or to favour their country over their neighbours etc. Even outside of repressive regimes, there are probably a whole host of things that most Americans believe that most Brits don't and vice versa, and that's in a case with shared language and culture. I'm not sure 'brainwashing' can be used just because lots of people in one place believe something that hardly anyone from outside does.
2TheAncientGeek10y
There are two theories here. One is that brainwashing is a rare and ineffective thing. The other is that acculturation, or whatever one calls it, is pervasive and effective and largely unnoticed, and the reason the NRMs aren't too effective is that the standard societal indoctrination is hard to budge.
0private_messaging11y
I would estimate 66% or so, on the basis that a multitude of experiments found that about 2/3 of people are considerably more susceptible to authority than the rest, but I am not sure to what extent they managed to kill off the 1/3, or to what extent the 1/3's conditional compliance counts towards "successfully brainwashed". edit: ahh, you say founding dictator. Well, then it could easily be higher, because it's a much less practical thing to think rebellious thoughts about right now.

"about 2/3 of people are more susceptible to authority than the rest"

It would seem that one could replace "2/3" with any other proper fraction and that finding would remain true.

24hodmt11y
Editing the quote to remove the "considerably" changes the meaning. The original is not a tautology because the "considerably" suggests a visible step in the curve.
8wedrifid11y
I didn't remove a word. The original was edited to change the meaning.
0private_messaging11y
Yea, you merely interpreted it in a ridiculous way that was not intended, thus requiring an extra word where none would have been needed if the maxim of relevance held at all.
3wedrifid11y
Your edited version is far more useful. Thank you.
04hodmt11y
My apologies then. It would be useful if LessWrong marked edited posts as edited.
7Douglas_Knight11y
It does mark edited comments, by an * after the date. It does not mark edits to top-level posts or edits by admins (even self-edits by admins, which is clearly a bug).
44hodmt11y
Thanks, I didn't notice the '*'s.
1Mestroyer11y
private_messaging's post is edited. I bet wedrifid quoted it as it originally was, and private_messaging edited it later to change the meaning. Edit2: (to change my post's meaning, heh) or to clarify the original intended meaning. Edit: fixed formatting error caused by not escaping the underscore in private_messaging's name.
0ChristianKl11y
If there were a visible step in the curve, that would be interesting. If anyone has sources that make such a claim, please provide them.
2private_messaging11y
Well, it still seems odd that with different set-ups of, e.g., the Milgram experiment, various conformity experiments, and such, around 2/3 is the number, rather than some dramatically different fraction (which suggests that in practice the change in susceptibility is greater around that percentile, which is of course what I meant). There really is no data to use to get any sort of specific number for North Korea at all, but if you have to guess, you have to name something. I'd be cautious of over-estimating the power of brainwashing over there, especially considering how many people they did have to put through prison camps and such.
1ChristianKl11y
Depending on the specifics used during the Milgram experiment, you get different results. It matters whether the person being tortured is in the same room. Whether or not you use a setting that gives you 2/3 of the people is arbitrary.
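
To make the disputed "visible step in the curve" concrete: below is a minimal sketch (not from the thread; the one-dimensional "susceptibility" score and all distribution parameters are invented purely for illustration) contrasting the tautological reading - in any smooth distribution, the top 2/3 are by definition "more susceptible than the rest" - with a genuine step, i.e. a bimodal population that actually clusters into a compliant ~2/3 and a resistant ~1/3:

```python
# A hypothetical sketch: what a "visible step in the curve" would look like,
# versus the tautological quantile split. Nothing here is real data.
import numpy as np

rng = np.random.default_rng(0)

# Tautological reading: a smooth unimodal distribution. By construction,
# the top 2/3 (or any other fraction) are "more susceptible than the rest".
smooth = rng.normal(loc=0.0, scale=1.0, size=10_000)

# "Step" reading: a bimodal mixture, with a ~2/3 compliant cluster clearly
# separated from a ~1/3 resistant cluster.
step = np.concatenate([
    rng.normal(loc=2.0, scale=0.3, size=6_667),   # compliant cluster
    rng.normal(loc=-2.0, scale=0.3, size=3_333),  # resistant cluster
])

def density_near_cut(scores: np.ndarray, q: float = 1/3, window: float = 0.25) -> float:
    """Fraction of people scoring within +/- `window` of the q-quantile cut.
    A near-zero value means the cut falls in a gap between clusters,
    i.e. the population genuinely splits into distinct groups there."""
    cut = np.quantile(scores, q)
    return float(np.mean(np.abs(scores - cut) < window))

print(f"smooth curve:  {density_near_cut(smooth):.4f}")  # substantial mass near the cut
print(f"stepped curve: {density_near_cut(step):.4f}")    # ~0: a real step at ~1/3
```

On the smooth curve the 1/3 cut lands in a dense region, so where you draw the line is arbitrary (ChristianKl's point); on the stepped curve almost nobody sits near the cut, which is what a non-tautological "about 2/3 are considerably more susceptible" claim would require.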

I would say demonization and ostracism count as coercion. Religions use sexual identity shaming, existential fears, 'universal morality', and promises of eternal happiness in an 'afterlife' to fallaciously bring followers under bit & bridle. As soon as a religious authority stoops to the "you're being controlled by evil spirits" argument, it counts as brainwashing. Cult authorities will use this to demonize any and all forms of skepticism, sexual relationships, skipping worship sessions, or interaction with ex-members. Essentially, if you dis…