
Asch's Conformity Experiment

Post author: Eliezer_Yudkowsky 26 December 2007 07:03AM

Solomon Asch, with experiments originally carried out in the 1950s and well-replicated since, highlighted a phenomenon now known as "conformity".  In the classic experiment, a subject sees a puzzle like the one in the nearby diagram:  Which of the lines A, B, and C is the same size as the line X?  Take a moment to determine your own answer...

The gotcha is that the subject is seated alongside a number of other people looking at the diagram—seemingly other subjects, actually confederates of the experimenter.  The other "subjects" in the experiment, one after the other, say that line C seems to be the same size as X.  The real subject is seated next-to-last.  How many people, placed in this situation, would say "C"—giving an obviously incorrect answer that agrees with the unanimous answer of the other subjects?  What do you think the percentage would be?

Three-quarters of the subjects in Asch's experiment gave a "conforming" answer at least once.  A third of the subjects conformed more than half the time.

Interviews after the experiment showed that while most subjects claimed to have not really believed their conforming answers, some said they'd really thought that the conforming option was the correct one. 

Asch was disturbed by these results:

"That we have found the tendency to conformity in our society so strong... is a matter of concern.  It raises questions about our ways of education and about the values that guide our conduct."

It is not a trivial question whether the subjects of Asch's experiments behaved irrationally.  Robert Aumann's Agreement Theorem shows that honest Bayesians cannot agree to disagree—if they have common knowledge of their probability estimates, they have the same probability estimate.  Aumann's Agreement Theorem was proved more than twenty years after Asch's experiments, but it only formalizes and strengthens an intuitively obvious point—other people's beliefs are often legitimate evidence.

If you were looking at a diagram like the one above, but you knew for a fact that the other people in the experiment were honest and seeing the same diagram as you, and three other people said that C was the same size as X, then what are the odds that only you are the one who's right?  I lay claim to no advantage of visual reasoning—I don't think I'm better than an average human at judging whether two lines are the same size.  In terms of individual rationality, I hope I would notice my own severe confusion and then assign >50% probability to the majority vote.
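That update can be made concrete with a toy model (my own sketch, not anything from Asch's setup; the per-observer accuracy p and the even split of errors between the two wrong lines are assumptions):

```python
# Toy Bayesian model: each honest observer independently reports the
# true line with probability p, and errs toward each of the two wrong
# lines with probability (1 - p) / 2.

def posterior_c(n_confederates, p=0.9):
    """P(C is correct | n confederates say C, my own eyes say B)."""
    # Likelihood ratio contributed by one report of "C":
    #   P(says C | C true) / P(says C | B true) = p / ((1 - p) / 2)
    r = p / ((1 - p) / 2)
    # n reports of "C", weighed against my single perception of "B"
    # (my own eyes contribute a factor of 1/r toward C):
    odds_c = r ** n_confederates / r
    return odds_c / (1 + odds_c)

for n in (1, 3, 15):
    print(n, posterior_c(n))
```

With p = 0.9, one agreeing confederate leaves you at even odds against your own eyes, three push the posterior past 99%, and fifteen make it overwhelming; on this model, the majority vote easily deserves >50%.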

In terms of group rationality, it seems to me that the proper thing for an honest rationalist to say is, "How surprising, it looks to me like B is the same size as X.  But if we're all looking at the same diagram and reporting honestly, I have no reason to believe that my assessment is better than yours."  The last sentence is important—it's a much weaker claim of disagreement than, "Oh, I see the optical illusion—I understand why you think it's C, of course, but the real answer is B."

So the conforming subjects in these experiments are not automatically convicted of irrationality, based on what I've described so far.  But as you might expect, the devil is in the details of the experimental results.  According to a meta-analysis of over a hundred replications by Bond and Smith (1996):

Conformity increases strongly up to 3 confederates, but doesn't increase further up to 10-15 confederates.  If people are conforming rationally, then the opinion of 15 other subjects should be substantially stronger evidence than the opinion of 3 other subjects.

Adding a single dissenter—just one other person who gives the correct answer, or even an incorrect answer that's different from the group's incorrect answer—reduces conformity very sharply, down to 5-10%.  If you're applying some intuitive version of Aumann's Agreement to think that when 1 person disagrees with 3 people, the 3 are probably right, then in most cases you should be equally willing to think that 2 people will disagree with 6 people.  (Not automatically true, but true ceteris paribus.)  On the other hand, if you've got people who are emotionally nervous about being the odd one out, then it's easy to see how a single other person who agrees with you, or even a single other person who disagrees with the group, would make you much less nervous.

Unsurprisingly, subjects in the one-dissenter condition did not think their nonconformity had been influenced or enabled by the dissenter.  Like the 90% of drivers who think they're in the top 50%, some of them may be right about this, but not all.  People are not self-aware of the causes of their conformity or dissent, which weighs against trying to argue that their behavior is a manifestation of rationality.  For example, under the hypothesis that people are socially-rationally choosing to lie in order to not stick out, it appears that (at least some) subjects in the one-dissenter condition do not consciously anticipate the "conscious strategy" they would employ when faced with unanimous opposition.

When the single dissenter suddenly switched to conforming to the group, subjects' conformity rates went back up to just as high as in the no-dissenter condition.  Being the first dissenter is a valuable (and costly!) social service, but you've got to keep it up.

Consistently within and across experiments, all-female groups (a female subject alongside female confederates) conform significantly more often than all-male groups.  Around one-half the women conform more than half the time, versus a third of the men.  If you argue that the average subject is rational, then apparently women are too agreeable and men are too disagreeable, so neither group is actually rational...

Ingroup-outgroup manipulations (e.g., a handicapped subject alongside other handicapped subjects) similarly show that conformity is significantly higher among members of an ingroup.

Conformity is lower in the case of blatant diagrams, like the one at the top of this page, versus diagrams where the errors are more subtle.  This is hard to explain if (all) the subjects are making a socially rational decision to avoid sticking out.

Added: Paul Crowley reminds me to note that when subjects can respond in a way that will not be seen by the group, conformity also drops, which also argues against an Aumann interpretation.

 

Part of the Death Spirals and the Cult Attractor subsequence of How To Actually Change Your Mind

Next post: "Lonely Dissent"

Previous post: "Two Cult Koans"


Asch, S. E. (1956). Studies of independence and conformity: A minority of one against a unanimous majority. Psychological Monographs, 70.

Bond, R. and Smith, P. B. (1996). Culture and conformity: A meta-analysis of studies using Asch's (1952b, 1956) line judgment task. Psychological Bulletin, 119, 111-137.

Comments (54)

Comment author: James_Bach 26 December 2007 08:25:49AM 0 points [-]

I don't see this exercise as being so much about rationality as it is about our relationship with dissonance. People in my community (context-driven software testers) are expected to treat confusion or controversy as itself evidence of a potentially serious problem. For the responsible tester, such evidence must be investigated and probably raised as an issue to the client.

In short, in the situation given in the exercise, I would not answer the question, but rather raise some questions.

I drive telephone surveyors nuts in this way. They just don't know what to do with a guy who answers "no opinion" or "I don't know" or "can't answer" to every single question in their poorly worded and context-non-specific questionnaires.

Comment author: James_Annan 26 December 2007 09:39:15AM 0 points [-]

Robert Aumann's Agreement Theorem shows that honest Bayesians cannot agree to disagree - if they have common knowledge of their probability estimates, they have the same probability estimate.

Um, doesn't this also depend on them having common priors?

James

Comment author: pnrjulius 09 April 2012 05:32:16AM 2 points [-]

Yes. More importantly, it depends on them being honest Bayesians, which humans are not.

Comment author: Vladimir_Nesov2 26 December 2007 09:48:07AM 1 point [-]

It feels like there was no explicit rule not to ask questions. It's interesting what percentage of subjects actually questioned the process.

If people are conforming rationally, then the opinion of 15 other subjects should be substantially stronger evidence than the opinion of 3 other subjects.

I don't see how a moderate number of other wrong-answering subjects should influence the decision of a rational subject, even if it's strictly speaking stronger evidence, as uncertainty in your own sanity should be much lower than the probability of alternative explanations for the wrong answers of other subjects.

Comment author: Paul_Crowley2 26 December 2007 12:29:32PM 3 points [-]

The video notes that when the subject is instructed to write their answers, conformity drops enormously. That suggests we can set aside the hypothesis that they conform for the rational reason you set out.

Comment author: anonymous7 26 December 2007 12:33:16PM 7 points [-]

90% of drivers can be better than the average.

Comment author: pnrjulius 09 April 2012 05:32:57AM 1 point [-]

Only in a hella skewed distribution, far from the observed distribution of actual driving behavior.

Comment author: downie 22 June 2014 01:45:02PM 2 points [-]

Depends on how you measure it. For example, 99.9% of drivers have caused a below-average number of road fatalities.

Comment author: Chris 26 December 2007 12:39:31PM 2 points [-]

'This may come as some surprise' to Asch &amp; Aumann, but rationality is not the design point of the human brain (otherwise this blog would have no reason to exist), getting by in the real world is. And getting by in the real world involved, for our ancestors through tens of millennia, group belonging, hence group conformity. See J. Harris, 'No Two Alike', Chaps. 8 &amp; 9 for a discussion which references the Asch work. This does not mean of course that group conformity was the only adaptation factor. Being right and being 'in' both had (and have...) fitness value, and it's perfectly natural that both tendencies exist, in tension.

Comment author: halcyon 08 April 2012 12:01:56PM 0 points [-]

traditional culture =/= the human brain

Comment author: Steve_Shervais 26 December 2007 12:49:19PM 2 points [-]

At an applied level, this reminds me of Dr. Jerry B. Harvey’s discussion of the "Abilene Paradox" in management, where groupthink can take over and move an organization in a direction that no-one really wants to go. All it takes is one dissenter to break the spell.

Comment author: Recovering_irrationalist 26 December 2007 01:58:03PM 0 points [-]

Surely there's more than social conformity/conflict aversion at work here? In the experiment in the video, an expectation of pattern continuation is set up. For most questions, the 4 spoken words the subject hears before responding do correspond to the apparently correct spoken word response. I'd expect subconscious processes to start interpreting this as an indicator of the correct answer regardless of social effects and be influenced accordingly, at least enough to cause confusion which would then increase susceptibility to the social effects.

I'd expect this effect to also be reduced where the subject is writing down his answers, as that takes out of the equation the close connection between hearing spoken numbers and speaking spoken numbers.

Comment author: Caledonian2 26 December 2007 02:37:51PM 2 points [-]

Aumann's Agreement Theorem was proved more than twenty years after Asch's experiments, but it only formalizes and strengthens an intuitively obvious point - other people's beliefs are often legitimate evidence.

No, other people's beliefs are often treated as evidence, and very powerful evidence at that.

Belief is not suitable as any kind of evidence when more-direct evidence is available, yet people tend to reject direct evidence in order to conform with the beliefs of others.

The human goal usually isn't to produce justified predictions of likelihood, but to ingratiate ourselves with others in our social group.

What are you attempting to do, Eliezer?

Comment author: Joshua 12 February 2011 08:03:06PM 0 points [-]

Isn't this exactly what was said in Hug The Query? I'm not sure I understand why you were down voted.

Comment author: Blueberry 12 February 2011 10:41:18PM 0 points [-]

Caledonian was a well-known LW troll who would frequently make vague, unreadable, critical, somewhat hostile remarks.

Comment author: pnrjulius 09 April 2012 05:33:52AM 0 points [-]

So it's guilt by association.

Comment author: Kenny 21 January 2013 10:16:24PM 0 points [-]

"Belief is not suitable as any kind of evidence when more-direct evidence is available ..." is more like 'You Can Only Ever Hug The Query By Yourself'.

Comment author: StuartBuck 26 December 2007 02:51:13PM 3 points [-]

FYI, if you look at Asch's 1955 Scientific American article, the lines on the cards were a little closer in length than in the example shown above.

Comment author: Steve 26 December 2007 05:09:07PM 2 points [-]

my vision is so bad that i answered 'none of the above'. i had to decide to measure the lines. that meant i first had to get to where i did not think the trick was the question. that took a cup of tea. 'trust the ruler, not the vision' has been added to my list of -ings.

Comment author: Nominull2 26 December 2007 05:10:46PM 1 point [-]

Isn't it reasonable to find it more likely that people are lying than that something has gone that flagrantly wrong with my ability to judge sizes of lines?

Comment author: pnrjulius 09 April 2012 05:34:43AM 0 points [-]

Not necessarily. Maybe your eyes are very bad, or you've suffered a stroke. (Though maybe you should be concerned about that and halt the experiment, rather than just agreeing.)

Comment author: Unknown_Healer 26 December 2007 05:56:52PM 1 point [-]

"Belief is not suitable as any kind of evidence when more-direct evidence is available, yet people tend to reject direct evidence in order to conform with the beliefs of others."

Caledonian, this is just wrong. Our ability to interpret evidence is not infallible, and is often fallible in ways that are not perfectly correlated across individuals. So even if we share the same 'direct evidence' as other observers of equal ability, their beliefs are still relevant.

Comment author: Psy-Kosh 26 December 2007 06:50:32PM 1 point [-]

Except we'd have to take into account the idea that the others whose beliefs we are using as evidence may themselves have been using the same idea... That results in the beliefs of an initial group being weighted greatly above and beyond what they should be, no?

Comment author: Sebastian_Hagen2 26 December 2007 07:49:45PM 4 points [-]

Robert Aumann's Agreement Theorem shows that honest Bayesians cannot agree to disagree - if they have common knowledge of their probability estimates, they have the same probability estimate.

In addition to what James Annan said, they also both have to know (with very high confidence) that they are in fact honest bayesians. Both sides being honest isn't enough if either suspects the other of lying.

Comment author: Richard_Kennaway 26 December 2007 09:21:50PM 3 points [-]

In terms of individual rationality, I hope I would notice my own severe confusion and then assign >50% probability to the majority vote.

Noticing your own severe confusion should lead to investigating the reasons for the disagreement, not to immediately going along with the majority. Honest Bayesians cannot agree to agree either. They must go through the process of sharing their information, not just their conclusions.

Comment author: Dave3 27 December 2007 03:17:39AM 2 points [-]

What are the odds, given today's society, that a randomly selected group of people will include any honest Bayesians? Safer to assume that most of the group are either lying, self-deluded, confused, or have altered perceptions. Particularly so in a setting like a psychology experiment.

Comment author: pnrjulius 09 April 2012 05:36:47AM 0 points [-]

Strict honest Bayesians? ZERO. (Not even LW contains a single true honest Bayesian.)

Approximations of honest Bayesians? Better than you might think. Certainly LW is full of reasonably good approximations, and in studies about 80% of people are honest (though most people assume that only 50% of people are honest, a phenomenon known as the Trust Gap). The Bayesian part is harder, since people who are say, religious, or superstitious, or believe in various other obviously false things, clearly don't qualify.

Comment author: Jason_Brennan 27 December 2007 04:40:14AM 0 points [-]

Check out this paper:

Gregory S. Berns, Jonathan Chappelow, Caroline F. Zink, Giuseppe Pagnoni, Megan E. Martin-Skurski, and Jim Richards, “Neurobiological Correlates of Social Conformity and Independence During Mental Rotation,” Biological Psychiatry 58 (2005), pp. 245-253.

It claims that the conformists can, under some conditions, actually come to see the world differently.

Comment author: Psy-Kosh 27 December 2007 07:39:23AM 0 points [-]

Oh, one other thing. I know it's been brought up before, but as far as the agreement theorem, I don't feel I can safely use it. What I mean is that it seems I don't understand exactly when it can and cannot be used. Specifically, I know that there's something I'm missing here, some understanding because I don't know the correct way to resolve things like agreement theorem vs quantum suicide.

It's been discussed, but I haven't seen it resolved, so until I know exactly why agreement theorem does not apply there (or why the apparently straightforward (to me) way of computing the quantum suicide numbers is wrong), I'd personally be really hesitant to use the agreement theorem directly.

Comment author: pnrjulius 09 April 2012 05:38:58AM 1 point [-]

The quantum suicide numbers are wrong because of the Born probabilities, and also the fact that consciousness is not an either-or phenomenon. The odds of losing 99% of your consciousness may be sufficiently high that you effectively have no consciousness left. (Also: Have you ever been unconscious? Apparently it is possible for you to find yourself in a universe where you WERE unconscious for a period of time.)

Also, I'm convinced that Many-Worlds is a dead end and Bohm was right, but I know I'm in the minority on LW.

Comment author: Unknown 27 December 2007 10:32:40AM 1 point [-]

Perhaps Eliezer or someone else can check the math, but according to my calculations, if you use Nick Bostrom's SSSA (Strong Self-Sampling Assumption), and make the reference class "observers after a quantum suicide experiment", then if the prior probability of quantum immortality is 1/2, after a quantum suicide experiment has been performed with the person surviving, both the outside observer and the person undergoing the risk of death should update the probability of quantum immortality to 4/7, so that they end up agreeing.

This seems odd, but it is based on the calculation that if the probability of quantum immortality is 1/2, then the probability of ending up being an observer watching the experiment is 17/24, while the probability of being an observer surviving the experiment is 7/24. How did I derive this? Well, if Quantum Immortality is true, then the probability of being an observer watching the experiment is 2/3, because one observer watches someone die, one observer watches someone survive, and one observer experiences survival. Likewise if QI is true, the probability of being an observer surviving the experiment is 1/3. On the other hand, if QI is false, the probability of being an observer watching the experiment is 3/4 (I will leave this derivation to the reader), while the probability of being an observer surviving the experiment is 1/4.

From this it is not difficult to derive the probabilities above, that the probability of being a watcher is 17/24, and the probability of being a survivor 7/24. If you apply Bayes's theorem to get the probability of QI given the fact of being a survivor, you will get 4/7. You will also get 4/7 if you update your probabilities both on the fact of being a watcher and on the fact of seeing a survivor. So the two end up agreeing.
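In code, the arithmetic above checks out (this just re-runs the probabilities exactly as set up in this comment, using exact fractions):

```python
from fractions import Fraction as F

prior_qi = F(1, 2)  # prior probability of quantum immortality

# Observer-selection probabilities from the setup above.
# If QI is true there are 3 observer-moments: a watcher who sees death,
# a watcher who sees survival, and the survivor himself.
p_watch_qi, p_survive_qi = F(2, 3), F(1, 3)
# If QI is false, branches are weighted classically (50% survival).
p_watch_no, p_survive_no = F(3, 4), F(1, 4)

p_watcher = prior_qi * p_watch_qi + (1 - prior_qi) * p_watch_no
p_survivor = prior_qi * p_survive_qi + (1 - prior_qi) * p_survive_no
p_qi_given_survivor = prior_qi * p_survive_qi / p_survivor

# A watcher who sees the subject survive:
p_see_survive_qi = F(1, 3)            # one of the three QI observer-moments
p_see_survive_no = F(1, 2) * F(1, 2)  # watcher, in the survival branch
p_qi_given_watch = (prior_qi * p_see_survive_qi /
                    (prior_qi * p_see_survive_qi +
                     (1 - prior_qi) * p_see_survive_no))

print(p_watcher, p_survivor, p_qi_given_survivor, p_qi_given_watch)
```

This prints 17/24, 7/24, 4/7, and 4/7: the survivor and the watcher who sees survival both update to 4/7, agreeing as claimed.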

Intuitive support for this is the fact that if a QI experiment were actually performed, and we consider the viewpoint of the one surviving 300 successive trials, he would certainly conclude that QI was true, and our intuitions say that the outside observers should admit that he's right.

Comment author: pnrjulius 09 April 2012 05:40:13AM 0 points [-]

Interesting. If that's right, then clearly QI is wrong, because we've watched people die.

Comment author: Unknown 27 December 2007 11:32:00AM 0 points [-]

In the above calculation I forgot to mention that for simplicity I assumed that the experiment is such that one would normally have a 50% chance of survival. If this value is different, the values above would be different, but the fact of agreement would be the same (although there would also be the difficulty that a chance other than 50% is not easy to reconcile with a many-worlds theory anyway.)

Comment author: Nick_Tarleton 27 December 2007 03:35:45PM 1 point [-]

Quantum suicide vs. Aumann has been discussed a couple times before, and yes, it's very confusing.

Intuitive support for this is the fact that if a QI experiment were actually performed, and we consider the viewpoint of the one surviving 300 successive trials, he would certainly conclude that QI was true, and our intuitions say that the outside observers should admit that he's right.

My intuitions say outside observers should not update their estimates one bit, and I'm pretty sure this is correct, unless they should also increase their probability of MWI on making the equivalent observation of a coin coming up heads 300 times in a row.

(although there would also be the difficulty that a chance other than 50% is not easy to reconcile with a many-worlds theory anyway.) http://www.hedweb.com/everett/everett.htm#probabilities http://hanson.gmu.edu/mangledworlds.html

Comment author: steven 27 December 2007 04:01:45PM 1 point [-]

IMHO quantum immortality and quantum suicide (unlike MWI) are nonsense, but I'm still trying to figure out a way to say this that convinces other people.

For probabilities in MWI I recommend the work of David Wallace.

Comment author: Unknown 27 December 2007 04:09:27PM 0 points [-]

Nick, my argument didn't depend on intuition except for support; so it doesn't bother me if your intuition differs. What was your opinion of the argument (or did I simply omit too many of the details to judge)?

Comment author: Someone 27 December 2007 06:10:40PM 1 point [-]

I think the most interesting question that arises from these experiments is what's the difference in personality between people who dissent and people who conform (aside from the obvious).

Comment author: pnrjulius 09 April 2012 05:44:39AM 1 point [-]

I would guess that if we did a study using the usual Big Five, a single personality trait would drive most of the variance, the one called "agreeableness". Unfortunately this is not actually one trait, we just treat it like it is; there's no particular reason to think that conformity is correlated with empathy, for example, yet they are both considered "agreeableness". (This is similar to the problem with the trait "Belief in a Just World", which includes both the belief that a just world is possible and the belief that it is actual. An ideal moral person would definitely believe in the possibility; but upon observing a single starving child they would know that it is not actual. Hence should they be high, or low, in "Belief in a Just World"?)

Comment author: Psy-Kosh 27 December 2007 09:06:33PM 0 points [-]

Unknown: Hrm, hadn't thought of using the SSSA. Thanks. Ran through it myself by hand now, and it does seem to result in the experimenter and test subject agreeing.

However, it produces an... oddity. Specifically, if using the SSSA, then by my calculations, when one takes into account that the external observer and the test subject are not the only people in existence, the actual strength of evidence extractable from a single quantum suicide experiment would seem to be relatively weak. If the ratio of non test subjects to test subjects is N, and the probability of the subject surviving simply by the nature of the quantum experiment is R, the likelihood ratio is (1+N)/(R+N) (which both the test subject and the external observer would agree on). Seeing a nonsurvival gives a MWI to ~MWI likelihood ratio of N/(R+N). At least, assuming I did the math right. :)

Anyways, so it looks like if SSSA is valid, quantum suicide doesn't actually give very strong evidence one way or the other at all, does it?

Hrm... I wonder if in principle it could be used to make estimates about the total population of the universe by doing it a bunch of times and then analyzing the ratios of observed results... *chuckles* May have just discovered the maddest way to do a census, well, ever.
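Taking the stated ratios at face value, a quick numerical sketch (of the formulas as given above, not a check of their derivation) shows how the evidence washes out:

```python
# N = ratio of non-test-subjects to test subjects,
# R = ordinary (single-world) survival probability of the experiment.

def survival_lr(N, R):
    """MWI : ~MWI likelihood ratio on seeing the subject survive."""
    return (1 + N) / (R + N)

def nonsurvival_lr(N, R):
    """MWI : ~MWI likelihood ratio on seeing the subject die."""
    return N / (R + N)

for N in (1, 10, 1e6):
    print(N, survival_lr(N, 0.5), nonsurvival_lr(N, 0.5))
```

Both ratios approach 1 as N grows, which is the sense in which a single experiment tells you very little when the universe contains many other observers.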

Comment author: pnrjulius 09 April 2012 05:46:32AM 1 point [-]

Clearly it can't actually matter what the population of the universe is. (There's nothing about the experiment that is based on that! It would be this bizarre nonlocal phenomenon that pops out of the theory without being put into it!) That's the kind of weirdness you come up with if you do anthropic calculations WRONG.

Comment author: Psy-Kosh 27 December 2007 10:18:58PM 0 points [-]

Actually, if considering the SSSA instead of just the SSA, one has to take into account all the observer-moments, past and future, right? So there will be, in addition to the specific observer moments of "immediately post experiment test subject (or not), experimenter, everyone else...", past and future versions thereof, and of other entities, so you'll have K1 total "others" (other observer-moments, that is) in a MW universe, and K2 << K1 "others" in a single-world universe.

This'll make the calculation a bit more confusing.

Comment author: Sam4 08 January 2008 11:28:24AM 0 points [-]

"... then what are the odds that only you are the one who's right?"

If this is the reasoning for people choosing the same answer then surely it becomes a question of confidence rather than conformity?

Choosing the same answer as the group in your argument is because you aren't confident in your answer and are willing to defer to the majority answer. Not necessarily the same as conformity. By your own reasoning you are going with the group because you think their answer is "better" not because you want to be part of the group. I know you can argue that that is just your rationale for conformity, but I feel that conformity is more about doubting something you are sure you know, to side with a group, rather than doubting something you think you might know.

I feel possibly a more accurate test (using this reasoning for conformity) would be to take a group and tell all the members individually that only they will know the right answer. Then give all bar one the same answer and one a different answer and see if they will conform with the group.

Comment author: Leeroy_Jenkins 08 May 2008 03:42:45PM -1 points [-]

I believe that the subjects were of those of a non-matured state, thus making them of a "childish" mind and not able to process the situation. The subjects would simply say anything their peers would say or do. I am testing this experiment on my classmates. I am in the 10th grade and will respond back with the solution. I believe that a matured mind would not give in so easily with a simple question. It is not the question at hand that is making the subjects say something completely incorrect, it is the group pressure and the maturity of the subjects. If a child's mind thinks he or she is to believe that of another subject, then it shall think of that at hand. Children's minds are so open and naive that they will believe something as simple as Santa Claus coming down the chimney every year, then they will not hesitate to think of an answer to the question of this experiment. It is a simple and most uneducated experiment I had to present and test. A matured mind will think not of the group pressure but that of the question. I will be back with my results. Thank you.

Leeroy Jenkins

Comment author: pnrjulius 09 April 2012 05:47:15AM 1 point [-]

These were adult subjects, so by your (unusual) definition most adults are "immature".

Comment author: Sadun_Kal 20 August 2008 01:50:12AM 0 points [-]

"I believe that the subjects were of those of a non-matured state..."

I guess that's the difference between being biased or not. I think your understanding of a "mature mind" equals an "unbiased mind" which is not present in all the adults. And of course the result of this experiment would have been different if it were conducted on the readers of this website.

Comment author: BlackHumor 02 November 2010 01:36:10AM 0 points [-]

I don't see why you think that 3 extra people, no matter if they're honest or not, amount to any significant amount of evidence when you can see the diagram yourself.

Sure, maybe they're good enough if you can't see the diagram; 3 people thinking the same thing doesn't often happen when they're wrong. But when they are wrong, when you can see that they are wrong, then it doesn't matter how many of them there are.

Also: certainly the odds aren't high that you're right if we're talking totally random odds about a proposition where the evidence is totally ambiguous. But since there is a diagram, the odds then shift to either the very low probability "My eyesight has suddenly become horrible in this one instance and no others" combined with the high probability "3/4 people are right about a seemingly easy problem", or the low probability "3/4 people are wrong about a seemingly easy problem" combined with the high probability "My eyesight is working fine".

I don't know the actual numbers for this, but it seems likely that the probability of your eyesight suddenly malfunctioning in strange and specific ways is worse than the probability of 3 other people getting an easy problem wrong. Remember, they can have whatever long-standing problems with their eyesight or perception or whatever anyone cares to make up. Or you could just take the results of Asch's experiment as a prior and say that they're not that much more impressive than 1 person going first.

(All this of course changes if they can explain why C is a better answer; if they have a good logical reason for it despite how odd it seems, it's probably true. But until then, you have to rely on your own good logical reason for B being a better answer.)

Comment author: handoflixue 24 May 2011 11:53:11PM *  1 point [-]

"I hope I would notice my own severe confusion and then assign >50% probability to the majority vote."

On a group level, I wouldn't think it's a particularly rational path to mimic the majority, even if you believe that they're honestly reporting. If you had a group of, say, 10 people, and the first 5 all gave the wrong answer, there would then be a rational impetus for everyone subsequent to mimic that wrong answer on the logic that "the last (5-9) people all said C, so clearly p(C) > 0.5".

Far better to dissent and provide the group with new information.

Comment author: pnrjulius 09 April 2012 05:49:22AM 1 point [-]

Ooh, that's really interesting. The best solution might actually be to say the full statement, "I see B as equal, but since the other 5 people before me said C, C is probably objectively more likely." Then future people after you can still hear what you saw, independently of what you inferred based on others.

But I think there are a lot of other really interesting problems embedded in this, involving the feedback between semi-Bayesians trying to use each other to process evidence. (True Bayesians get the right answer; but what answer do semi-Bayesians get?)

Comment author: smallricochet 03 September 2011 10:16:58PM 0 points [-]

I don't claim to have much knowledge in this, which gives me free rein to offer this little example: Four girls, A, B, C and D, were talking about Girl E, all of it negative. Girl A finally says, "Girl E is my friend, I don't see a problem with her." All four girls stopped talking about Girl E, suddenly uncomfortable. (This was not because they were afraid of offending Girl A, but because they truly didn't have anything against Girl E in the first place.) But there was immense pressure on Girl A to just conform and keep gossiping, which is uncannily similar to the groupthink example (though that was on a bigger, more consequential scale).

I always thought it was social-rational instinct, a mix between pre-conceived morals, personality, and consideration for the future. Nothing experiments can so specifically pin down, but sort of obvious nonetheless?

Well, think of it this way: if everyone were so confident in themselves that they dissented whenever their view didn't conform with someone else's, where would humans be? (Of course, conformity also led to monarchies, dictatorships, and finally, democracy. Where is the fine line between wanting things to get done, and wanting things done with everyone's opinion heard? Variety is good; you need both.)

I hope that was all on track and relevant.

Comment author: pnrjulius 09 April 2012 05:32:01AM 1 point [-]

This gives us a very good reason to publicize dissenting opinions about just about anything---even perhaps when we think those dissents are wrong. Apparently the mere presence of a dissenter damages groupthink and allows true answers a much better chance to emerge.

Comment author: avichapman 01 June 2012 01:53:00AM 1 point [-]

I was all set to ask whether the result of female groups' increased conformity had any explanatory power over the question of why there aren't more women in the rationalist movement. Then as I read on, it became less likely that female psychology had anything to do with it; rather, in-group vs. out-group psychology did. Males, being the more socially privileged gender, are more likely to see themselves as 'just normal' rather than as part of a particular group called 'males'.

Of course, this lends itself to predictions. In a group that self-identifies strongly as a group (such as women, minority ethnicities, etc.), if the group is very into a particular subject, its members will also likely be into it. Whereas in a group that is less likely to self-identify (such as American Caucasians, Americans within American borders (but not abroad), and men), conformity of interests will be weaker.

Have there been any studies done to test this minority vs majority group conformity idea?

Comment author: avichapman 04 June 2012 12:27:36AM 0 points [-]

I'm not upset about losing points for this post, but I am a bit confused about it. Many out there know more about this stuff than I do. Did I say something factually inaccurate or engage in bad reasoning? I want to know so that I don't repeat my mistake.

Comment author: TimS 04 June 2012 02:46:08AM 0 points [-]

Your first paragraph mentions a highly contested thesis that you admit is irrelevant to the evidence. Your second paragraph seems to assert that dominant groups do not strongly self-identify - which seems empirically false; consider spontaneous chants of "USA, USA, USA".

Also, you are using some quasi-technical jargon less precisely than the terms are usually used - and your misuses seem to be directed at supporting a particular ideological position.

But that's just the sense of someone who probably has a contrary ideological position, so I'm not sure how I would recommend you generalizing from my impression. (and the downvote is gone at the moment I'm writing this - was it just one? Just ignore those if you can't figure them out.)

Comment author: avichapman 04 June 2012 03:25:30AM 0 points [-]

Ah.

I had suspected that it might be because someone had tried to infer my position on such matters from my asking of the question and didn't like the implication. I did, after all, admit to including the thesis that 'the observed high conformity of a group of females is influenced by an aspect of female psychology' in my list of possible explanations for the high conformity in that group, even though I ended up rejecting that hypothesis.

(I suspect that your position vis-à-vis whether either gender is superior is not that different from my own. But to be clear, my position is that both genders possess great capacity for type 2 cognition, which is the most important measurement of human success. Any difference between healthy adults of either gender in their use of such cognition comes down to social factors, which can be changed to create a fairer society.)

I'm still surprised about the second paragraph's inaccuracy, though. In my experience, the chants of "USA, USA, USA" occur at sporting matches against other countries. That's not an 'internal to America' thing. Then again, I don't live in America and haven't for many years. I chose America because I was trying to tailor my words to my audience. Perhaps that was wrong and I should have spoken from experience instead. (I'm Australian.)

I want to use every word accurately, so I would be most appreciative if you could give me a few examples of jargon I've used and a description (or link to one) of the way it should actually be used.

Thanks, Avi

PS - Yes, it was just one vote, so maybe I got re-upvoted or something. Oh well. The experience alerted me to an issue; that's all anyone could ask of it.