It looks like telling people "everyone is biased" might make them less inclined to change their behavior and overcome their biases:

In initial experiments, participants were simply asked to rate a particular group, such as women, on a series of stereotypical characteristics, which for women were: warm, family-oriented and (less) career-focused. Beforehand, half of the participants were told that "the vast majority of people have stereotypical preconceptions." Compared to those given no messages, these participants produced more stereotypical ratings, whether about women, older people or the obese.

Another experiment used a richer measure of stereotyping – the number of clichés participants used in their written account of an older person’s typical day. This time, those participants warned before writing that “Everyone Stereotypes” were more biased in their writings than those given no message; in contrast, those told that stereotyping was very rare were the least clichéd of all. Another experiment even showed that hearing the “Everyone Stereotypes” message led men to negotiate more aggressively with women, resulting in poorer outcomes for the women.

The authors suggest that telling participants that everyone is biased makes being biased seem like less of a big deal: if everyone is doing it, then it's not wrong for me to do it as well. However, it looks like the solution to the problem presented here is to tell a little white lie that prompts people to overcome their biases:

A further experiment suggests a possible solution. In line with the other studies, men given the "Everyone Stereotypes" message were less likely to hire a hypothetical female job candidate who was assertive in arguing for higher compensation. But other men told that everyone tries to overcome their stereotypes were fairer than those who received no information at all. The participants were adjusting their behaviour to fit the group norms, but this time in a virtuous direction.

I don't see how this study does any good unless they first measure the rate at which people actually match the stereotypical preconceptions and then compare that with the two groups' average ratings. Otherwise it is possible that people were becoming less biased, not more.

Data from a number of studies already suggests that people overestimate how much information they can glean from stereotypes. See, for example, the studies involving names and resumes.

Compared to those given no messages, these participants produced more stereotypical ratings, whether about women, older people or the obese.

It would be more interesting to measure the correctness of the ratings. A stereotype, unlike some definitions of "bias", is not automatically wrong; it could just as well be correct. "Men are physically stronger than women" is a stereotype which is correct and useful (the difference has a significant magnitude).

I suppose that depends on the specifics of the experiment; the brief description above doesn't really make it very clear, and the actual paper is paywalled.

When evaluating individuals, facts about the individual should screen off demographic facts. "Men are physically stronger than women" is a statement about central tendency. But an individual you know to be a woman and a prize-winning triathlete is probably stronger than one you know to be a man who watches 14 hours of TV every day.

Or, let's say that expert nuclear engineers are (for whatever reason) 75% men and 25% women. That means if someone tells you, "X is an expert nuclear engineer", your prior for that person being a man is 3:1. However, if you are in a professional nuclear engineering context and meet an individual woman who is introduced to you as an expert nuclear engineer, you should not assign 3:1 odds that this description is wrong and that really she is an administrative assistant or schoolteacher (or an incompetent nuclear engineer; or, for that matter, a man).

Or even 1:1 odds.

Or, you know, 1:20 odds.

In other experiments on biased behavior in hiring — résumé evaluation and the like — the evaluator is presented with detailed facts about the individual, not just their demographic facts. They have a lot to go on besides the person's gender or race or age or whatever. That's how we can be pretty confident that what's being detected is not accurate reasoning about central tendencies (as in your "men are physically stronger than women") but inaccurate reasoning about individual data points.

This formulation of evidence disregards an important feature of Bayesian probability: new evidence incrementally updates your prior according to the predictive weight of the new information. New evidence doesn't completely eradicate the prior. Individual facts do not screen off demographic facts; they are supplementary facts that shift our probability estimate in one direction or another.
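To put rough numbers on that (purely illustrative, using the nuclear-engineer example above and a made-up likelihood ratio): in odds form, the 3:1 demographic prior is never thrown away, it is just multiplied by the likelihood ratio of the new individual evidence, which can easily dominate it.

```python
# Minimal sketch (hypothetical numbers) of the incremental update described
# above, in odds form: posterior odds = prior odds * likelihood ratio.

def update_odds(prior_odds, likelihood_ratio):
    """Odds-form Bayes: the prior is scaled by the evidence, not discarded."""
    return prior_odds * likelihood_ratio

# Demographic prior from the example above: an expert nuclear engineer is
# a man with 3:1 odds.
prior_odds_man = 3.0

# New individual evidence: the engineer in front of you presents as a woman.
# Assume (hypothetically) this observation is 99x more likely if she is in
# fact a woman, i.e. a likelihood ratio of 1/99 in favor of "man".
lr_man = 1 / 99

posterior_odds_man = update_odds(prior_odds_man, lr_man)
posterior_prob_man = posterior_odds_man / (1 + posterior_odds_man)
print(f"Posterior P(man) = {posterior_prob_man:.3f}")  # ~0.029
```

The prior still matters (0.029 is not zero), but the individual evidence carries far more weight, which is consistent with the conclusion that you shouldn't assign anything like 3:1 odds against the description being accurate.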

This is all true, but doesn't seem relevant. The study description says:

participants were simply asked to rate a particular group

That sounds like rating the group, not individuals. It sounds like being asked about the validity of the stereotype itself. And I'm pretty sure the stereotypes mentioned as examples are in fact true:

a series of stereotypical characteristics, which for women were: warm, family-oriented and (less) career-focused

The only question is the magnitude of the true stereotypical difference, and whether people estimate it correctly.

[-]Jiro

I don't think it would be right even when applied to individuals. If someone tells you "X is an expert nuclear engineer" and you know that X is a woman, the prior for nuclear engineers being male no longer applies, because you can observe that X is a woman with 100% certainty. But in the resume evaluation example, what the resume evaluator wants to discover (how good a worker the applicant is) is not something that he can observe. It is true, of course, that the more detailed facts on the resume also should affect the evaluator's result, but that just means that both the applicant's race/sex and the other facts should affect the result. Even if the sex/race has a small positive correlation with being a good worker and the other facts have a larger positive correlation, the evaluator is better off using both race and the other facts rather than using the other facts alone.
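A toy numerical version of that last point (hypothetical likelihood ratios, and assuming the two signals are independent): combining a weakly informative demographic signal with strongly informative resume facts nudges the estimate slightly, rather than being either decisive or irrelevant.

```python
# Sketch with made-up numbers: in odds form, independent pieces of evidence
# multiply, so a weak demographic signal slightly adjusts the estimate the
# resume alone would give, rather than being ignored or dominating.

prior_odds_good = 1.0   # 50/50 on "good worker" before seeing anything
lr_resume = 4.0         # strong, hypothetical likelihood ratio from the resume
lr_group = 1.1          # weak, hypothetical likelihood ratio from group membership

odds_resume_only = prior_odds_good * lr_resume
odds_both = prior_odds_good * lr_resume * lr_group

def to_prob(odds):
    """Convert odds to probability."""
    return odds / (1 + odds)

print(f"P(good worker | resume only)    = {to_prob(odds_resume_only):.3f}")  # 0.800
print(f"P(good worker | resume + group) = {to_prob(odds_both):.3f}")         # 0.815
```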

I wonder if it also increases expectations of in-group bias by other groups.

It's possible that rather than something complex relating to the word "stereotype" itself or "giving people permission to stereotype" or whatever else comes to mind, this is just due to "stereotype" being a negative word with negative associations. In contrast, saying "most people don't stereotype" simplifies into "most people don't do bad things" for the average study participant, and will produce positive feelings in most participants.

For example, there was hubbub a while ago about people being more likely to stereotype if they're in a dirty place. Maybe just being told negative things about others and experiencing negativity in general makes people more likely to do negative things such as stereotype, lie, cheat, attribute poor intentions to others, and so on. And there's also the finding that judges convict more when they're hungry.

I wonder if people were less likely to hire anyone, and being a member of some stereotyped group just makes you proportionately more of a target for people who are tired and having a bad day.

[-][anonymous]

I'm biased toward the truth.

Lots of people want to be, but they somehow don't end up agreeing on what the truth is. That's part of why we bother with this science stuff.

[-][anonymous]

That doesn't mean I know the truth; I just want to be as close to it as possible. Probably better than throwing wild guesses around.