Lumifer comments on Welcome to Less Wrong! (8th thread, July 2015) - Less Wrong
Hello everyone,
I'm a PhD student in social psychology focusing my time mainly on applied statistics and quantitative methods for the study of brain and behavior. My research focuses on the way that people's goals influence the way they reason and form judgments, but I've also dabbled a bit in self-regulation/self-control.
Perhaps my attraction to this community stems from the fact that my field feels like an unfriendly environment for the free exploration of novel or uncommon ideas. Specifically, I suspect that many of the models of human decision-making put forth by our field overestimate the tendency for biases/heuristics to lead to errors or poor judgments. For example, few (if any) of my colleagues are aware that our stereotypes of other groups tend to be highly accurate, and that stereotype accuracy is one of the largest effects in all of social psychology. It appears that, in many cases, our biases tend to improve accuracy and decision-making quality. However, to utter a phrase like "stereotype accuracy" around most social psychologists is to invite suspicion about one's underlying motives. I'm here not because I want to talk about stereotype accuracy in particular, but because I'd like to be able to consider such an idea without the threat of damaging my reputation and career.
I also like thinking about AI and how an (accurate) understanding of human reasoning in information-starved contexts could help us design AI responsibly, but that's just whipped cream.
Since you are going to spend a lifetime working in this field, you... may have problems.
I'm unlikely to remain in academia after getting the degree. While I was coming to terms with the problems I'd face in academia, I was delighted to learn that there's a non-trivial demand in private industry for people who know how to quantify psychological constructs in a way that produces actionable information.
Oh, good.
You're a bit late, though: LessWrong is mostly a graveyard now. A lot of people from here moved over to Scott Alexander's blog, which is highly recommended.
I was wondering about that. In what sense is this place a graveyard?
It's too bad really. I love Scott's blog, but I've been looking for something with a format more like LW.
A quiet place from which most souls have departed, but which references a lot of accomplishments in days past.
I, too, think SSC's comment format is unfortunate, but it's up to Scott to do something about it. In fact, I think he treats it as a feature that helps him avoid comment overload.
LW used to get a lot more traffic, but don't let that stop you from contributing. How about writing up a longer post on your thesis about stereotype accuracy?
That specific thesis is mostly just an example. Much of what I would say would be paraphrasing the work of others (mainly Lee Jussim) and explaining its relevance to this community. I could do this if people thought it would be productive, but it's just one of many topics that I think are misunderstood on a large scale.
My more general interest is in the less-known fact that many of our hardwired biases and heuristics were designed by natural selection (e.g., negativity bias) to improve accuracy based on goal-relevant criteria. It also seems that the biases formed in response to the environment (e.g., much of the content comprising a stereotype) track reality to a surprising degree.

Imagine a marksman who practices at the same firing range every day, and this range generally has a crosswind of the same direction and intensity. The marksman can manually adjust for this by placing his reticle upwind of the target, but he could also adjust his scope's reticle so that he can aim for the bullseye and account for the wind at the same time. Once the adjustment is made to the scope, he may have a "biased" tool, but his shots are still centered on the bullseye (on average), and the only online calculations needed to account for the wind on a shot-by-shot basis are minute. What if the marksman moves to another range? Well, in time, he will see his shots wildly missing and make the proper adjustments.

This is probably not a novel analogy, but the surprising thing to me is that social psychology tends to frame any "reticle adjustment" as a bias against which we must fight, without testing its performance in the contexts under which the adjustment was made. It's not that biases and heuristics don't cause problems; it's that we have a much poorer understanding of when they cause problems than our field claims.
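The marksman analogy can be made concrete with a quick simulation (a minimal sketch of my own; the wind and noise numbers are made up for illustration): a scope calibrated to the home range's crosswind is "biased" as an instrument, yet it centers shots on the bullseye there, and only starts costing accuracy when the environment changes.

```python
import random

random.seed(0)

WIND_HOME = 3.0   # hypothetical constant crosswind offset at the home range
NOISE = 1.0       # shot-to-shot spread

def mean_miss(wind, reticle_offset, n=2000):
    """Mean horizontal miss distance when aiming at the bullseye."""
    misses = [wind - reticle_offset + random.gauss(0, NOISE) for _ in range(n)]
    return sum(misses) / n

# Unadjusted scope at the home range: every shot drifts downwind by ~3 units.
naive = mean_miss(WIND_HOME, reticle_offset=0.0)

# "Biased" scope, calibrated to the home range: shots center on the bullseye.
calibrated = mean_miss(WIND_HOME, reticle_offset=WIND_HOME)

# Same calibrated scope at a new, windless range: the bias now hurts.
new_range = mean_miss(0.0, reticle_offset=WIND_HOME)

print(f"naive at home:      {naive:+.2f}")
print(f"calibrated at home: {calibrated:+.2f}")
print(f"calibrated away:    {new_range:+.2f}")
```

The point of the sketch is that "bias" here is a property of the tool, not of the outcomes: performance has to be judged against the environment the adjustment was tuned to.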
This general idea applies to stereotypes, but also:
In these spheres people generally understand that heuristics optimize for something. Frequently people think they optimize for some ancestral environment that's quite unlike the world we are living in at the moment. I think that's a question where a well written post would be very useful.
I would think that many sociologists would say that many people who are racist and look down on Blacks are racist because they don't interact much with Blacks. If the adjustment was made while the person was at an all-White school, the interesting question isn't whether the adjustment performs well within the context of the all-White school but whether it also performs well for decisions made later outside of that homogeneous environment.
It was poor wording on my part when I wrote "the contexts under which the adjustment was made". The spirit of my point is much better captured by the word "applied" (vs. made). That is, it looks like a balanced reading of stereotype literature shows that people are quite good in their judgments of when to apply a stereotype. My point is therefore a bit more extreme than it might have appeared.
I agree with this and would add that such perceptions of superiority could be amplified by other members of the community reinforcing those judgments.
To get a little deeper into this topic, I should mention that our stereotypes are conditional and, therefore, much of the performance of a stereotype depends on applying it in the proper contexts. The studies that look at when people apply stereotypes tend to show that stereotypes are used as a last resort, under conditions in which almost no other information about the target is available. We're surprisingly good at knowing when a stereotype is applicable and seem to have little trouble spontaneously eschewing one when other, more diagnostic information is available.
My off-the-cuff hypothesis about students from an all-white school would be that they would show racial preferences when, say, only shown a picture of a black person. However, ask these students to provide judgments after a 5-minute conversation with a black person or after reviewing a resume (i.e., after giving them loads and loads of information) and race effects will become nearly or entirely undetectable. I don't know of any studies looking at this exactly and urge you to take my hypothesis with a grain of salt, but my larger point is this: You might be surprised.
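One way to formalize "stereotype as last resort" is a toy Bayesian model (my own sketch, not drawn from any of the studies above; every number is hypothetical): treat the stereotype as a prior about the group and individuating information as observations of the person, and the weight the judgment puts on the prior collapses as observations accumulate.

```python
def posterior_mean(prior_mean, prior_var, obs, obs_var):
    """Normal-normal conjugate update: a precision-weighted average of
    the prior (the stereotype) and the sample mean of the observations
    (individuating information). Returns (judgment, weight on prior)."""
    if not obs:
        return prior_mean, 1.0  # no individuating data: judgment = stereotype alone
    n = len(obs)
    sample_mean = sum(obs) / n
    prior_prec = 1.0 / prior_var
    data_prec = n / obs_var
    w_prior = prior_prec / (prior_prec + data_prec)
    return w_prior * prior_mean + (1 - w_prior) * sample_mean, w_prior

# Hypothetical numbers: the stereotype puts the group mean on some trait at 60,
# but this individual's observed behavior averages 40 across five encounters.
stereotype_mean, stereotype_var = 60.0, 25.0
observations = [40.0, 42.0, 38.0, 41.0, 39.0]

judgment, w = posterior_mean(stereotype_mean, stereotype_var, observations, 25.0)
print(f"weight on stereotype: {w:.2f}, judgment: {judgment:.1f}")
```

With no observations the judgment is the stereotype; after even a short interaction's worth of data, the stereotype's weight is small and the judgment tracks the individual, which is the shape of the "5-minute conversation" hypothesis above.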
From memory, without Googling the studies: I remember there are studies testing whether having a "Black name" on a resume changes response rates, and it does.
There are also those studies suggesting that blinding evaluators to piano players' gender is required to remove a gender bias.
Do you have another read on the literature?
So, I'm pretty sure we know that humans have a bias against anyone sufficiently different, and that this evolved before humanity as such. We certainly know that humans will try to rationalize their biases. We also have a great deal of evidence for past failures of scientific racism, which has set my prior for the next such theory very low.