I am reading a post called The Martial Art of Rationality by Eliezer Yudkowsky, in which he makes the following claim:
"If you’re a fast learner, you might learn faster—but the art of rationality isn’t about that; it’s about training brain machinery we all have in common. And where there are systematic errors human brains tend to make—like an insensitivity to scope—rationality is about fixing those mistakes, or finding work-arounds."
This post follows one in which he explained the concept of scope insensitivity by discussing a study which found that people asked to contribute towards dealing with the consequences of a fuel spill on wildlife habitat did not offer more money to save more birds. The contributors were deemed to be insensitive to the scope of the suffering.
The point I want to discuss is whether it is entirely fair to describe scope insensitivity, as defined in this way, as a "systematic human brain error".
It seems to me that this borders on saying that people who made a different choice from yours are not just wrong, but suffering from something: their brains are not working properly and they need to be taught how to make better choices, where "better" obviously means more in line with the choice you would make.
Scope insensitivity would only be irrational if saving birds were the only criterion in play: to save more birds, give more money. But this is almost never the case. People are more complex than that; they have to weigh many criteria, and each person may consider different criteria and weight them differently. To label those differences a "systematic human brain error" seems a very one-dimensional response.
I think we need to bear in mind that the original study did not allow for the possibility that people did not pay more because they could not afford more, or because they would prefer to allocate their charitable spending to alleviating human suffering rather than animal suffering. In fact, the study explicitly said that it was unable to account for the lack of sensitivity to scope. It seems wrong, in fact plainly anti-intellectual, for Yudkowsky to claim that their scope insensitivity is a "systematic human brain error".
Please discuss.
One way to look at this is to pick questions where you're really sure that the two versions of the question should have different answers, for example questions where the answer is a probability rather than a subjective value. One study some years ago asked some people for the probability that Assad's regime would fall in the next 3 months, and asked others for the probability that it would fall in the next 6 months. As described in the book Superforecasting, non-superforecasters gave essentially identical answers to these two questions (40% and 41%, respectively), so it seems they were making some sort of error by not taking the length of the time window into account. (Superforecasters gave different answers, 15% and 24%, which did take the duration into account pretty well.)
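To see roughly how much the two answers "should" differ, here is a minimal sketch assuming, purely for illustration, a constant hazard rate (the regime is equally likely to fall in any given 3-month window). Under that assumption, a 3-month probability pins down the 6-month one; the numbers below use the figures quoted above, and the model itself is my assumption, not anything from the study.

```python
# Toy model: if the chance of collapse is the same in every 3-month
# window (constant hazard rate), the probability of collapse within
# k such windows is 1 - (1 - p)**k.

def extend_probability(p_short: float, ratio: float) -> float:
    """Probability over a window `ratio` times longer than the original,
    assuming a constant hazard rate. Illustrative assumption only."""
    return 1 - (1 - p_short) ** ratio

# Superforecasters' 3-month answer and the 6-month answer it implies:
print(extend_probability(0.15, 2))   # 0.2775 -> ~28% (they actually said 24%)

# Non-superforecasters' 3-month answer and the 6-month answer it implies:
print(extend_probability(0.40, 2))   # 0.64   -> 64% (they actually said 41%)
```

On this toy model, the superforecasters' 15% for 3 months would imply about 28% for 6 months, so their 24% is in the right ballpark (a little lower, which would be consistent with a hazard rate that declines over time), whereas the near-identical 40% and 41% are far from any internally consistent pair of answers.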