Is there a name for this bias?
There are certain harmful behaviors people are tricked into engaging in because the benefits of the behavior are concentrated while the harms are diffuse or insidious. When you benefit, P(benefit is due to this behavior) ≈ 1, but when you're harmed, P(harm is due to this behavior) << 1; in the insidious form, P(you consciously notice the harm) << 1.
An example is when I install handy little add-ons and programs that, in aggregate, cause my computer to slow down significantly. Every time I use one of these programs, I consciously appreciate how useful it is. But when it slows down my computer, I can't easily pinpoint it as the culprit, since there are so many other potential causes. I might not even consciously note the slowdown, since it's so gradual ("frog in hot water" effect).
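The asymmetry can be made concrete with a toy calculation (all numbers below are hypothetical, purely for illustration): even when a behavior's true value is negative, it can feel net positive if benefits are always attributed to it but harms almost never are.

```python
# Toy model of the attribution asymmetry (all numbers hypothetical).
uses_per_day = 10
benefit_per_use = 1.0       # each use feels clearly useful
harm_per_use = 1.5          # slowdown cost, spread across everything else
p_attribute_benefit = 1.0   # benefit is obviously due to the add-on
p_attribute_harm = 0.05     # harm is rarely traced back to it

# What the behavior actually costs or gains you per day.
true_value = uses_per_day * (benefit_per_use - harm_per_use)

# What it *feels* like it costs or gains you, given attribution rates.
perceived_value = uses_per_day * (
    benefit_per_use * p_attribute_benefit
    - harm_per_use * p_attribute_harm
)

print(f"true daily value:      {true_value:+.2f}")       # negative
print(f"perceived daily value: {perceived_value:+.2f}")  # positive
```

With these numbers the behavior costs 5 units a day but feels like it gains about 9, so the add-on keeps feeling worth it.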
Link: Compare your moral values to the general population
Jonathan Haidt, a professor at UVA, runs an online lab with quizzes that compare your moral values to the rest of the population's. I have found the test results useful for avoiding the typical mind fallacy. When someone disagrees with me on a belief or opinion I feel certain about, it's often difficult to tease apart how much of the disagreement stems from them not "getting it" and how much stems from them having a fundamentally different value system. One of the tests alerted me that I am an outlier in certain aspects of how I judge morality (green = me; blue = liberals; red = conservatives):

Another benefit of these quizzes is that they can point out potential blind spots. For example, one quiz asks for opinions about punishment for crimes. If I discover I'm an outlier w.r.t. the population, I should reconsider whether my opinions are based on solid evidence (or did I see one study that found tit-for-tat punishment effective in a certain context, and take that as gospel?).
Extra reading: Haidt wrote a WSJ article last month applying the lessons of these moral quizzes to better understand the Tea Party.