After the terrorist attack on Charlie Hebdo, conspiracy theories quickly arose about who was behind it.
People who are critical of the West easily swallow such theories, while pro-West people just as easily find them ridiculous.
I guess we can agree that the most rational response would be to enter a state of aporia until sufficient evidence is at hand.
Yet very few people do so. People are guided by their previous understanding of the world when judging new information. That sounds like a fine Bayesian approach for getting through life, but for real scientific knowledge we can't rely on our priors alone (even if those priors were themselves formed by Bayesian reasoning). Real science works by investigating evidence.
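To make that concrete, here is a minimal sketch of Bayes' rule with made-up numbers (the priors and likelihoods are purely illustrative, not drawn from any real data): when the available evidence is weak and ambiguous, each observer's posterior ends up roughly where their prior already was.

```python
def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) via Bayes' rule, given a prior and two likelihoods."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

# Hypothetical observers with opposite priors about the same hypothesis H
# ("the official account is wrong"), judging the same ambiguous evidence E,
# which is only slightly more likely under H than under not-H.
skeptic_prior, believer_prior = 0.80, 0.05

print(posterior(skeptic_prior, 0.55, 0.45))   # ~0.83: barely moved from 0.80
print(posterior(believer_prior, 0.55, 0.45))  # ~0.06: barely moved from 0.05
```

With evidence this uninformative, both observers can update "correctly" and still walk away confirmed in what they already believed, which is why priors do so much of the work in cases like this.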
So how do we characterise the human tendency to jump to conclusions that have simply been supplied by one's sense of normativity? Is there a previously described bias that covers this case?
I disagree. Emotionally charged events tend to be both important AND the ones in which rationality is most trampled. All who aspire to be less wrong in a meaningful way will benefit from sorting out the things that bias reasoning about emotionally charged current events.
It is important to be rational in such cases. But 'situations where rationality is important' isn't the same set as 'situations that are good didactic tools for rationality'. I mean, this is basically the central point of Politics is the Mind-Killer: "What on Earth was the point of choosing this as an example? To rouse the political emotions of the readers and distract them from the main question?"
Charlie Hebdo wasn't brought up as the topic; it was brought up as an example of a system that could have been demonstrated with a lot less baggage.