After the terrorist attacks at Charlie Hebdo, conspiracy theories quickly arose about who was behind them.
People who are critical of the West easily swallow such theories, while pro-West people just as easily find them ridiculous.
I guess we can agree that the most rational response would be to enter a state of aporia until sufficient evidence is at hand.
Yet very few people do so. People are guided by their prior understanding of the world when judging new information. That sounds like a fine Bayesian approach to getting through life, but for genuine scientific knowledge we can't rely on *prior* reasoning alone (even though scientific inference may itself be Bayesian). Real science works by investigating evidence.
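As a minimal sketch of the point (the `posterior` helper and all the numbers below are hypothetical, invented purely for illustration), here is how Bayes' rule lets the same ambiguous piece of evidence yield opposite verdicts depending on the prior each person brings:

```python
# A toy illustration of Bayesian updating: strong priors dominate the
# conclusion when the evidence is weak. All numbers are hypothetical.

def posterior(prior, likelihood_if_true, likelihood_if_false):
    """P(H | E) via Bayes' rule for a binary hypothesis H."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Ambiguous evidence: barely more likely under the conspiracy hypothesis
# than under the official account.
lik_true, lik_false = 0.6, 0.5

for label, prior in [("West-critical reader", 0.30), ("pro-West reader", 0.01)]:
    p = posterior(prior, lik_true, lik_false)
    print(f"{label}: prior={prior:.2f} -> posterior={p:.3f}")
```

Because the evidence barely discriminates between the hypotheses, each posterior stays close to its prior, which is exactly the pattern described above: people end up where their previous understanding of the world already pointed.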
So, how do we characterise the human tendency to jump to conclusions that are simply supplied by one's sense of normativity? Is there a previously described bias that covers this case?
Not really; consider how much effort it is worth investing in the question of whether Barack Obama is actually secretly transgender, in different scenarios:
If you think that even in the first case you should investigate, then you're going to spend your life chasing every hypothesis that catches your fancy, regardless of how likely or useful it is. If you believe that some cases deserve a bit of investigation but others don't, you're going to need a few extra rules of thumb, even before looking at the evidence.
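One way such a rule of thumb might look, offered only as a sketch (the scenarios, stakes, and costs below are invented, not taken from anywhere), is a crude expected-value triage applied before any evidence is examined:

```python
# A toy pre-evidence triage rule: investigate a hypothesis only when the
# prior chance of it being true, times what hangs on the answer, exceeds
# the effort an investigation would cost. All numbers are hypothetical.

def worth_investigating(prior: float, stakes: float, cost: float) -> bool:
    """Return True if the expected payoff of investigating beats its cost."""
    return prior * stakes > cost

# Same hypothesis, different scenarios (stakes and cost in arbitrary units):
scenarios = [
    ("idle curiosity",             0.001,        10.0, 100.0),
    ("journalist chasing a scoop", 0.001, 1_000_000.0, 100.0),
]
for name, prior, stakes, cost in scenarios:
    verdict = "investigate" if worth_investigating(prior, stakes, cost) else "ignore"
    print(f"{name}: {verdict}")
```

The point of the sketch is only that the verdict flips with the stakes while the prior stays fixed: the triage happens entirely upstream of any evidence.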
I definitely see your point.
Couldn't the problem be solved by dividing my convictions into two groups:

1. convictions that have practical consequences for my life, and
2. convictions that don't?

Then I could go into aporia for all those in group 2, while allowing more gut feeling for those in group 1. The Charlie Hebdo question doesn't affect my quality of life, so in that case I could afford the epistemological "luxury" of aporia.