After the terrorist attack on Charlie Hebdo, conspiracy theories quickly arose about who was really behind it.
People who are critical of the West easily swallow such theories, while pro-West people just as easily dismiss them as ridiculous.
I guess we can agree that the most rational response would be to enter a state of aporia until sufficient evidence is at hand.
Yet very few people do so. People are guided by their prior understanding of the world when judging new information. That sounds like a fine Bayesian approach for getting through life, but for real scientific knowledge we can't rely on *prior* reasoning alone (even if that reasoning is itself Bayesian). Real science works by investigating evidence.
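To make that Bayesian point concrete, here is a toy sketch of Bayes' theorem. All numbers are invented for illustration: two people see the same evidence but start from different priors about a conspiracy hypothesis, and end up with very different posteriors.

```python
def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E),
    where P(E) = P(E|H)P(H) + P(E|~H)P(~H)."""
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1 - prior))
    return p_evidence_given_h * prior / p_evidence

# Same evidence (made-up likelihoods), different starting priors:
sceptic_of_west = posterior(prior=0.30, p_evidence_given_h=0.5,
                            p_evidence_given_not_h=0.1)
pro_west = posterior(prior=0.01, p_evidence_given_h=0.5,
                     p_evidence_given_not_h=0.1)

print(round(sceptic_of_west, 2))  # roughly 0.68
print(round(pro_west, 2))         # roughly 0.05
```

The point of the sketch is only that identical evidence leaves the two observers far apart: the prior, not the evidence, does most of the work.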
So, how do we characterise the human tendency to jump to conclusions simply supplied by one's sense of normativity? Is there a previously described bias that covers this case?
But by saying that, you are making exactly the sort of error in question: taking low-probability hypotheses and assigning them so much weight that you say you don't have sufficient knowledge when in fact you do. The correct conclusion is that the claimed explanations are extremely unlikely.
It is worth noting, more generally, that while hidden government conspiracies do occasionally come to light, they are almost never the ones anyone in public is claiming are real.
I think my chain of reasoning breaks down at the idea that we can assign reliable probabilities to various hypotheses prior to our own thorough investigation of the available scientific material.
In the case of UFOs, wouldn't we need scientific reports explaining all unexplained observations of aerial phenomena that have occurred in history before we could reasonably claim that the probability is very low?