Here are the relevant LW keywords:
Motivated Stopping = "I have evidence for X (which I like), so I make my conclusion now and refuse to look at further evidence"
Motivated Continuation = "I have evidence for X (which I dislike), so I avoid making conclusions and keep looking for more evidence"
...and in situations where it seems obvious that both the existing and the future evidence will mostly point towards X (which I dislike), I can go more meta and use...
Fallacy of Gray = "there will never be perfect evidence for X (nor perfect evidence against X), so my conclusion is that one cannot rationally make any conclusion about this topic"
You've previously brought up here also the idea that UFOs might be aliens.
May I suggest that you are just giving way too high a probability to low-probability, fringe hypotheses?
But by saying that, you are making exactly the error in question: taking low-probability hypotheses and assigning them so much weight that you say you don't have sufficient knowledge when in fact you do. The correct conclusion is that the claimed explanations are extremely unlikely.
It is worth noting, more generally, that while hidden government conspiracies do occasionally come to light, they are almost never the conspiracies that anyone in public is claiming are real.
I think my chain of reasoning falls off at the idea that we can assign reliable probabilities to various hypotheses prior to our own thorough investigation of the available scientific material.
Yep! We do it all the time! How likely do you think it is that the city of New York has just been destroyed by a nuclear blast? That your parents are actually undercover agents sent by Thailand? That there is a scorpion in the sandwich you're about to eat? Most people would consider those extremely unlikely without a second thought, and would not feel any need for a "thorough investigation of the available scientific material". And that's a perfectly sensible thing to do!
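The intuition above can be sketched with a toy Bayes'-rule calculation (the numbers are purely illustrative, not from the discussion): a hypothesis that starts at one-in-a-million odds stays negligible even after fairly strong evidence in its favour, which is why dismissing it without a "thorough investigation" is sensible.

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# "There is a scorpion in my sandwich": prior ~ one in a million.
# Even evidence 100x likelier under the hypothesis (a suspicious
# rustling, say) leaves the posterior around 0.01%.
p = posterior(1e-6, 0.99, 0.0099)
print(f"{p:.6f}")
```

The point is not the exact numbers but the shape of the result: an extreme prior needs extraordinary evidence, not merely suggestive evidence, before the hypothesis deserves serious attention.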
I guess we can agree that the most rational response would be to enter a state of aporia until sufficient evidence is at hand.
Not really; consider how much effort is worth investing in the question of whether Barack Obama is actually secretly transgender, in different scenarios:
If you think that even in the first case you should investigate, then you're going to spend your life running over every hypothesis that catches your fancy, regardless of how likely or useful it is. If you believe that in some cases it deserves a bit of investigation, but not in others, you're going to need a few extra rules of thumb, even before looking at the evidence.
I'm not sure where to start with this, but I don't think finding a conspiracy theory ridiculous is just a matter of pro-West bias, nor that the correct position is "maybe it was a conspiracy, maybe it was just what it looks like". You'd have to have some pretty iffy priors for that to work.
I can't add much to the other comments.
In this case, I actually hadn't heard about the false flag conspiracy theories. I was thinking about the less extreme tendency of the foreign press to see the occurrence of blasphemy as being secretly orchestrated by Western governments, who then deny involvement and claim that they are just supporting the spontaneous actions of their citizens. My theory is that this will seem more plausible to people living in areas with more state control over the press.
For example, there was an American preacher who burned some Korans. The view from within the United States was that this man was a low-status Southern Evangelical stereotype, a headache to the government, and a visible annoyance to the high-status people who really do love insulting Islam but were somehow stuck defending this guy instead of some more charismatic blasphemer. In the Islamic press, this guy was practically a CIA agent/preacher of the Official American Church acting on Presidential orders, probably working with Israel to further some foreign-policy aim. Because if he really was a headache to the government, why wasn't he already in jail?
I don't know about killing the agents, but there have been known examples of that kind of false flag terrorist attack. Probably the most famous is the Lavon Affair, but there's also:
There have also been a number of false-flag incidents in which a government attacked its own people with terrorism, but, whether by luck or by intention, no-one died:
There are also various well-known incidents that look like false flag terrorist attacks by a government on its own people, but which are disputed:
It is also possible that there were other such incidents but where the false-fl...
People who are critical of the West easily swallow such theories, while pro-West people just as easily find them ridiculous.
In my opinion, people who understand the strengths and weaknesses of Western information flow recognize that the reporting blaming Islamic fundamentalists is correct, and not some conspiracy to blame Muslims, while people who accept various non-Western information sources, such as pronouncements by various mullahs, and do not really have a detailed understanding of how the West lies and how it doesn't, get th...
Why does this post get so many downvotes? The topic isn't really about Charlie Hebdo. I could have used any other example in which emotionally strong counter-theories have arisen.
My guess is that it is because
I guess we can agree that the most rational response would be to enter a state of aporia until sufficient evidence is at hand.
and
It sounds like a fine Bayesian approach for getting through life, but for real scientific knowledge, we can't rely on prior reasoning (even though this might involve Bayesian reasoning). Real science works by investigating evidence.
look like a significant misunderstanding of what the Bayesian approach is.
It is important to be rational in such cases. But 'situations where rationality is important' isn't the same set as 'situations that are good didactic tools for rationality'. I mean, this is basically the central point of Politics is the Mind Killer: "What on Earth was the point of choosing this as an example? To rouse the political emotions of the readers and distract them from the main question?"
Charlie Hebdo wasn't brought up as the topic, it was brought up as an example of a system that could have been demonstrated with a lot less baggage.
This is a valid question. But because of the politics involved it should have been posted to e.g. the Open Thread.
The Western powers claim this was an attack on their "free speech", but if so, it was only a backup and catalyst to their own long-term goal of eliminating that value in the first place. Even now, people who question this narrative are being silenced through every available legal method, and the scope of such methods is only expanding. European governments want us to live in doublethink - concurrently believing that we're defending ourselves from an enemy who hates our freedom of speech (as opposed to what we have to say) and supporting the gover...
After the terrorist attacks at Charlie Hebdo, conspiracy theories quickly arose about who was behind the attacks.
People who are critical of the West easily swallow such theories, while pro-West people just as easily find them ridiculous.
I guess we can agree that the most rational response would be to enter a state of aporia until sufficient evidence is at hand.
Yet very few people do so. People are guided by their previous understanding of the world when judging new information. That sounds like a fine Bayesian approach for getting through life, but for real scientific knowledge, we can't rely on *prior* reasoning (even though this might involve Bayesian reasoning). Real science works by investigating evidence.
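One way to square "prior reasoning" with "real science investigates evidence" is that priors dominate only until the evidence piles up. A toy sketch (the numbers are illustrative only) of two observers with very different priors converging once they see the same evidence:

```python
def update(p, likelihood_ratio):
    """One Bayesian update, done in odds form."""
    odds = p / (1 - p) * likelihood_ratio
    return odds / (1 + odds)

skeptic, believer = 0.01, 0.90
for _ in range(5):                   # five independent observations,
    skeptic = update(skeptic, 4.0)   # each 4x likelier if H is true
    believer = update(believer, 4.0)
print(round(skeptic, 3), round(believer, 3))
```

After only a handful of observations both posteriors sit above 0.9; the starting prior mostly decides how quickly you get there, not where you end up. In that sense investigating evidence is not opposed to Bayesian reasoning - it is the part of it that does the work.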
So, how do we characterise the human tendency to jump to conclusions that have simply been supplied by their sense of normativity? Is there a previously described bias that covers this case?