A prophet is without dishonor in his hometown
I'm reading the book "The Year of Living Biblically," by A.J. Jacobs. He tried to follow all of the commandments in the Bible (Old and New Testaments) for one year. He quickly found that
- a lot of the rules in the Bible are impossible, illegal, or embarrassing to follow nowadays - like wearing tassels, tying your money to yourself, stoning adulterers, not eating fruit from a tree less than 5 years old, and not touching anything that a menstruating woman has touched; and
- this didn't seem to bother more than a handful of the one-third to one-half of Americans who claim the Bible is the word of God.
You may have noticed that people who convert to religion after the age of 20 or so are generally more zealous than people who grew up with the same religion. People who grow up with a religion learn how to cope with its more inconvenient parts by partitioning them off, rationalizing them away, or forgetting about them. Religious communities actually protect their members from religion in one sense - they develop an unspoken consensus on which parts of their religion members can legitimately ignore. New converts sometimes try to actually do what their religion tells them to do.
I remember many times growing up when missionaries described the crazy things their new converts in remote areas did on reading the Bible for the first time - they refused to be taught by female missionaries; they insisted on following Old Testament commandments; they decided that everyone in the village had to confess all of their sins against everyone else in the village; they prayed to God and assumed He would do what they asked; they believed the Christian God would cure their diseases. We would always laugh a little at the naivete of these new converts; I could barely hear the tiny voice in my head saying, "But they're just believing that the Bible means what it says..."
How do we explain the blindness of people to a religion they grew up with?
Cultural immunity
Europe has lived with Christianity for nearly 2000 years. European culture has co-evolved with Christianity. Culturally, memetically, it's developed a tolerance for Christianity. These new Christian converts, in Uganda, Papua New Guinea, and other remote parts of the world, were being exposed to Christian memes for the first time, and had no immunity to them.
The history of religions sometimes resembles the history of viruses. Judaism and Islam were both highly virulent when they first broke out, driving the first generations of their people to conquer (Islam) or just slaughter (Judaism) everyone around them for the sin of not being them. They both grew more sedate over time. (Christianity was pacifist at the start, as it arose in a conquered people. When the Romans adopted it, it didn't make them any more militaristic than they already were.)
The mechanism isn't the same as for diseases, which can't be too virulent or they kill their hosts. Religions don't generally kill their hosts. I suspect that, over time, individual selection favors those who are less zealous. The point is that a culture develops antibodies for the particular religions it co-exists with - attitudes and practices that make those religions less virulent.
I have a theory that "radical Islam" is not native Islam, but Westernized Islam. Over half of the 75 Muslim terrorists studied by Bergen & Pandey (2005) in the New York Times had gone to a Western college. (Only 9% had attended madrassas.) A very small percentage of all Muslims have received a Western college education. When someone lives all their life in a Muslim country, they're not likely to be hit with the urge to travel abroad and blow something up. But when someone from an Islamic nation goes to Europe for college, and comes back with Enlightenment ideas about reason and seeking logical closure over beliefs, and applies them to the Koran, then you have troubles. They have lost their cultural immunity.
I'm also reminded of a talk I attended by one of the Dalai Lama's assistants. This was not slick, Westernized Buddhism; this was saffron-robed fresh-off-the-plane-from-Tibet Buddhism. He spoke about his beliefs, and then took questions. People began asking him about some of the implications of his belief that life, love, feelings, and the universe as a whole are inherently bad and undesirable. He had great difficulty comprehending the questions - not because of his English, I think; but because the notion of taking a belief expressed in one context, and applying it in another, seemed completely new to him. To him, knowledge came in units; each unit of knowledge was a story with a conclusion and a specific application. (No wonder they think understanding Buddhism takes decades.) He seemed not to have the idea that these units could interact; that you could take an idea from one setting, and explore its implications in completely different settings. This may have been an extreme form of cultural immunity.
We think of Buddhism as a peaceful, caring religion. A religion that teaches that striving and status are useless is probably going to be more peaceful than one that teaches that the whole world must be brought under its dominion; and religions that lack the power of the state (e.g., the early Christians) are usually gentler than those with the power of life and death. But much of Buddhism's kind public face may be due to cultural norms that prevent Buddhists from connecting all of their dots. Today, we worry about Islamic terrorists. A hundred years from now, we'll worry about Buddhist physicists.
Reason as immune suppression
The reason I bring this up is that intelligent people sometimes do things more stupid than stupid people are capable of. There are a variety of reasons for this; but one has to do with the fact that all cultures have dangerous memes circulating in them, and cultural antibodies to those memes. The trouble is that these antibodies are not logical. On the contrary, these antibodies are often highly illogical. They are the blind spots that let us live with a dangerous meme without being impelled to action by it. The dangerous effects of these memes are most obvious with religion; but I think there is an element of this in many social norms. We have a powerful cultural norm in America that says that all people are equal (whatever that means); originally, this powerful and ambiguous belief was counterbalanced by a set of blind spots so large that it did not even impel us to free slaves or let women or non-property-owners vote. We have another cultural norm that says that hard work reliably and exclusively leads to success; and another set of blind spots that prevent that belief from turning us all into Objectivists.
A little reason can be a dangerous thing. The landscape of rationality is not smooth; there is no guarantee that removing one false belief will improve your reasoning instead of degrading it. Sometimes, reason lets us see the dangerous aspects of our memes, but not the blind spots that protect us from them. Sometimes, it lets us see the blind spots, but not the dangerous memes. Either way, reason can leave an individual unbalanced, no longer adapted to their memetic environment, and free to follow previously dormant memes through to their logical conclusions. To paraphrase Steven Weinberg: for a smart person to do something truly stupid, they need a theory.
The vaccines?
How can you tell when you have removed a set of blind spots from your reasoning without also removing the dangerous memes they counterbalanced? One heuristic to counter this loss of immunity might be to be very careful when you find yourself deviating from everyone around you. But most people already do this too much.
Another heuristic is to listen to your feelings. If your conclusions seem repulsive to you, you may have stripped yourself of cognitive immunity to something dangerous.
Perhaps the most helpful thing isn't to try to prevent memetic immune disorder, but to know that it could happen to you.
So there is a hidden component in levels of belief: alongside the stated level of certainty - the bland "truthiness" of a statement - there is also a procedural perspective, in which the statement applies with different force in different contexts. This more nuanced level of belief is harder to see and harder to influence. Take "belief in belief" as a special case: on one hand there is certainty; on the other, the belief refuses to speak of the real world.
Compartmentalization seems to be the default method for managing "quoted" beliefs: instead of keeping track of what evidence supports what, you just start believing everything directly, but only in narrow contexts. If the facts check out, collections of new pieces of knowledge pass coherence checks and gain influence. Insanity remains in quarantine indefinitely, and even if it calls the shots within its own crib, it is a mistake to interpret it as accepted by the person as a whole. When an aspect of most people is insane, it is so by design, as part of the never-ending process of reevaluation.
This mechanism is probably also responsible for people not even caring to distinguish positive assertions from negative ones. The natural mode is simply to amass impressions of facts, held together by their adherence to each other rather than kept in their original forms, with levels of certainty merely reporting how well a new statement fits in.