4 comments

Many argue that there are more slaves in the world today than in the 19th century. Yet because one’s political rivals cannot be delegitimized by being on the wrong side of slavery, few care to be active abolitionists anymore, compared to being, say, speech police.

This reminds me of how the topic of female genital mutilation was treated by people "on the right side of history" around me.

After learning that it exists, not knowing much about the details yet...

"This is another example of a horrible thing men do to women! We must protest against this, and demand severe punishment for all men involved. Everyone, join the revolution!"

...after learning that it is actually something that women do to little girls...

"Uhm, it is a horrible thing that needs to be stopped, but punishment is not the way to do it. We need more discussion and education about the topic. We should not blame the women doing it, because they are merely victims of their culture. We need to change the culture!"

...after people "on the wrong side of history" agreed that, indeed, cultures that cut little girls' genitals are horrible...

"Hey, white people are not allowed to judge other cultures negatively! Maybe they have perfectly valid cultural reasons for how they behave. Let's accuse everyone who complains about female genital mutilation of intolerance and racism! And among ourselves, let's not mention this topic anymore, lest we provide a useful argument for our enemies."

So in the end, no one actually cared about the mutilated girls, once they turned out to be useless as an argument against local political opponents.

The more biased away from neutral truth, the better the communication functions to affirm coalitional identity, generating polarization in excess of actual policy disagreements. Communications of practical and functional truths are generally useless as differential signals, because any honest person might say them regardless of coalitional loyalty.

I.e., people use strange beliefs as tribal shibboleths.

It seems like the key problem described here is that coalitions of rational people, when they form around scientific propositions, cause the group to become non-scientific out of desire to support the coalition. The example that springs to my mind is climate change, where there is social pressure for scientific-minded people (or even those who just approve of science) to back the rather specific policy of reducing greenhouse gas emissions rather than to probe other aspects of the problem or potential solutions and adaptations.

I wonder if we might solve problems like this by substituting some rational principle that is not subject to re-evaluation. Ultimate goals (CEV, or the like) would fit the bill in principle, but in practice, even if enough people could agree on them, I suspect they are too vague and remote to form a coalition around. The EA movement may be closer to succeeding, where the key idea is not an ultimate goal but rather the general technique of quantitatively evaluating opportunities to achieve altruistic objectives in general. Still, it's difficult to extend a coalition like that to a broader population, since most people can't easily identify with it.

Perhaps the middle ground is to start with a goal that is controversial enough to distinguish coalition members from outsiders, but too vague to form a strong coalition around (say, aggregative consequentialism or something). Then find a clear practical implication of that goal which has the necessary emotional impact. As long as the secondary goal follows easily enough from the first that it won't need to be re-evaluated later on, the coalition can hold together and make progress toward the original goal without much danger of becoming irrational. I can't think of a good example for the sub-goal, though.