Why you should be very careful about trying to openly seek truth in any political discussion
1. Rationality considered harmful for Scott Aaronson in the great gender debate
In 2015, complexity theorist and rationalist Scott Aaronson was foolhardy enough to step into the gender-politics war on his blog, with a comment stating that the extreme feminism he had internalized made him hate himself and even seek ways to chemically castrate himself. The feminist blogosphere got hold of this and crucified him for it, and he has written several follow-up posts about it since. Recently I saw this comment by him on his blog:
2. Rationality considered harmful for Sam Harris in the islamophobia war
I recently listened to a very angry, exasperated two-hour podcast by the new atheist and political commentator Sam Harris about how badly he has been straw-manned, misrepresented, and trash-talked by his intellectual rivals (whom he collectively refers to as the "regressive left"). Sam Harris likes to tackle hard questions: when torture is justified, which religions are more or less harmful than others, the defence of freedom of speech, and so on. Several times, Harris goes to the meta level and sees clearly what is happening:
3. Rationality considered harmful when talking to your left-wing friends about genetic modification
In the SlateStarCodex comments, I posted a complaint that many left-wing people were responding very personally (and negatively) to my political views.
One long-term friend openly and pointedly questioned our friendship over the subject of eugenics and genetic engineering. In response to a rational argument that some modifications of the human germ line (for example, genetic engineering to permanently cure a heritable disease) may in fact be a good thing, this friend said that "(s)he was beginning to wonder whether we should still be friends".
A large comment thread ensued, but the best comment I got was this one:
One of the useful things I have found when confused by something my brain does is to ask what it is *for*. For example: I get angry, the anger is counterproductive, but recognizing that doesn’t make it go away. What is anger *for*? Maybe it is to cause me to plausibly signal violence by making my body ready for violence or some such.
Similarly, when I ask myself what moral/political discourse among friends is *for* I get back something like “signal what sort of ally you would be/broadcast what sort of people you want to ally with.” This makes disagreements more sensible. They are trying to signal things about distribution of resources, I am trying to signal things about truth value, others are trying to signal things about what the tribe should hold sacred etc. Feeling strong emotions is just a way of signaling strong precommitments to these positions (i.e. I will follow the morality I am signaling now because I will be wracked by guilt if I do not. I am a reliable/predictable ally.) They aren’t mad at your positions. They are mad that you are signaling that you would defect when push came to shove about things they think are important.
Let me repeat that last one: moral/political discourse among friends is for "signalling what sort of ally you would be/broadcasting what sort of people you want to ally with". Moral/political discourse probably activates specially evolved brainware in human beings; that brainware has a purpose, and it isn't truth-seeking. Politics is not about policy!
4. Takeaways
This post is already getting too long, so I deleted the section on lessons to be learned; if there is interest, I'll do a follow-up. Let me know what you think in the comments!