Motivated reasoning/confirmation bias.
As Scott Alexander said in his review of Julia Galef's The Scout Mindset:
"Of the fifty-odd biases discovered by Kahneman, Tversky, and their successors, forty-nine are cute quirks, and one is destroying civilization. This last one is confirmation bias."
He goes on to argue that this bias is the source of polarization in society, distorting our beliefs and setting us at each other's throats: how could the other side believe such different things unless they're either really stupid or lying to conceal their selfishness? I think this is right, and I think it's at play even in the best rationalist communities, like LessWrong. It's particularly powerful in difficult domains like AGI prediction and alignment theory: when there's less real evidence, biases play a larger role.
I reached this conclusion independently while studying those and the remaining ~149 biases listed on Wikipedia at the time. You can get a little more rational by making your estimates carefully; that covers most of the biases. But the world is being destroyed by people believing what is comfortable to believe instead of what the evidence suggests. What's comfortable is usually also what they already believe, so confirmation bias overlaps heavily with motivated reasoning.
I studied the brain basis of cognitive biases for four years while funded by an IARPA program; I thought it was more worthwhile than the rest of what we were doing in cognitive neuroscience, so I kept up with it as part of my research for the remaining four years I was in the field.
I think motivated reasoning is the better conceptual term for understanding what's going on, but let's not quibble about terminology. I'll mostly call it motivated reasoning (MR), but you can take almost everything I'm going to say and apply it to confirmation bias, because mostly it's comfortable to keep believing what we already do. We chose to believe it partly because it was comfortable, and now it fits with all of our other beliefs, so changing it and re-evaluating the rest of our connected beliefs is uncomfortable.
"Wait," you're saying, "I'm a rationalist! I don't just believe what's comfortable!"
Yes, that's partly true. Believing in seeking truth when it's hard does provide some resistance to motivated reasoning. A hardcore rationalist actually enjoys changing their mind sometimes. But it doesn't confer immunity. We still have emotions, and it's still more comfortable to think that we're already right because we're good rationalists who've already discerned the truth.
There are two ways confirmation bias works. One is that it's easier to think of confirming evidence than disconfirming evidence. The associative links tend to be stronger. When you're thinking of a hypothesis you tend to believe, it's easy to think of evidence that supports it.
The stronger one is that there's a miniature Ugh field surrounding thinking about evidence and arguments that would disprove a belief you care about. It only takes a flicker of a thought to make the accurate prediction about where considering that evidence could lead: admitting you were wrong, and doing a bunch of work re-evaluating all of your related beliefs. Then there's a little unconscious yuck feeling when you try to pay attention to that evidence.
This is just a consequence of how the brain estimates the value of predicted outcomes and uses that to guide its decision-making, including its micro-decisions about what to attend to. I wrote a paper reviewing all of the neuroscience behind this, Neural mechanisms of human decision-making, but it's honestly kind of crappy, thanks to the pressure to write for a super-specialized audience and my reluctance at the time to speed up progress on brain-like AGI. So I recommend Steve Byrnes' valence sequence over that complex mess; it describes the psychological level perfectly, and it's grounded in those same brain mechanisms even though he's not directly talking about them. And he's a better writer than I am.
Trapped priors at least partly overlap with confirmation bias; or the phenomenon could even just be strong priors. The issue is that everyone has seen different evidence and arguments, and we've very likely spent more time attending to evidence that supports our original hypothesis, because of the subtle push of motivated reasoning. The sketch below illustrates how that attentional push can trap a prior.
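Here's a minimal sketch of that dynamic, assuming we model the attentional push as a fixed probability of simply skipping disconfirming evidence. The likelihoods, the skip probability, and the whole setup are my invention for illustration, not anything from the paper or the sequence above:

```python
# Toy model (invented for illustration): two Bayesian agents see the same
# evidence stream about a binary hypothesis H. The "motivated" agent skips
# each piece of disconfirming evidence with some probability, a crude
# stand-in for the flicker-of-yuck attentional filter described above.
import random

random.seed(0)

P_E_GIVEN_H = 0.7      # chance of confirming-looking evidence if H is true
P_E_GIVEN_NOT_H = 0.3  # ...and if H is false
SKIP_PROB = 0.6        # chance the motivated agent ignores disconfirming evidence

def update(prior, evidence_supports_h):
    """One Bayesian update of P(H) on a single binary observation."""
    if evidence_supports_h:
        like_h, like_not_h = P_E_GIVEN_H, P_E_GIVEN_NOT_H
    else:
        like_h, like_not_h = 1 - P_E_GIVEN_H, 1 - P_E_GIVEN_NOT_H
    return prior * like_h / (prior * like_h + (1 - prior) * like_not_h)

# Ground truth: H is false, so most of the evidence disconfirms it.
evidence = [random.random() < P_E_GIVEN_NOT_H for _ in range(200)]

honest = motivated = 0.9  # both agents start out believing H
for e in evidence:
    honest = update(honest, e)
    if e or random.random() > SKIP_PROB:  # motivated agent sometimes looks away
        motivated = update(motivated, e)

print(f"honest agent's P(H):    {honest:.3f}")     # driven toward 0 by the data
print(f"motivated agent's P(H): {motivated:.3f}")  # stays high, or even rises
```

With these particular numbers the motivated agent's expected log-odds shift per observation is actually positive even though H is false (0.3 × log(7/3) + 0.7 × 0.4 × log(3/7) > 0), so the same evidence stream that corrects the honest agent entrenches the motivated one. That's the trap.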
Motivated reasoning isn't even, strictly speaking, irrational. Suppose there's some belief that really doesn't make a difference in your daily life, like whether there's a sky guy with a cozy afterlife, or which of two similar parties should receive your vote (which will almost never actually change any outcomes). Here the two definitions of rationality diverge: believing the truth is now at odds with doing what works. It will obviously work better to believe what your friends and neighbors believe, so you won't be in arguments with them and they'll support you more when you need it.
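A back-of-the-envelope version of that divergence, with every number invented for illustration (a sketch of the logic, not a claim about real magnitudes):

```python
# Expected value of a belief whose only practical channel is a vote that
# almost never decides anything. All numbers are made up for illustration.
p_vote_decides = 1e-7            # chance your vote actually changes the outcome
value_if_right_side_wins = 1e4   # personal value of the better policy winning
social_friction_cost = 5e2       # arguments and lost goodwill if your true belief leaks

ev_believe_truth = p_vote_decides * value_if_right_side_wins - social_friction_cost
ev_believe_what_works = 0.0      # fit in with friends; the vote still changes ~nothing

print(f"EV of believing the truth:  {ev_believe_truth:+.3f}")   # about -500
print(f"EV of believing what works: {ev_believe_what_works:+.3f}")
```

Under these (and most plausible) numbers, the social term dwarfs the decision-relevance term by orders of magnitude, which is exactly why the comfortable belief "works".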
If we had infinite cognitive capacity, we could just believe the truth while claiming to believe whatever works. And we could keep track of all of the evidence instead of picking and choosing which to attend to.
But we don't. So motivated reasoning, confirmation bias, and the resulting tribalism (which happens when other emotions like irritation and outrage get involved in our selection of evidence and arguments) are powerful factors, even for a devoted rationalist.
The only remedy I know of is to cultivate enjoying being wrong. This involves giving up a good bit of one's self-concept as a highly intelligent individual. This gets easier if you remember that everyone else is also doing their thinking with a monkey brain that can barely chin itself on rationality.
Thanks for asking this question; it's a very smart question to ask. And I've been meaning to write about this on LW and haven't prioritized doing a proper job, so it's nice to have an excuse to do a brief writeup.
A bit of pushback, if I may: confirmation bias/motivated reasoning themselves only arise because of an inherent, deep-seated, [fairly likely] genetically conditioned, if not outright unconscious, sense that:
A. there is, in fact, a single source of ground truth, even (if not especially) outside of regular, axiomatic, bottom-up, abstract, formalized representation: be it math [+] or politics [-]
B. it is, in fact, both viable and desirable to affiliate yourself with any one or several groups whose culture/perspective/approach/outlook must fully represent that A: instead of an arbitrarily small part of it, blind-sided to everything beyond the portion it happens to be most familiar with
C. any single point/choice/decision/conclusion/action reached must, in itself, be inherently sensible enough to hold for an arbitrarily long period of time, without any revision or consideration of the opposite/orthogonal perspective; this one, in turn, might itself stem from the assumption that:
D. the world must be either [1] a static entity, fully representable with an arbitrarily large set of beliefs, attitudes, and considerations; or [2] a dynamic yet inherently mechanical one, following the exact same static laws/rules/patterns in each and every aspect of itself, be it physics or society; those laws can then be safely assumed to be never-changing and, once "understood", forever reinterpreted in the exact same light as the original interpretation of their time
E. whatever kind of entity it is, any particular snapshot of its linguistic and/or symbolic representation is, at every moment, fully capable of describing it, without coming up short in any single aspect: an assumption, if you will, that there are no "3x+1 Conjectures" that the limitations of our present cognitive/representational tools would leave us unable to figure out
Biology-wise, B might be strong enough to easily overpower the rest of them without any conscious awareness on our part. Yet even discounting that: motivated reasoning and the desire to adhere to whatever stance has already been reached themselves stem, fundamentally, from sheer human arrogance: regarding whatever was conceived/perceived/assimilated/concluded as fully sufficient both for what is in the present and for what is yet to come.
That arrogance, in turn, anchors our cognition, which promptly short-circuits into whatever Weltanschauung our general A-E'sque attitude of the day lines up with, in an attempt to save energy on rather costly and, given A through E, completely wasteful brain cycles. MR/CB is merely an effect of it all.
P.S. Two upticks from me, regardless. The links were much appreciated. Would gladly hear any of your additional thoughts on the matter in a fully-sized post/article/whatever you call it here.