(Cross-posted from the EA Forum)
Deferring is when you adopt someone else's view on a question over your own independent view (or instead of taking the time to form an independent view). You can defer on questions of fact or questions of what to do. You might defer because you think they know better (epistemic deferring), or because there is a formal or social expectation that you should go along with their view (deferring to authority).
Both types of deferring are important — epistemic deferring lets people borrow the fruits of knowledge; deferring to authority enables strong coordination. But they are two-edged. Deferring can mean that you get less chance to test out your own views, so developing mastery is slower. Deferring to the wrong people can be straightforwardly bad. And when someone defers without everyone understanding that's what's happening, it can cause issues. Similarly, unacknowledged expectations of deferral from others can cause problems. We should therefore learn when and how to defer, when not to, and how to be explicit about what we're doing.
Why deferring is useful
Epistemic deferring
Epistemic deferring is giving more weight to someone else's view than your own because you think they're in a position to know better. The opposite of epistemic deferring is holding one's own view.
Examples:
The case for epistemic deferring is simple: for most questions, we can identify someone (or some institution or group of people) whose judgement on the question would — if they were possessed of the facts we knew — be better than our own. So to the extent that (A) we can identify such a person, and (B) we can elicit their view with them possessed of the relevant facts, deferring will be correct.
Partial deferring
The degree to which (A) and (B) hold will vary with circumstance. It will frequently be the case that they partially hold; in this case it may be appropriate to partially defer, e.g.
Deferring to authority
Deferring to authority is adopting someone else's view because of a social contract to do so. Often deferring to authority happens on questions of what should be done — e.g. "I'm going to put this fire alarm up because [my boss / my client / the law] tells me to", or “I’m helping my friend cook dinner, so I’ll cut the carrots the way they want, even though I think this other way is better”.[1] The opposite of deferring to authority is acting on one's own conscience.
Deferring to authority — and the reasonable expectation of such deferring — enables groups of people to coordinate more effectively. Militaries rely on it, but so do most projects (large and small, but especially large). It's unreasonable to expect that everyone working on a large software project will have exactly the same views over the key top-level design choices, but it's better if there's some voice that can speak authoritatively, so everyone can work on that basis. If we collectively want to be able to undertake large ambitious projects, we’ll likely need to use deferring to authority as a tool.
Ways deferring goes wrong
Deferring without common knowledge of deferring is a risk factor for these issues (since it's less likely that anyone is going to spot and correct them).
Social deferring
Often there’s a lot of deferring within a group or community on a particular issue (i.e. both the person deferring and the person being deferred to are within the group, and the people being deferred to often have their own views substantially via deferring). This can lead to issues, for reasons like:
Ultimately we don’t have good alternatives to basing a lot of our beliefs on chains of deferral (there are too many disparate disciplines of expertise in the world for anyone to personally know who the experts worth listening to are in each one). But I think it’s helpful to be wary of the ways this can cause problems, and we should feel relatively better about:
When & how to defer
Epistemic deferring
There's frequently a tension between on the one hand knowing that you can identify someone who knows more than you, and on the other hand not wanting to take the time to get answers from them, or wanting to optimize for your own learning rather than just the best answer for the question at hand.
Here are the situations where I think epistemic deferring is desirable:
Note: even when not deferring, asking for advice is often a very helpful move. You can consider the advice and let it guide your thinking and how to proceed without deferring to any of the advice-givers.[2]
Deferring to authority
Working out when to defer to authority is often simply a case of determining whether you want to participate in the social contract.
It's often good to communicate when you're deferring, e.g. tell your boss "I'm doing X because you told me to, but heads up that Y looks better to me". Sometimes the response will just be "cool"; at other times they might realize that you need to understand why X is good in order to do a good job of X (or that they need to reconsider X). In any case it's helpful to keep track for yourself of when you're deferring to authority vs have an independent view.
A dual question of when to defer to authority is when to ask people to defer to you as an authority. I think the right answer is "when you want someone to go on following the plan even if they’re not personally convinced". If you’re asking others to defer, it’s best to be explicit about this. Vice versa, if you’re in a position of authority and not asking others to defer, it’s good to be explicit that you want them to act on their own conscience. (People take cultural cues from those in positions of authority; if they perceive ambiguity about whether they should defer, it may be ambiguous in their own minds, which seems bad for the reasons discussed above.)
Deferring to authority in the effective altruism community
I think people are often reluctant to ask others to defer to their authority within EA. We celebrate people thinking for themselves, taking a consequentialist perspective, and acting on their own conscience. Deferring to authority looks like it might undermine these values. Or perhaps we'd get people who reluctantly "deferred to authority" while trying to steer their bosses towards things that seemed better to them.
This is a mistake. Deferring to authority is the natural tool for coordinating groups of people to do big things together. If we're unwilling to use this tool, people will use social pressure towards conformity of beliefs as an alternate tool for the same ends. But this is worse at achieving coordination[3], and is more damaging to the epistemics of the people involved.
We should (I think) instead encourage people to be happy taking jobs where they adopt a stance of "how can I help with the agenda of the people steering this?", without necessarily being fully bought into that agenda. This might seem a letdown for individuals, but I think we should be willing to accept more "people work on agendas they're not fully bought into" if the alternatives are "there are a bunch of epistemic distortions to get people to buy into agendas" and "nobody can make bets which involve coordinating more than 6 people". People doing this can keep their eyes open for jobs which better fit their goals, while being able and encouraged to have their own opinions, and still taking professional pride in doing a good job at the thing they're employed to do.
This isn't to say that all jobs in EA should look like this. I think it is a great virtue of the community that we recognise the power of positions which give people significant space to act on their own conscience. But when we need more coordination, we should use the correct tools to get that.
Meta-practices
My take on the correct cultural meta-practices around deferring:
Closing remarks
A lot of this content, insofar as it is perceptive, is not original to me; a good part of what I'm doing here is just trying to name the synthesis position for what I perceive to be strong pro-deferral and anti-deferral arguments people make from time to time. This draft benefited from thoughts and comments from Adam Bales, Buck Shlegeris, Claire Zabel, Gregory Lewis, Jennifer Lin, Linch Zhang, Max Dalton, Max Daniel, Raymond Douglas, Rose Hadshar, Scott Garrabrant, and especially Anna Salamon and Holden Karnofsky. I might edit later to tighten or clarify language (or if there are one or two substantive points I want to change).
Should anyone defer to me on the topic of deferring?
Epistemically — I've spent a while thinking about the dynamics here, so it's not ridiculous to give my views some weight. But lots of people have spent some time on this; I'm hoping this article is more helpful as a guide that lets people understand things they already see than as something people need to defer to.
As an authority — not yet. But I'm offering suggestions for community norms around deferring. Norms are a thing which it can make sense to ask people to defer to. If my suggestions are well received in the discussion here, perhaps we'll want to make asks for deference to them at some point down the line.
Some less central examples of deferring to authority in my sense:
cf. https://www.lesswrong.com/posts/yeADMcScw8EW9yxpH/a-sketch-of-good-communication
At least “just using ideological conformity” is worse for coordination than “using ideological conformity + deference to authority”. After we’re using deference to authority well I imagine there’s a case that having ideological conformity as well would help further; my guess is that it’s not worth the cost of damage to epistemics.