The deferrer will copy beliefs that they mistakenly impute to the deferred-to because those beliefs would explain the deferred-to's externally visible behavior. This pushes in the direction opposite to science, because science is precisely the practice of making beliefs come apart from their pre-theoretic pragmatic implications.
Clarification request: does this mean that, in addition to the stuff that the deferred-to opines, learners will take as advice stuff the author didn't mean to be opining?
I don't know whether the high-mindedness magisterium matters. I question whether that activity is actually philosophy rather than science (I guess there is a link through "natural philosophy"). Seems I don't know da way.
What I mean is: suppose the deferred-to has some belief X. To some extent X is a refined, theoretically consilient belief, and to some extent it's instead pre-theoretic: intuitive, pragmatic, unreliable, and potentially inconsistent with other beliefs. What happens when the deferred-to takes practical, externally visible action that is somehow related to X? Many of zer other beliefs will also play a role in that action, and many of those will themselves be largely pre-theoretical. Pre-theoreticality is contagious in action: for theoretical refinement to be expressed in action, previously used protocols have to be rethought, so the easiest way to act on X is to use what is functionally a more pre-theoretical version of X.
So if the deferrer is imputing beliefs based on action, they'll in general impute a more pre-theoretical belief; and they'll place extra drag on their own processes of theoretical refinement. Like, when they notice contradictions, instead of rethinking their concepts and assumptions, they'll avoid doing so, because that would contradict the apparent belief implied by the deferred-to's behavior.
(Sorry this isn't more clear or concrete. I think the history of phlogiston is an example of some of this, where two theories are nearly identical in terms of pre-theoretic behavioral implications / expectations (e.g., both theories say that fire will be snuffed out by being in an enclosed space); but then, by drawing out more implications, the threat of inconsistency forces one theory to become more and more complicated.)
Endorsed. I think what you should do about deferral depends on what role you wish to play in the research community. Knowledge workers intending to make frontier progress should be especially skeptical of deferring to others on the topics they intend to specialise in. That may mean holding off on deferring across a wide range of topics, because curious scientists should keep a broad horizon early on. Deferring early on could instil habits-of-thought that are hard to override later (sorta like the curse of knowledge), and you might miss out on opportunities to productively diverge, or even to discover a flaw in the paradigm.
Explorers should mostly defer on value of information, not object-level beliefs. When someone I trust says they're confident in some view I'm surprised by, I'm very reluctant to try to tweak my models to output what I believe they believe; instead I make a note to investigate what they've investigated, using my own judgment of things all the way through.
Yeah, VoI seems like a better place to defer. Another sort of general solution, which I find difficult but others might find workable, is to construct theories of other perspectives. That opens up a sort of unlimited space to defer: you can do something that looks like deferring, but is more precisely described as creating a bunch of mutually inconsistent theories in your head, and deferring to people about what their theory is, rather than about what's true. (I run into trouble because I'm not so willing to accept others' languages if I don't see how they're using words consistently.)
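If it helps, here's a minimal toy sketch (Python; all the names, like `Agent` and `note_claim`, are hypothetical, made up for this comment) of what "defer on VoI plus per-person theories" could look like as a data structure, as opposed to merging others' views into your own credences:

```python
# Toy sketch only (hypothetical names; nothing here comes from the post).
# It contrasts (a) deferring on value of information and (b) keeping
# separate, possibly mutually inconsistent models of what each person
# believes, with one's own object-level credences left untouched.

class Agent:
    def __init__(self):
        self.my_model = {}        # claim -> my own credence in it
        self.their_models = {}    # person -> {claim -> their credence}
        self.to_investigate = []  # (priority, claim), highest priority first

    def note_claim(self, person, claim, their_credence, trust):
        # Record their view in a model keyed by that person: deferring to
        # people about what *their theory* is, rather than about what's true.
        self.their_models.setdefault(person, {})[claim] = their_credence

        # Defer on VoI, not the object level: the more a trusted person's
        # confidence diverges from mine, the higher the priority of
        # investigating the claim myself. My own credence stays put.
        my_credence = self.my_model.get(claim, 0.5)
        priority = trust * abs(their_credence - my_credence)
        self.to_investigate.append((priority, claim))
        self.to_investigate.sort(reverse=True)

    def what_would_they_say(self, person, claim):
        # Queries about a person consult only that person's model; the
        # per-person models are never merged into one belief pool.
        return self.their_models.get(person, {}).get(claim)


me = Agent()
me.my_model["paradigm P is flawed"] = 0.2
me.note_claim("Alice", "paradigm P is flawed", their_credence=0.9, trust=0.8)
print(me.what_would_they_say("Alice", "paradigm P is flawed"))  # 0.9
print(me.to_investigate[0])  # (0.56..., 'paradigm P is flawed'): go look myself
```

The point of the shape: nothing ever writes to `my_model` except my own investigation, so Alice's theory and mine can stay inconsistent without either forcing an update on the other.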
Null pointer exceptions and segfaults are the most obvious risks. Oh wait - those are dangers of dereference.
While I don't disagree that delegating your beliefs (my preferred framing: deference is passive and hierarchy-based, delegation is active and implies an intentional choice of whom and what) has some cost, it also has massive benefits in efficiency, and probably in correctness, as long as you delegate to mainstream thinkers rather than contrarians.
[Written September 02, 2022. Note: I'm likely to not respond to comments promptly.]
Sometimes people defer to other people, e.g. by believing what they say, by following orders, or by adopting intents or stances. In many cases it makes sense to defer, since other people know more than you about many things, and it's useful to share eyes and ears, and coordination and specialization are valuable, and one can "inquisitively defer" to opinions by taking them as challenges to investigate further by trying them out for oneself. But there are major issues with deferring, among which are:
Together, these dynamics put deferral-based opinions under strong pressure to not function as actual beliefs, ones that can be used to make successful plans and can be ongoingly updated to track reality. So I recommend that people mostly don't defer.
Not to throw away arguments or information from other people, or to avoid investigating important-if-true claims, but to think as though thinking matters.