When you’re communicating with people who know more than you, you have two options. You can accept their greater knowledge, which pushes you to speak more honestly about the pertinent topics. Or you can reject their credibility, claiming that they don’t really know more than you; many people who know less than both of you may believe you over them.
A third option is to claim epistemic learned helplessness. You can believe that someone knows more than you, yet reject their claims because they have incentives to deceive. It's even possible to coordinate openly around this. This seems like something I've seen people do, maybe even frequently. I can't think of anything specific, but one method would be to portray the more knowledgeable person as "using their power [in the form of knowledge] for evil".
It's a good point.
The options are about how you talk to others, rather than how you listen to others. So if you talk with someone who knows more than you, "humble" means that you don't act overconfidently, because they could call you out on it. It does not mean that you aren't skeptical of what they have to say.
I definitely agree that you should often start out skeptical. Epistemic learned helplessness seems like a good phrase, thanks for the link.
One specific area where I could see this coming up is when you have to debate someone you are sure is wrong, but who has far more practice debating. They may know all the arguments and counter-arguments, and would destroy you in any regular debate, but that doesn't mean you should trust them, especially if you know there are better experts on the other side. You could probably find great debaters on both sides of any controversial topic.
Claim 2: It’s difficult to judge where on the curve people who are higher than you sit, absent sophisticated systems to support this.
Eliezer mentioned something like this in The Level Above Mine.
This nicely explains why I feel so embarrassed when I learn that someone I'm talking with is more knowledgeable than I thought. I wonder how to avoid subconsciously projecting overconfidence or humility.
It might work to add a TAP for asking yourself, "if this person were much more/less knowledgeable than me, would I present myself the same way in this conversation?"
That's a good point. My communication changes a lot too and it's one reason why I'm often reluctant to explain ideas in public rather than in private; it's much harder to adjust the narrative and humility-level.
To be a bit more specific: I think there are multiple reasons why you would communicate in different ways with people at different levels of knowledge. One is that you can "get away with more" around people who know less than you. But another is that you would expect people at different parts of the curve to know different things and talk in different ways, so even if you optimized purely for their true learning, the results would be quite different.
As I read through, the core model fit well with my intuition. But then I was surprised when I got to the section on religious schisms! I wondered why we should model the adherents of a religion as trying to join the school with the most 'accurate' claims about the religion.
On reflection, it appears to me that the model probably holds roughly as well in the religion case as the local radio intellectual case. Both of those are examples of "hostile" talking up. I wonder if the ways in which those cases diverge from pure information sharing explains the difference between humble and hostile.
In particular, perhaps some audiences are looking to reduce cognitive dissonance between their self-image as unbiased on the one hand and their particular beliefs and preferences on the other. That leaves an opening for someone to sell reasonableness/unbiasedness self-image to people holding a given set of beliefs and preferences.
Someone making reasonable counterarguments is a threat to what you've offered, and in that case your job is to provide refutations, counterarguments, and discrediting so that it's easy for that person's arguments to be dismissed (through a mixture of claimed flaws in their arguments and claimed flaws in the person promoting them). This would be a 'hostile' talking up.
Also, we should probably expect it to be hard to distinguish between some hostile talking-ups and overconfident talking-downs. If we could always distinguish them, hostile talking up would be a clear signal of defeat.
Good points.
I would definitely agree that people are generally reluctant to blatantly deceive themselves. There is definitely some cost to incorrect beliefs, though it can vary greatly in magnitude depending on the situation.
For instance, say all of your friends go to one church, and you start suspecting your local minister of being less accurate than others. If you actually don't trust them, you could either pretend you do and live as such, or be honest and possibly have all of your friends dislike you. You clearly have a strong motivation to believe something specific here, and I think incentives generally trump internal honesty.[1]
On the last part, I don't think that "hostile talking up" is what the hostile actors want to be seen as doing :) Rather, they would be trying to make it seem like the people previously above them are really below them. To themselves and their followers, they appear to be at the top of the relevant distribution.
[1] There's been a lot of discussion recently about politics being tribal, and I think it makes a lot of pragmatic sense. link
I don't think it's inherently difficult to tell the difference between someone who is speaking N levels above you and someone who is speaking N+1 levels above you. The one speaking at a higher level is going to expand on all of the things they describe as errors, giving *more complex* explanations.
The difficulty is that it's impossible to tell whether someone who is at a higher level than you is wrong, telling a sophisticated lie, correct, or something else. The only way to understand how they reached their conclusion is to level up to where they are and understand it the hard way.
There's a related problem, where it's nigh impossible to tell if someone who is actually at level N but speaking at level N+X is making shit up completely unless you are above the level they are speaking at (and can spot errors in their reasoning).
Take a very simple case: A smart kid explaining kitchen appliances to a less smart kid. First he talks about the blender, and how there's an electric motor inside the base that makes the gear thingy go spinny, and that goes through the pitcher and makes the blades go spinny and chop stuff up. Then he talks about the toaster, and talks about the hot wires making the toast go, and the dial controls the timer that pops the toast out.
Then he goes +x over his actual knowledge level, and says that the microwave beams heat radiation into the food, created by the electronics, and that the refrigerator uses an 'electric cooler' (the opposite of an electric heater) to make cold that it pumps into the inside, and the insulated sides keep it from making the entire house cold.
Half of those are true explanations, and half of those are bluffs, but someone who barely has the understanding needed to verify the first two won't have the understanding needed to refute the last two. If someone else corrects the wrong descriptions, said unsophisticated observer would have to use things other than the explanations to determine credibility (in the toy cases given, a good explanation could level up the observer enough to see the bluff, but in the case of +5 macroeconomics that is impractical).

If the bluffing actor tries to refute the higher-level true explanation, they merely need to bluff more; people at a high enough level to see the bluff /already weren't fooled/, and people at lower levels see the arguments settle into an equilibrium or cycle isomorphic to all parties saying "That's not how this works, that's not how anything works; this is how that works." They can only distinguish between the parties by things other than the content of what they say (bias, charisma, credentials, tribal affiliation, or verified track records are all within the Overton Window for how to select who to believe).
I guess this applies mostly to politicized topics? I also like Robin's advice to "pull the rope sideways".
Perhaps, if you have a broad definition of politicized. To me this applies to many areas where people are overconfident (which happens everywhere): lots of entrepreneurs, academics, "thought leaders", and all the villains of Expert Political Judgment.
To give you a very different example, take a tour guide in San Francisco. They probably know way more about SF history than the people they teach. If they happen to be overconfident for whatever reason, no one is necessarily checking them. I would imagine that if they ever gave tours to SF history experts, their stated level of confidence in their statements would be at least somewhat different.
It's a frequency distribution ordered by amount of knowledge on a topic. The Y-axis of a distribution is frequency, but the units aren't very useful here; the shape is the important part, because it's normalized to total 1.
I'm generally hesitant to get into this line of thinking (and others like it) because knowledge is such a thoroughly multi-dimensional space, and usually the ends people are trying to reach with these kinds of models aren't terribly realistic.
I think the true answer is that it's both hard to know what anyone knows about a given field and it also very rarely matters. It reminds me of the talk "Superintelligence: The Idea That Eats Smart People" -- there's a habit among intellectuals, academics, and learned professionals (usually in that order) to get so caught up in their work that they think it intrinsically matters, when, really, nothing does (at least not to everyone).
You can be very "knowledgeable" in a field, double down on the wrong side of a schism, and then see years of your work become nearly worthless when your mental framework is empirically proven wrong. That work might also turn out to be useful again decades later for secondary or even unrelated reasons; when and where are you more or less knowledgeable than your peers here?
And to circle back to the Superintelligence talk: we as humans are very adept at finding ways to survive and thrive despite all kinds of uncertainty and threats, and one of the best tools we have for that is ignoring things until they're a major problem for us. In your radio intellectual example, I'd put forward that those kinds of situations arise because the presence or absence of such figureheads (or demagogues) doesn't generally matter to most people most of the time. When such people become burdensome and overbearing in their demands, they are ousted--entire governments have been bloodily overthrown from within and without for such reasons. That feels inefficient to the person who thinks such fields and their heads matter, but it's generally good enough.
My last point would just be that if it's really hard to know how much more knowledgeable than you someone is, how can you have confidence that someone knows more about specific sub-niche X than you, and not just more about overall field Y? Einstein probably knew more about physics on the whole than just about anyone outside of a group that could fit into a single lecture hall, but if he looked at a suspension bridge's plans and wanted to "make corrections" I'd probably stick with a seasoned civil engineer unless they both agreed on review. The engineer would probably know more about the physics of suspension bridges in their home country than Einstein; if the latter were able to convince me otherwise, that's a question of societal status and political skill in general.
In response to your last point: I didn't really get into differences between similar areas of knowledge in this post; it definitely becomes a messy topic. I'd definitely agree that for "making a suspension bridge", I'd look to people who seem to have knowledge of "making suspension bridges" rather than knowledge of "physics, in general."
A Distribution of Knowledge
If one were to make a distribution of the amount of knowledge different people have about, say, macroeconomics, I would suspect the distribution to be roughly lognormal; it would have tails at both ends, but be heavily skewed to the right. Most people have almost no knowledge of macroeconomics, some have a bit, and then there is a long tail of fewer and fewer people who make up the experts.
The above graph doesn’t exactly resemble what I’d expect for macroeconomics, but it acts as a rough heuristic. The large numbers represent halvings of the remaining percentiles (the 3/4, 7/8, 15/16 marks, etc.).[1]
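To make that shape concrete, here's a minimal sketch in Python; the lognormal parameters and sample size are my own illustrative assumptions, not values taken from the graph. It samples a hypothetical "knowledge" score and prints the score at each halving of the remaining percentiles (1/2, 3/4, 7/8, 15/16, ...):

```python
import numpy as np

# Illustrative parameters only; nothing here is fit to real data.
rng = np.random.default_rng(0)
knowledge = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

# Halvings of the remaining percentiles: 50%, 75%, 87.5%, 93.75%, ...
for k in range(1, 7):
    pct = 100 * (1 - 0.5 ** k)
    level = np.percentile(knowledge, pct)
    print(f"{pct:g}th percentile: knowledge score ~ {level:.2f}")
```

With these (arbitrary) parameters, each successive halving of the remaining tail corresponds to a larger jump in the knowledge score, which is the long right tail described above: most of the mass sits at low scores, and the experts stretch far out to the right.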
I’m going to posit the following claims:[2]
Claim 1: It’s easy to judge where on the curve people who are lower than you sit.
Claim 2: It’s difficult to judge where on the curve people who are higher than you sit, absent sophisticated systems to support this.
Given these, let’s imagine a few situations:
Overconfident talking down, humble or hostile talking up
I think the answers I’d expect from these questions can be summarized in the phrase “Overconfident talking down, humble or hostile talking up.”
When you’re communicating with people who know less than you, and you have little accountability from people who know more, then you generally have the option of claiming to be more knowledgeable than you are, and lying in ways that are useful to you.
When you’re communicating with people who know more than you, you have two options. You can accept their greater knowledge, which pushes you to speak more honestly about the pertinent topics. Or you can reject their credibility, claiming that they don’t really know more than you; many people who know less than both of you may believe you over them.
There are many examples of this. One particularly good one may be the history of schisms in religious organizations. Religious authorities generally know a lot more about their respective religions than the majority of laypeople. Each authority has a choice: they can either accept the knowledge of the higher authorities, or reject those authorities. If they reject the authority above them, they are incentivized to discredit that authority and express overconfidence in their own new beliefs. If they succeed, some followers will believe them, giving them both the assumption of expertise and the flexibility of not having to be accountable to other knowledgeable groups. If they defect from their previous authorities and fail, they may wind up in a very poor position, so after defecting it's very important to ensure that their existing audience gives them full support.
The Economics of Knowledge Signaling
In slightly more economic terms, one could say that there are strong signals going up the chain of knowledge (from the nonexperts to the experts), and weak signals going down it. The market for knowledgeable expertise is one with relatively low transparency and the typical incentives to lie and deceive, similar to a market for lemons.
I'm not claiming that all of this overconfidence and discrediting is knowingly dishonest.[3] I'm also not claiming that this is original; much of it is quite obvious and parts are definitely studied. That said, I do get the impression that the science of signaling is still pretty overlooked (much of my thinking here comes from Robin Hanson), and this is one area that I think may not be well understood as a holistic economic system.
Finally, I'm reminded of the old joke:
One may wonder what incentives seem to lead to such heartfelt but predictably frequent divisions.