Let’s say you have well-informed opinions on a variety of topics. Without information about your long-term accuracy in each given area, how confident should you be in those opinions?

Here’s a quick heuristic for any area where other people have well-informed opinions about the same topics: your confidence should be a function of the distance of your estimate from the average opinion, and of the standard deviation of those opinions. I’ll call this the wisdom-of-crowds confidence level, because it can be justified by the empirical observation that the average of even uninformed guesses is typically a better predictor than most individual predictions.
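As a rough sketch, the heuristic can be written as a function of your z-score relative to the crowd. The specific falloff used here (1 / (1 + z)) is my own illustrative choice; the post only says confidence should decrease with distance measured in standard deviations:

```python
import statistics

def crowd_confidence(my_estimate, peer_estimates):
    """Heuristic confidence in my_estimate, based on how far it sits
    from the crowd's average, measured in standard deviations.

    Returns a score in (0, 1]: 1.0 when my estimate matches the mean
    exactly, shrinking as my z-score grows. The exact mapping from
    z-score to confidence is illustrative, not part of the heuristic.
    """
    mean = statistics.mean(peer_estimates)
    stdev = statistics.stdev(peer_estimates)
    z = abs(my_estimate - mean) / stdev
    return 1.0 / (1.0 + z)
```

For example, if peers guess 10, 12, 14, 16, and 18, an estimate of 14 (the mean) gets confidence 1.0, while an estimate of 20, nearly two standard deviations out, scores much lower.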

Why does this make sense?

The Aumann agreement theorem implies that rational discussants can, given enough patience and introspection, pass messages about their justifications until they eventually converge. Given that informed opinions share most of their evidence, the difference between the opinions is likely due to specific unshared assumptions or evidence. If that evidence were shared, unless the vast majority of the unshared assumptions were piled up on the same side, the answer would land somewhere near the middle. (This is why I was going to call the heuristic Aumann confidence, but I don’t think it quite fits.)

Unless you have a strong reason to assume you are a privileged observer, trading on inside information, or much better calibrated than other observers, there is no reason to expect this unshared evidence to be biased. And while this appears to contradict the conservation-of-expected-evidence theorem, it is actually something of a consequence of it, because we need to update on the knowledge that there is unshared evidence leading the other person to make their own claim.

This is where things get tricky: we need to make assumptions about joint distributions over unshared evidence. Suffice it to say that unless we have reason to believe our unshared evidence or assumptions are much stronger than theirs, we should end up near the middle. And that goes back to a different, earlier assumption: that others are also well informed.

Now that we’ve laid out the framework, though, we can sketch the argument.

  1. We can expect that our opinion should shift towards the average, once we know what the average is, even without exploring the other people’s unshared assumptions and data. The distance it should shift depends on how good our assumptions and data are compared to theirs.
  2. Even if we have strong reasons for thinking that we understand why others hold the assumptions they do, they presumably feel the same way about us.
  3. And why do you think your unshared evidence and assumptions are so great anyways, huh? Are you special or something?

Anyways, those are my thoughts.

Comments?

2 comments

"any area where other people have well-informed opinions"

It could be tricky to find out which areas are like this.

I mean, if I disagree with most people, I am probably going to suspect that their opinions are not well-informed, so I guess the core problem is assessing the "well-informedness" of people I disagree with.

For example, if I believe that someone is highly intelligent and has a lot of experience in X, and their explanation makes sense to me, or they are willing to listen to my arguments and then tell me where specifically I made a mistake... I would be quite willing to move my opinion towards theirs. No controversy here.

But there are also situations where people have strong opinions for wrong reasons, and they consider themselves well-informed because they talked to each other and read each other's blogs or books.

It's strange to me that you'd take the inside view when evaluating the well-informedness of others.

The usually correct solution is to realize that both your opinion and theirs are likely based on some incorrect assumptions. You just don't know which, and so this is a useful quick heuristic.