I notice that if you want to persuade me away from a position, it sometimes works to have me talk with two kinds of people: 1) people who have good reasons for disagreeing with my position, and 2) people who agree with my position for similar reasons and hold it even more strongly than I do.

In both cases, the difference in opinion forces me to re-examine my reasons for believing something, but the direction of the examination differs. Case 1 makes me ask: "Are these criticisms valid, or do I (or should I) support this position for some reason that the criticisms don't take into account?" Case 2 makes me ask: "This person believes this for basically the same reasons I do, so why haven't those reasons pushed me to a similar extreme? Do I actually have unacknowledged doubts about the validity of those reasons that deserve further consideration?"

In case 1, I am presented with criticisms that originated outside my own thought process; in case 2, I search my own thought process for criticisms generated within it. The source of the criticism is external in the first case and internal in the second, and a combination of the two may prove decisive where either one alone isn't enough.

Now the question is: are there reliable ways to induce one of these cases in situations where only the other is present, and I have reason to suspect that I'm being overconfident about something?

5 comments

I agree in general principle, but note that for evenness you should also be exposing yourself to people who have good reason to agree with your position, and to people who disagree with that position more strongly than you do. There's also a Reversed Stupidity effect to be wary of, and a discipline to distinguish the ick factor of a fallacious argument from the separate proposition that reality goes the other way.

I think there is a third case when considering arguments and positions, namely: "This person has arrived at a conclusion I agree with... but they have done so for reasons significantly different from my own."

Sure. I didn't mention that one because it isn't likely to reduce my confidence in a position.

I mentioned it specifically because encountering such a case often does reduce my confidence in my position, or at least compels me to go through the process of re-evaluating my position to look for weaknesses.

I think we sometimes operate under the assumption that people who arrive at the same conclusion used the same (or very similar) reasoning, but this is not always the case.

If you encounter someone who arrives at your conclusion using different premises, then either (1) their premises are faulty, (2) yours are faulty, (3) you've each found independent solid arguments, or (4) you've each been misguided and constructed faulty premises, and you both need an update.

Re: overconfidence. As I mentioned recently, what sometimes makes me doubt my views in a certain area (say, AGI feasibility) is seeing someone who holds the same views there for the same reasons, but whose views and reasoning in other areas (say, animal welfare) I find to have low credibility. My train of thought goes something like this:

  1. Everyone is biased to some degree, and neither of us is an exception.

  2. There must be a reason why our views on some topic diverge, even though we seem to have access to the same information and have thought seriously about the topic.

  3. One likely reason is that we have different priors and assign different weights to the same information, which is a polite way of saying that we have different biases.

  4. Independently holding similar views for similar non-trivial reasons probably means that we are similarly intelligent, so simply adopting or dismissing the other person's views where they diverge from mine would be the wrong thing to do.

  5. I can usually track the other person's apparent bias in the divergent area, but all I honestly know is that this is a relative bias, given my own views, since both of us have access to the same evidence.

  6. If I am honest with myself, I ought to admit that I could be the one who is biased, with some significant probability.

  7. The hard part is admitting that this probability can be close to 50% (I still don't alieve that).

  8. A significant relative bias in one area implies that similar biases can be present in other areas, including the one where we both hold similar views.

  9. This means I should assign a higher probability to being biased in the area of agreement.

...Aaaand usually what happens after that is that I soon forget about it and regress to my original entrenched position.
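
To make steps 8 and 9 concrete, here is a minimal Bayesian sketch of that update. Every number in it is a made-up assumption for illustration; the point is only the direction of the shift, not its size.

```python
# Toy Bayesian update for steps 8-9 above. All numbers are assumptions
# chosen for illustration, not measurements.
#
# H = "I am biased in the area where we agree"
# E = "someone who shares my views and reasoning in that area shows a
#      significant (relative) bias in some other area"

p_h = 0.30              # assumed prior that I'm biased in the agreement area
p_e_given_h = 0.70      # assumed: if I'm biased, I'm likelier to end up
                        # agreeing with similarly biased reasoners
p_e_given_not_h = 0.40  # assumed: E can still occur even if I'm not biased

# Bayes' rule: P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
p_h_given_e = p_e_given_h * p_h / p_e

print(f"P(biased) before seeing their bias elsewhere: {p_h:.2f}")          # 0.30
print(f"P(biased) after seeing their bias elsewhere:  {p_h_given_e:.2f}")  # ~0.43
```

As long as agreeing with a demonstrably biased reasoner is more likely when I'm biased than when I'm not (P(E|H) > P(E|~H)), the posterior has to move upward, which is exactly the concession that step 9 asks for.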