We've all had arguments that seemed like a complete waste of time in retrospect. But at the same time, arguments (between scientists, policy analysts, and others) play a critical part in moving society forward. You can imagine how lousy things would be if no one ever engaged those who disagreed with them.
This is a list of tips for having "productive" arguments. For the purposes of this list, "productive" means improving the accuracy of at least one person's views on some important topic. By this definition, arguments where no one changes their mind are unproductive. So are arguments about unimportant topics like which Pink Floyd album is the best.
Why do we want productive arguments? Same reason we want Wikipedia: so people are more knowledgeable. And just like the case of Wikipedia, there is a strong selfish imperative here: arguing can make you more knowledgeable, if you're willing to change your mind when another arguer has better points.
Arguments can also be negatively productive if everyone moves further from the truth on net. This could happen if, for example, the truth was somewhere in between two arguers, but they both left the argument even more sure of themselves.
These tips are derived from my personal experience arguing.
Keep it Friendly
Probably the biggest barrier to productive arguments is the desire of arguers to save face and avoid publicly admitting they were wrong. Obviously, it's hard for anyone's views to get more accurate if no one's views ever change.
- Keep things warm and collegial. Just because your ideas are in violent disagreement doesn't mean you have to disagree violently as people. Stay classy.
- To the greatest extent possible, uphold the social norm that no one will lose face for publicly changing their mind.
- If you're on a community-moderated forum like Less Wrong, don't downvote something unless you think the person who wrote it is being a bad forum citizen (ex: spam or unprovoked insults). Upvotes already provide plenty of information about how comments and submissions should be sorted. (It's probably safe to assume that a new Less Wrong user who sees their first comment modded below zero will decide we're all jerks and never come back. And if new users aren't coming back, we'll have a hard time raising the sanity waterline much.)
- Err on the side of understating your disagreement, e.g. "I'm not persuaded that..." or "I agree that x is true; I'm not as sure that..." or "It seems to me..."
- If you notice some hypocrisy, bias, or general deficiency on the part of another arguer, think extremely carefully before bringing it up while the argument is still in progress.
Inquire about Implausible-Sounding Assertions Before Expressing an Opinion
If someone suggests something you find implausible, start asking friendly questions to get them to clarify and justify their statement. If their reasoning seems genuinely bad, you can refute it then.
As a bonus, doing nothing but ask questions can be a good way to save face if the implausible assertion-maker turns out to be right.
Be careful about rejecting highly implausible ideas out of hand. Ideally, you want your rationality to be at a level where, even if you started out with a crazy belief like Scientology, you'd still be able to get rid of it. But for a Scientologist to rid themselves of Scientology, they have to consider ideas that initially seem extremely unlikely.
It's been argued that many mainstream skeptics aren't really that good at critically evaluating ideas, just at dismissing ones that seem implausible.
Isolate Specific Points of Disagreement
Stick to one topic at a time, until someone changes their mind or the topic is declared not worth pursuing. If your discussion constantly jumps from one point of disagreement to another, reaching consensus on anything will be difficult.
You can use hypothetical-oriented thinking like conditional probabilities and the least convenient possible world to figure out exactly what it is you disagree on with regard to a given topic. Once you've creatively helped yourself or another arguer clarify beliefs, sharing intuitions on specific "irreducible" assertions or anticipated outcomes that aren't easily decomposed can improve both of your probability estimates.
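To make this concrete, here's a minimal sketch (mine, not part of the original tips) of decomposing a disagreement with conditional probabilities; the claim, the underlying assumption, and all of the numbers are hypothetical.

```python
# Toy decomposition of a disagreement via the law of total probability.
# Everything here is illustrative: the "claim", the assumption, and the numbers.

def p_claim(p_assumption, p_claim_given_assumption, p_claim_given_not):
    """P(claim) split across two scenarios: assumption true vs. false."""
    return (p_assumption * p_claim_given_assumption
            + (1 - p_assumption) * p_claim_given_not)

# Arguer A and arguer B disagree about the claim overall...
a = p_claim(p_assumption=0.8, p_claim_given_assumption=0.9, p_claim_given_not=0.2)
b = p_claim(p_assumption=0.3, p_claim_given_assumption=0.9, p_claim_given_not=0.2)

print(f"A's estimate: {a:.2f}")  # 0.76
print(f"B's estimate: {b:.2f}")  # 0.41
# ...but the decomposition shows they agree on both conditionals and only
# disagree about p_assumption. That "irreducible" assumption is the single
# point worth arguing about, rather than the headline claim.
```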
Don't Straw Man Fellow Arguers, Steel Man Them Instead
You might think that a productive argument is one where the smartest person wins, but that's not always the case. Smart people can be wrong too. And a smart person successfully convincing less intelligent folks of their delusion counts as a negatively productive argument (see definition above).
Play for all sides, in case you're the smartest person in the argument.
Rewrite fellow arguers' arguments so they're even stronger, and think of new ones. Arguments for new positions, even—they don't have anyone playing for them. And if you end up convincing yourself of something you didn't previously believe, so much the better.
If You See an Opportunity To Improve the Accuracy of Your Knowledge, Take It!
This is often called losing an argument, but you're actually the winner: you and your arguing partner both invested time to argue, but you were the only one who received significantly improved knowledge.
If you're worried about losing face or seeing your coalition (research group, political party, etc.) diminish in importance from you admitting that you were wrong, here are some ideas:
- Say "I'll think about it". Most people will quiet down at this point without any gloating.
- Just keep arguing, making a mental note that your mind has changed.
- Redirect the conversation, pretend to lose interest, pretend you have no time to continue arguing, etc.
Some of these techniques may seem dodgy, and honestly I think you'll usually do better by explaining what actually changed your mind. But they're a small price to pay for more accurate knowledge. Better to tell unimportant false statements to others than important false statements to yourself.
Have Low "Belief Inertia"
It's actually pretty rare that the evidence that you're wrong comes suddenly—usually you can see things turning against you. As an advanced move, cultivate the ability to update your degree of certainty in real time to new arguments, and tell fellow arguers if you find an argument of theirs persuasive. This can actually be a good way to make friends. It also encourages other arguers to share additional arguments with you, which could be valuable data.
One psychologist I agree with suggested that people ask
- "Does the evidence allow me to believe?" when evaluating what they already believe, but
- "Does the evidence compel me to believe?" when evaluating a claim incompatible with their current beliefs.
If folks don't have to drag you around like this for you to change your mind, you don't actually lose much face. It's only long-overdue capitulations that result in significant face loss. And the longer you put your capitulation off, the worse things get. Quickly updating in response to new evidence seems to preserve face in my experience.
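One way to see why the "allow" versus "compel" asymmetry matters (this is my framing with made-up numbers, not the psychologist's): the same evidence can clear a lenient "allow" bar while failing a strict "compel" bar, so which question you ask determines the verdict.

```python
# Illustrative only: the asymmetry modeled as two different thresholds
# applied to the same posterior probability. All numbers are made up.

def bayes_posterior(prior, likelihood_ratio):
    """Posterior probability from prior odds times a likelihood ratio."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

posterior = bayes_posterior(prior=0.5, likelihood_ratio=3.0)  # 0.75

ALLOW_THRESHOLD = 0.5    # "does the evidence allow me to believe?"
COMPEL_THRESHOLD = 0.95  # "does the evidence compel me to believe?"

print(posterior >= ALLOW_THRESHOLD)   # True: a congenial claim gets accepted
print(posterior >= COMPEL_THRESHOLD)  # False: an uncongenial claim gets rejected
# Same evidence, opposite verdicts: the asymmetry lives in the thresholds,
# not in the evidence. Low belief inertia means using one threshold for both.
```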
If your belief inertia is low and you steel-man everything, you'll reach the super chill state of not having a "side" in any given argument. You'll play for all sides and you won't care who wins. You'll have achieved equanimity, content with the world as it actually is, not how you wish it was.
The whole point of studying formal epistemology and debiasing (major topics on this site) is to build the skill of picking out which ideas are more likely to be correct given the evidence. This should always be worked on in the background, and you should only be applying these tips in the context of a sound and consistent epistemology. So really, the burden falls on the user of these tips: it's their responsibility to adhere to sound epistemic standards when conveying information.
As for the issue of changing minds, there is a continuum here. For instance, I might have a great deal of strong evidence for something like, say, evolution, yet there will be people for whom the inferential distance is too great to span in the course of a single discussion ("well, it's just a theory", "you can't prove it", etc.).
Relevant to the climate example, a friend of mine who is doing his doctorate in environmental engineering at Yale was speaking to the relative of a friend, a sort of 'naive' climate change denier: he has no grasp of how scientific data works, nor does he have any preferred alternative theory he's invested in. He's more the "well, it's cold out now, so how do you explain that?" sort. My friend tried to explain attractors and long-term prediction methods, but this was ineffective. Eventually he pointed out how unusually warm the winter had been that year, and that made the relative think a bit about it. So my friend exploited the other person's way of reasoning to defend his position. However, it didn't correct the other person's epistemology at all, and it left him with an equally wrong impression of the issue.
The problem with this approach (and really, in his defense, he was just looking to end the conversation) is that, should that person learn a bit more about the subject, he will realize he was deceived and will remember that the deceiver was a "global warming believer". In this particular case that isn't likely (he almost certainly will not go and study up on climate science), but it illustrates a general danger in presenting a false picture in order to vault inferential distance.
It seems like the key is to first assess the level of inferential distance between you and the other person, and craft your explanation appropriately. The difficult part is doing so without setting the person up to feel cheated once they shorten the inferential distance a bit.
So the difficulty isn't just in making these tips work better for correct positions (which has its own set of suggestions, like studying statistics and (good) philosophy of science), but also in being extremely careful when presenting intermediate stories that aren't quite right. This latter issue disappears if the other person has close to the same background knowledge as you, and you're right that in such cases it can become fairly easy to argue for something that is wrong, and even easier to argue for something that isn't as well settled as you think it is (probably the bigger danger of the two), leading you to misrepresent the strength of your claim. I think this latter issue is much 'stickier' and particularly relevant to LW, where you see people who appear to be extremely confident in certain core claims yet seem to have a questionable ability to defend them (often opting to link to posts in the Sequences, which is fine if you've really taken the time to work out the details, but this isn't always the case).