Follow-up to: Knowing About Biases Can Hurt People
See also: Fully General Counterargument (LW Wiki)
A fully general counterargument [FGCA] is an argument which can be used to discount any conclusion the arguer does not like.
The caveat: the arguer need not be aware that this is the case. But if (s)he is not aware of it, this resembles the other biases we are prone to. The question is: is there a tendency or risk to accidentally form FGCAs? Do we fall easily into this mind-trap?
This post tries to (non-exhaustively) list some FGCAs as well as possible countermeasures.
The List
Here is a list of my own making:
- My opponent is a clever arguer against whom I have no chance. This may in itself be a humble stance, but used unconditionally in all cases it is an FGCA. See also the bottom line of this post.
- Fallacies and biases - My opponent is just showing that (s)he falls prey to fallacies or biases, so I do not need to deal with any argument in particular.
- Specific sub-cases/examples:
- Experts are fallacious, thus you can't trust them (the first example in Knowing About Biases)
- You see only confirming evidence and ignore the negative evidence I provide.
- You are overconfident. I don't need to deal with your arguments until you become more humble.
- You think everybody thinks like you and should trivially understand what you suggest.
- Vagueness - real-life aspects do not admit precise definitions and this vagueness propagates into all arguments.
- Words are ambiguous and all definitions refer back to words.
- The fallacy of grey, i.e. the belief that because nothing is certain, everything is equally uncertain.
- Life is complicated. There are so many dependencies in real, complex life that factual judgements cannot be made.
- Nihilism - Life has no meaning, and therefore any human endeavor - including argument - is meaningless.
- Everybody is entitled to his own opinion - I don't like your opinion, but you may have your own. It's not important that we differ.
- Humans are different. Your argument applies only to you because you are special.
- Is so! ('I'll scream otherwise.') I am emotionally attached to my view, and I will scream/run away/use ad hominem if you don't get how important this is to me!
- Applying hard science blindly. This yields true but over-general arguments which do not contribute to the actual question but merely appear to answer it. Examples:
- Evolution. You, your argument, and the thing you are arguing about have evolved and exist only because of evolution, not because they are true or valid.
- Energetically more favourable. The state you are arguing for exists because it is energetically more efficient, and therefore more likely and more stable, not for the other reasons you might be giving.
- Meta: The opponent is using fully general counterarguments.
Do you know some more? Into what clusters do these FGCAs fall?
Self-sealing Belief
Why do we use FGCAs? One reason may be that we are arguing from within a self-sealing belief:
Self-sealing beliefs are those where you're erroneously convinced that some desirable result is caused by taking some particular kind of action. Subsequent failure of the desirable result to occur is not used as disconfirming evidence that you're wrong to be convinced that way, but is instead used as evidence of a need for more of that action.
Preventive Action
What are known ways to avoid FGCAs?
One specific method against this mind trap is being humbly gullible.
It strikes me that if you start off too gullible you begin with an important skill: you already know how to change your mind. In fact, changing your mind is in some ways your default setting if you're gullible. And considering that like half the freakin sequences were devoted to learning how to actually change your mind, starting off with some practice in that department could be a very good thing.
Another is to practice steelmanning, as long as you avoid the dangers of steelmanning. Especially applicable is Steelmanning Inefficiency.
More general advice can of course be found in the Twelve Virtues of Rationality. See also the concise and improved versions.
I like this idea, and I'd like to see it developed further. I don't see any reason why FGCAs shouldn't be catalogued and learned alongside logical fallacies for the same reasons.
I guess the important distinction would be that certain FGCAs can be used non-fallaciously, and some of these seem to have valid use-cases, like pointing out confirmation bias and mind-projection fallacy. Others are fallacious in their fully-general form, but have valid uses in their non-fully-general forms, so it is important to distinguish these. (e.g. pointing out vagueness or that something is too complicated and has too many dependencies for a given argument to have much weight.)
Great post!
I apologize for mentioning this, but there were a lot of typos in this post, which made it a bit hard to read. I want to link it to a few friends who are not LWers, but when I am not familiar with the source of something, typos make me question the credibility of the author (they also provide an easy excuse to discount things people don't want to hear). I don't want that to happen when I show people, so I figured I'd help you out if you feel like cleaning it up a bit. Here's a quick list I put together for you:
Awesome. Thank you for the very actionable response.
Typos fixed.