Follow-up to: Knowing About Biases Can Hurt People

See also: Fully General Counterargument (LW Wiki)

A fully general counterargument [FGCA] is an argument which can be used to discount any conclusion the arguer does not like.

With the caveat that the arguer doesn't need to be aware that this is the case. But if (s)he is not aware of it, this resembles the other biases we are prone to. The question is: Is there a tendency or risk of accidentally forming FGCAs? Do we fall easily into this mind-trap?

This post tries to (non-exhaustively) list some FGCAs as well as possible countermeasures.

The List

Here is a list of my own making:

  • My opponent is a clever arguer against whom I have no chance. This in itself may be a humble stance, but if used unconditionally in all cases it is an FGCA. See also the bottom line of this post.
  • Fallacies and biases - My opponent is just showing that (s)he falls prey to fallacies or biases. Thus I do not need to deal with any argument in particular.
  • Specific sub-cases/examples:
    • Experts are fallacious, thus you can't trust them (the first example in Knowing About Biases)
    • You see only confirming evidence and ignore the negative evidence I provide.
    • You are overconfident. I don't need to deal with your overconfidence until you become more humble.
    • You think everybody thinks like you do and should trivially understand what you suggest.
  • Vagueness - real-life concepts do not admit precise definitions, and this vagueness propagates into all arguments.
  • Words are ambiguous and all definitions refer back to words.
  • The fallacy of grey, i.e. the belief that because nothing is certain, everything is equally uncertain.
  • Life is complicated. There are so many dependencies in real complex life that factual judgements cannot be made.
  • Nihilism - Life has no meaning and therefore any human endeavor - including arguments - is meaningless.
  • Everybody is entitled to his own opinion - I don't like your opinion but you may have your own. It's not important that we differ.
  • Humans are different.  Your argument applies only to you because you are special.
  • Is so! ('I scream otherwise'). I am emotionally attached to my view and I will scream/run away/use ad hominem if you don't get how important that is to me!
  • Applying hard science blindly. This yields true but over-general arguments which do not address the actual question but merely appear to answer it. Examples:
    • Evolution. You, your argument, and the thing you are arguing about have evolved and exist just because of evolution, not because they are true or valid.
    • Energetically more favourable. The state you are arguing for exists because it is energetically more efficient and therefore more likely and more stable, not for the other reasons you might be giving.
  • Meta: The opponent is using fully general counterarguments.

Do you know some more? Into what clusters do these FGCAs fall?

Self-sealing Belief

Why do we use FGCAs? One reason may be that we are arguing from within a self-sealing belief:

Self-sealing beliefs are those where you're erroneously convinced that some desirable result is caused by taking some particular kind of action. Subsequent failure of the desirable result to occur is not used as disconfirming evidence that you're wrong to be convinced that way, but is instead used as evidence of a need for more of that action.

See e.g. here and here.

Preventive Action

What are known ways to avoid FGCAs?

One specific method against this mind trap is being humbly gullible:

It strikes me that if you start off too gullible you begin with an important skill: you already know how to change your mind. In fact, changing your mind is in some ways your default setting if you're gullible. And considering that like half the freakin sequences were devoted to learning how to actually change your mind, starting off with some practice in that department could be a very good thing.

Another is to practice Steelmanning as long as you avoid the dangers of steelmanning. Especially applicable is Steelmanning Inefficiency.

More general advice can of course be found in the Twelve Virtues of Rationality. See also the concise and improved versions.

 

Comments

Some of those don't appear to be fully general counterarguments.

for example: "Meta: The opponent is using fully general counterarguments."

does not work if your opponents counterarguments aren't examples of fully general counterarguments.

After all, this isn't just a list of things people can shout. That would just be a list of annoying arguments.

Rather, it's fine to say "that's an FGCA" if it's an FGCA, and not fine if it's not.

FGCAs derail conversations. Categorizing "that's an FGCA" as an FGCA is feeding the trolls.

If someone accuses you of making an FGCA when you didn't, you can always just explain why it's not an FGCA. Otherwise, you f**ked up. Admit your error and apologize.

A few more FGCAs. Some of these aren't really arguments, but anyone mounting an FGCA is unlikely to be concerned about epistemological propriety, and all of these things can be used together in support of each other.

  1. "Check your privilege."

  2. "Check your premises."

  3. Psychologising the opposition.

  4. "This is settled science."

  5. Emphasis and reassertion of the thesis.

  6. "Oh. My. God."

  7. "*facepalm*"

  8. "Ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha!"

  9. Satire.

  10. Parody.

  11. Pretty much anything, said in your own echo chamber. NB. Link to your allies' actual postings, but not to your enemies'. A link to the top of their blog is better.

  12. Cite some other blogger and say they've completely eviscerated the enemy.

  13. Irregular verbs. ("We are right/You are wrong." "These studies prove/Those studies are flawed." "We have the support of.../You are in the pocket of...")

And a Fully General way of generating all of these and more: bottom-lining. Nail your desired conclusion to a wall marked "TRUE" and your opponents' to a wall marked "FALSE", and ask the virtual outcome pump in your head to fill in the empty space on the walls. There is no conclusion, however absurd or repellent, that you cannot do this with. The smarter and better educated you are, the more puzzle pieces you have available to play with and the more easily you can invent new ways to put them together. Related quote from Foucault's Pendulum.

Someone said to me "you're just repeating a lot of the talking points on the other side."

I pointed out that this was just a FGCA, so they linked to this post and said "Oh what tangled webs we weave when first we practice to list Fully General Counter Arguments. Of course that sentiment probably counts as a Fully General Counterargument: Round like a circle in a spiral, like a wheel within a wheel. Never ending or beginning on an ever spinning reel." Did I break him?

[anonymous]

To add to this list, Snark. It's way more effective than satire or parody because you can claim innocence if called out on it.

[anonymous]

On psychologizing: the problem is, I think you have to do that when you find out you have a connotation problem on your hands, not a denotation problem. You see some guy giving PUA (pick-up artist) advice. The denotation - the actually actionable ideas about how to find a girlfriend - may or may not work; you don't know. But you see really problematic connotations behind the words and the way they are used: they tend towards hostility and enmity felt towards women, towards a cultish mindset, towards manipulating guys through their sense of pride, and so on. But it is the connotation, not the denotation. You can debate denotations rationally, but not connotations, and yet it seems the main problems always come from connotations.

I must admit the only solution to the connotation problem I have found - and it really does not work today - is that you must have a community with a shared sense of connotations, where people may disagree only about the denotations. So they can argue about ideas rationally, but they must feel emotionally the same way about the major concepts and simply exclude everybody else.

For example, I am one of those atheists who can argue with religious folks, because my denotation of the Catholic church is that it is about worshipping something that does not exist, but my general connotation/feeling is that it is a nicely moderate civilizing force led by quite logical folks. So my attitude is positive enough that they don't get defensive and angry, and we can stick to the rational, denotational level. But of course today you cannot arrange things so that only those atheists who don't hate religion get to debate it. So it does not really work.

I don't see any way to keep people denotationally rational other than to enforce connotational harmony somehow, which is essentially policing feelings. I know some folks here who would restore capital punishment. To have a productive debate we must first ensure we feel the same connotationally - such as that life is relatively sacred but not absolutely so, or something along those lines - before we can engage the question productively.

I seriously don't know how else to deal with connotational problems.

"You are one of the many people in your cult who assert P. But there are also a lot of people in your cult who assert ~P. Therefore, people in your cult believe the contradiction P & ~P, and you are all idiots who should quit your deceptive mind-killing cult and join mine."

"Yeah, I used to believe that too when I was your age."
"Yeah, I used to believe that too when I didn't have the education / resources / cult-membership to know better."
"Yeah, I used to believe that too until I realized it was stupid."
"Yeah, I used to believe that too when I hung out with bad people."
"Yeah, I used to believe that too when I was high all the time / before I'd ever dropped acid."
"Yeah, I used to believe that too before I converted / deconverted."
"Yeah, I used to believe that too when I was rich / poor."
"Yeah, I used to believe that too before I got therapy / Jesus / a copy of The Fountainhead."

The cult one can be legitimate if the cult claims to have a source of absolute, no-interpretation-needed truth and various cult members use it as a reason to believe (separately) in P and ~P. While no individual person believes contradictory things, just the fact that the cult's "absolute truth" can be used to deduce two contradictory things is a sign that it isn't very good as a source of absolute truth.

Another one:

"Yes, everyone thinks they're a special snowflake that this doesn't apply to."

Couldn't this be a legitimate application of the outside view?

Like stopped clocks, all Fully General Arguments sometimes have true conclusions.

I think it would be good to separate the analysis into FGCAs which are always fallacious, versus those that are only warning signs/rude. For instance, the fallacy of grey is indeed a fallacy, so using it as a counter-argument is a wrong move regardless of its generality.

However, it may in fact be that your opponent is a very clever arguer or that the evidence they present you has been highly filtered. Conversationally, using these as a counter-argument is considered rude (and rightly so), and the temptation to use them is often a good internal warning sign; however you don't want to drop consideration of them from your mental calculus. For instance, perhaps you should be motivated after the conversation to investigate alternative evidence if you're suspicious that the evidence presented to you was highly filtered.

I'd add one for the internet age which I know myself to be guilty of using on occasion, though not without some justification:

"Your perception of [groups behavior]/[scale of problem]/[frequency of event]/[fraction of people who agree with you] is largely a product of the filter bubble in which you and your social circle interacts"

It's technically true of almost everything and of almost everyone's perceptions, but as such it can be applied in almost any argument against either side.

The iterated "but how do you know that" also works as a FGC, or gets your opponent to admit he is relying on an unjustified assumption.

This is a good one. More generally, it's sometimes called the "Why" Regress. Not just about how you know something, but about how something happened or came to be. It applies equally to science and religion.

Edit: "...know you know" => "...how you know"

The Big Lebowski: "Yeah, well, you know, that's just, like, your opinion, man".

What are known ways to avoid FGCAs?

Observing that it Proves Too Much. If it is Fully General, it always does.

I like this idea, and I'd like to see it developed further. I don't see any reason why FGCAs shouldn't be catalogued and learned alongside logical fallacies for the same reasons.

I guess the important distinction would be that certain FGCAs can be used non-fallaciously, and some of these seem to have valid use-cases, like pointing out confirmation bias and mind-projection fallacy. Others are fallacious in their fully-general form, but have valid uses in their non-fully-general forms, so it is important to distinguish these. (e.g. pointing out vagueness or that something is too complicated and has too many dependencies for a given argument to have much weight.)

Great post!


I apologize for mentioning this, but there were a lot of typos in this, which made it a bit hard to read. I want to link this to a few friends who are not LWers, but when I am not familiar with the source of something, typos make me question the credibility of the author (they also provide an easy excuse to discount things people don't want to hear). I don't want that to happen when I show people, so I figured I'd help you out if you feel like cleaning it up a bit. Here's a quick list I put together for you:

  • Add comma after "But if (s)he is not aware of that"
  • Change "prone of" to "prone to"
  • "counter measures" should be "countermeasures"
  • "against which" should be "against whom" in "...clever arguer against which"
  • Add comma after "humble stance"
  • Change "FCGA" to "FGCA" in first bullet of The List and in first sentence under headings of both Self-Sealing Belief and Preventative Action
  • The third bullet is empty and the fourth bullet seems like it is supposed to contain sub-cases of the missing third bullet
  • Under Nihilism, "Live" should be "Life" and the "-" needs to be closed after "including arguments"
  • "I don't like your opinion but I you are may have your own."
  • "Yur" => "Your" after "Humans are different"
  • In "the thing you are arguing about has evolved and exists just because of that, not because it is true or a valid argument." Just because of what? Evolution?
  • Opened paren but no contents or close-paren: "...more likely and more stable ("

Awesome. Thank you for the very actionable response.

Typos fixed.

Another:

"What observation would convince you?"

The beauty of this one is that the better established the idea you're arguing against, the stronger this argument becomes.

That's not exactly a counterargument, more a way to establish whether the other person could be swayed in any way.

"How could there be such a thing! it is true so there can be no such thing!" (fair enough, walk away, no point)

vs

"Well if we found some concrete and repeatable/observable example of matter/energy being created or destroyed that couldn't be explained then I'd accept that the law of conservation of energy could be bunk"

As an argument it doesn't actually counter anything but it might cause some of the audience/participants to give up and walk away.

"Well if we found some concrete and repeatable/observable example of matter/energy being created or destroyed that couldn't be explained then I'd accept that the law of conservation of energy could be bunk"

"Aha! So you ADMIT that scientists can't prove perpetual motion impossible. They educated stupid suppress my work because they have no answer! Time is four-sided!"

FGAs do not come alone.

To which the best response is to give up and walk away; if the person is arguing from a fundamentally different set of precepts, FGA or no, there's no point.

That same person could also argue that their position is true because Kermit the Frog has decreed it so.

The insidious subtlety of FGAs is that in a rational argument between sensible people, an FGA can be used without anyone saying things which are obviously untrue or obviously stupid/insane. It's simply that an FGA is so broad that either side could use it and have it feel like it supports their position.

Everybody is entitled to his own opinion - I don't like your opinion but you may have your own. It's not important that we differ.

Why not call it relativism, since that is what it is?

One particular strategy is the claim of "self" contradiction, when the contradiction is not between statements the opponent makes, but between some statements the opponent makes, and some statements that are deduced or inferred from what the opponent said and your own beliefs and concepts.

The fact that we are disagreeing is a strong indicator that there are contradictions between your priors and my priors and structural commitments on similar propositions, so that finding evidence of such contradictions is not really much of an indicator that you are wrong, let alone that you contradicted yourself.

Particularly when the claim of "self" contradiction comes from A critiquing B's theory without feedback or response from B, it's very easy for A to engage in a self-congratulatory kabuki argument against B, then shutter his mind from further consideration of B's argument.

One particular strategy is the claim of "self" contradiction, when the contradiction is not between statements the opponent makes, but between some statements the opponent makes, and some statements that are deduced or inferred from what the opponent said and your own beliefs and concepts.

This is actually a common result of attempting to steelman your opponent's argument.

My opponent must secretly be a shill; no wonder my arguments failed to sway his opinion. And I have no obligation to waste my time listening to shills and arguing with them.

Another FGCA (actually more general than that, it's an FGA):

"When I say X, you should update in that direction, because a world in which X is true is one in which you are more likely to hear X than one where it is false."

Donald Trump used an FGC when interviewed today to get out of a trap. I don't recall what he said verbatim but it went something like this:

Reporter: "Have you ever said something you regret?" Trump: "Of course." Reporter: "What was it?" Trump: "Now isn't the time to discuss it."

Well, that's a rather elegant way to get out of the trap, and the reporter wasn't exactly practicing good epistemic hygiene himself.

Another:

Argument by authority, lately reinvented as the Outside View. That may not sound fully general, but it's all in the choice of reference class.

A common one that I see works like this: the first person holds position A. A second person points out fact B, which provides evidence against position A. The first person responds, "I am going to adjust my position to position C: namely that both A and B are true. B is evidence for C, so your argument is now evidence for my position." Continue as needed.

Example:

First person: The world was created.
Second person: Living things evolved, which makes it less likely that things were created than if they had just appeared from nothing.
First person: The world was created through evolution. Facts implying evolution are evidence for this claim, so your argument now supports my position.

Continuing in this way allows the first person not only to maintain his original position, even if modified, but also to say that all possible evidence supports it.

(The actual resolution is that even if the modified position is supported by the evidence at issue, the modified position is more unlikely in itself than the original position, since the conjunction requires two things to be true, so following this process results in holding more and more unlikely positions.)
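
A sketch of that resolution in probability terms (A and B stand for the claims above; the algebra is standard):

    P(A & B) = P(A) * P(B | A) <= P(A)

Each rescue conjoins another claim onto the position, and a conjunction can never be more probable than its least probable conjunct, so a chain of such rescues drives the prior probability of the overall position steadily downward even while each new fact "supports" it.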

This is counterable, by pointing out that movement has occurred. If done honestly, it constitutes convergence, which is arguably desirable.

Experts are fallacious, thus you can't trust them

One very common example of this that I see abused all the time is an argument of the form "(X) was wrong about (Y), so therefore experts are worthless!"

Another FGA:

Setting the agenda/framing the discussion.

I don't know who first made the observation, but it's a commonplace of politics that if you can set the agenda, it doesn't matter how people vote.

Any time you find yourself crowing "so you admit that in some circumstances you would...!", you have probably committed this one.

Critical rationalists had a similar concept which they called an "immunizing strategy".

For those interested in the concept, it's probably worthwhile to look there for more info.

With a little poking around, I found a paper that gives a typology for immunizing strategies. I've only given it the most cursory glance. Caveat emptor. YMMV. IANAL.

[anonymous]

You're modelling and/or simulating the wrong entity/object/concept, thus committing the ecological fallacy of individual and aggregate correlates.

Another Fully General Argument:

"It's not proof, but it's weak Bayesian evidence."

I think the instinctual response to that argument should be a request to quantify the evidence. If someone says "Bayesian evidence" they should provide real numbers.
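
As a purely illustrative example of what "real numbers" could look like (the figures are made up): in the odds form of Bayes' theorem,

    posterior odds = likelihood ratio * prior odds

a likelihood ratio of 1.2 - genuinely weak evidence - takes a 50% prior (odds 1:1) to posterior odds of 1.2:1, i.e. a posterior of only about 55%. Stating the ratio makes it obvious how little the "weak Bayesian evidence" actually shifts the conclusion.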

No, that's a perfectly good argument unless your opponent can provide stronger evidence in the other direction.

Arguments are not resolved by seeing who can pile it higher and deeper. Not everything that is claimed to be evidence is.

If there is weak Bayesian evidence for one side of the issue and no evidence for the other side, that means that in total the issue is undecided and either side could be true.

That isn't really fully general because not everything is evidence in favor of your conclusion. Some things are evidence against it.

It is fully general at least in the sense that it admits a weak response which at the same time simulates compromise and weakens the other position.