Is there a name for the (I claim) extremely common practice of blithely and unconsciously always looking at your own view (political especially) in terms of its best possible outcomes, while always characterizing an opposing point of view by its worst possibilities?

If not, I think there should be. It seems like a major source of unfruitful argumentation.


Scott Lemieux once called this the "my-utopia-versus-your-grubby-reality asymmetry," a delightful turn of phrase which has stuck with me since I read it.

Although Lemieux was talking about something subtly different from, or possibly a subset of, what you're talking about: the practice of describing the benefits of your own preferences as if you could completely and down to the smallest detail redesign the relevant system from scratch, while insisting on subjecting your opponent's preferences to a rigorous "how do we get there from here" analysis with all the compromises, imperfections, and unforeseeable obstacles the real world always entails.

I like that, but maybe it's just a bit too long to stick.

It seems so automatic in so many people that I wouldn't be surprised if one day an associated neural mechanism were discovered.

It's like the opposite of considering the Least Convenient Possible World; the Most Convenient Possible World! Where everything on my side turns out as well as possible, and everything on yours turns out as badly as possible.

I don't have the answer but would be extremely interested in knowing it.

(Sorry this comment isn't more helpful. I am trying to get better at publicly acknowledging when I don't know an answer to a useful question in the hopes that this will reduce the sting of it.)

The general case in reference to libertarianism has started to be referred to as "Reductio Ad Somalium."

Specific cases may be instances of "Inconsistent Comparison," "Mind Projection Fallacy," or "Nirvana Fallacy," depending on the case in question. As an example of each, from my personal experiences:

Inconsistent Comparison: This is -extremely- common in political arguments, particularly those concerning gun control or healthcare; it's basically a matter of selectively picking statistics for comparison with specific countries. Individual comparisons don't qualify, incidentally; it's when somebody picks and chooses which comparisons to make over the course of an argument that it starts to qualify. Talking about gun crime in Australia, then violent assault in Japan, and so on.

Mind Projection Fallacy is hard to pin down to particular cases; it's -too broad- a fallacy, and pervasive throughout every political argument. Functionally, however, any time someone assumes their favored policies will work, and compares those working policies to your assumed-to-fail policies, that's an example. (Evidence-based arguments are not the same. This is a tricky one, because the difference between projection and incorrect belief based on erroneous information is hard to pin down.) It's most common in things like social justice, where arguers assume society is the way they observe it. (This pertains to -both- sides of any such debate.) "Privilege" arguments are closely related to this fallacy, and tend both to point out cases of it and to represent cases of it.

Nirvana Fallacy: Comparison to some perfect, generally unnamed alternative, or to the ideal case. It's common in political arguments for people to point out that libertarianism results in some people living in poverty; this is a case of the Nirvana Fallacy, as -every- governmental form yet tested has -also- resulted in some people living in poverty. The Nirvana Fallacy is most common in cases where no specific alternative is offered.

Special pleading?


It seems like a variation of moving the goalposts or some similar double standard.

I propose calling it Implementation Success Bias, or maybe just Expected Success Bias.

There is a much more serious subset of this, in which people (perhaps consciously) look at the worst possible outcome of their opponent's view and then claim that their opponent is eeeevil and specifically desires that outcome.

I have to say I agree with being annoyed at how, whenever people have a Great Idea, they tend to COMPLETELY ignore its actual implementation (often jumping straight to considering problems that would arise once the idea is already implemented).

Motivated overconfidence? :D