How Tim O'Brien gets around the logical fallacy of generalization from fictional evidence
It took me until my third reading of The Things They Carried to realize that it contains something very valuable to rationalists. In "The Logical Fallacy of Generalization from Fictional Evidence," EY explains how using fiction as evidence is bad not only because fiction is deliberately wrong in particular ways to make it more interesting, but more importantly because it does not provide a probabilistic model of what happened, and gives at best a bit or two of evidence that looks like a hundred or more bits of evidence.
Some background: The Things They Carried is a book by Tim O'Brien that reads as an autobiography in which he recollects various stories from his time as a soldier in the Vietnam War. However, O'Brien often repeats himself, writing the same story over again, but with details or entire events changed. It is actually a fictional autobiography; O'Brien was in the Vietnam War, but all the stories are fictional.
In The Things They Carried, Tim O'Brien not only explains how generalization from fictional evidence is bad, but also has his own solution to the problem that actually works, i.e. it gives the reader a useful probabilistic model of what happened in a way that actually interests the reader. He does this by telling his stories many times, changing significant things about them each time. Literally: he contradicts himself, writing out the same story but with things changed. The best illustration of the principle in the book is the chapter "How to Tell a True War Story," found here (PDF warning, and bad typesetting warning).
A reader is not inclined to read a list of probabilities, but they are inclined to read a bunch of short stories. He talks about this practice a lot in the book itself, writing, "All you can do is tell it one more time, patiently, adding and subtracting, making up a few things to get at the real truth. … You can tell a true war story if you just keep on telling it." He always says "war story," but the principle generalizes. At one point, he has a character represent the forces that act on conventional writing, telling a storyteller that he cannot say that he doesn't know what happened, and that he cannot insert any analysis.
O'Brien also writes about a lot of other things I don't want to mention more than briefly here, such as the specific ways in which the model that conventional war stories give of war is wrong, and specific ways in which the audience misinterprets stories. I recommend the book very much, especially if you think writing "tell multiple short stories" fiction is a great idea and want to do it.
I apologize if this post has been made before.
EDIT: Tried to clarify the idea better. I added an example with an excerpt.
EDIT 2: Added a better excerpt.
EDIT 3: Added a paragraph about background.
How to offend a rationalist (who hasn't thought about it yet): a life lesson
Usually, I don't get offended at things that people say to me, because I can see at what points in their argument we differ, and what sort of counterargument I could make. I can't get mad at people for having beliefs I think are wrong, since I myself regularly have beliefs that I later realize were wrong. I can't get mad at the idea, either, since an idea is either right or wrong, and if it's wrong, I have the power to say why. And if it turns out I'm wrong, so be it; I'll adopt new, right beliefs. And so I never got offended about anything.
Until one day.
One day, I encountered a belief that should have been easy to refute. Or, rather, easy to dissect, to see whether there was anything wrong with it, and if there was, to formulate a counterargument. But for seemingly no reason at all, it frustrated me to great, great lengths. My experience was as follows:
I was asking the opinion of a socially progressive friend on what they feel are the founding axioms of social justice, because I was having trouble thinking of them on my own. (They can be derived from any set of fundamental axioms that govern morality, but I wanted something that you could specifically use to describe who is being oppressed, and why.) They seemed to be having trouble understanding what I was saying, and it was hard to get an opinion out of them. They also got angry at me for dismissing Tumblr as a legitimate source of social justice. But eventually we got to the heart of the matter, and I discovered a basic disconnect between us: they asked, "Wait, you're seriously applying a math thing to social justice?" I pondered that for a moment and explained that it isn't restricted to math at all, and that an axiom in this context can be any belief on which you base your other beliefs. However, then the true problem came to light (after a comparison of me to misguided 18th-century philosophes): "Sorry if it offends you, I just don't think in general that you should apply this stuff to society. Like... no."
And that did it. For the rest of the day, I wreaked physical havoc, and emotionally alienated everyone I interacted with. I even seriously contemplated suicide. I wasn't angry at my friend in particular for having said that. For the first time, I was angry at an idea: that belief systems about certain things should not be internally consistent, should not follow logical rules. It was extremely difficult to construct an argument against, because all of my arguments had logically consistent bases, and were thus invalid in its face.
I'm glad that I encountered that belief, though, like all beliefs, since I was able to solve it in the end, and make peace with it. I came to the following conclusions:
- In order to make a rationalist extremely aggravated, you can tell them that you don't think that belief structures should be internally logically consistent. (After 12-24 hours, they acquire lifetime immunity to this trick.)
- Belief structures do not necessarily have to be internally logically consistent. However, consistent systems are better, for the following reason: belief systems are used for deriving actions to take. Many actions oriented towards the same goal will make progress in accomplishing that goal, and making progress towards goals is desirable. An inconsistent belief system will generate actions oriented towards shifting goals, which interfere destructively with each other and make little progress. A consistent belief system will generate many actions oriented towards the same goal, and so will make much progress. Therefore, assuming the first few statements, having an internally consistent belief system is desirable! Having reduced it to an epistemological problem (do people really desire progress? can actions actually accomplish things?), I now only have epistemological anarchism to deal with, which seems to work less well in practice than the scientific method, so I can ignore it.
- No matter how offended you are about something, thinking about it will still resolve the issue.