Less Wrong is a community blog devoted to refining the art of human rationality.

Comment author: Squark 25 April 2014 08:12:04PM 4 points [-]

This is an interesting technique for explaining one's "map": upvoted. However, I think one should still be careful not to update on fictional evidence (more precisely, to update on the fact that this is the map the given person constructed, rather than on the assumption that this is the territory).

Comment author: mszegedy 26 April 2014 08:57:22PM 0 points [-]

Right, that's true. In the particular case of The Things They Carried, I'd trust O'Brien moderately well to depict what the Vietnam War was like, since he participated in it.

Comment author: MarkL 25 April 2014 09:51:49PM 3 points [-]

As encouragement to OP, I haven't read The Things They Carried either, but OP totally makes sense, and it's interesting and helpful, and I'm glad ze posted it. (... But now I realize OP has been edited before I got to it, so maybe parent applied more beforehand. :-)

Comment author: mszegedy 26 April 2014 01:33:29AM *  0 points [-]

Before I edited it, it was like the current one with the second paragraph removed, the last two sentences of the third paragraph removed, and the third and fourth paragraph combined into one, roughly. I'm glad gwern posted his comment, though, because I think the post is much better now.

Comment author: Gunnar_Zarncke 25 April 2014 06:41:26AM 1 point [-]

"You can tell a true war story if you just keep on telling it." He always says war story, but the principle generalizes.

I have read about half of that PDF and I see what it is getting at. The part of a story that describes something true from real life, that aims not to entertain but to educate about some truth of life, has to follow this pattern. That is the generalization.

Compare with e.g. HPMoR: The fictional world has a plot and may have a moral. But the truth in it lies in true concepts, and those are represented in a way fairly comparable to the 'war stories'. Explanations of scientific methods are just that: incomplete explanations. And, for example, the guessing and double-guessing between Harry and Quirrell never clearly resolves. That is a truth of real-life strategy. I hope it carries over to the last arc.

Comment author: mszegedy 25 April 2014 05:13:39PM 2 points [-]

Are you sure you understood the point? I am highlighting a writing technique where you write the same short story over and over again slightly differently to convey a probabilistic model to the reader in a way that is interesting. HPMoR is not quite this; it's a different story every time, with a different lesson every time, that is treated as a sequence of events.

Comment author: Punoxysm 25 April 2014 01:36:23AM 1 point [-]

I'd say it was pretty unclear. There are many short story collections; most don't tell and retell the same story. Is he doing this literally, or just metaphorically (e.g. soldiers experience battle many times, and each time is similar but different)? And who is the "storyteller"? Is it told through a framing device?

Comment author: mszegedy 25 April 2014 01:49:49AM *  0 points [-]

He literally tells the same story over and over again, differently every time. He has several stories that he does this to. The book is a fictional autobiography; O'Brien was in the Vietnam War, and writes as though he were recollecting stories from the Vietnam War, but the stories are all made up. Here, I found an excerpt that illustrates the principle in a somewhat okay manner.

EDIT: Here, this is better (PDF warning).

Comment author: gwern 24 April 2014 11:26:48PM 2 points [-]

I'm afraid I've never read The Things They Carried, nor indeed have any idea what it is, so I didn't find this very helpful.

Comment author: mszegedy 25 April 2014 01:08:39AM 1 point [-]

If you want, read it. Hopefully, though, the principle that I was highlighting was clear, wasn't it? While fiction with a probability distribution given for each sequence of events is boring, fiction with many short stories describing the different possible scenarios is interesting, and gives the same probabilistic model.

Should I give examples of how O'Brien does it? I don't know how much I can type out without violating copyright law.

How Tim O'Brien gets around the logical fallacy of generalization from fictional evidence

9 mszegedy 24 April 2014 09:41PM

It took me until my third reading of The Things They Carried to realize that it contained something very valuable to rationalists. In "The Logical Fallacy of Generalization from Fictional Evidence," EY explains how using fiction as evidence is bad not only because it's deliberately wrong in particular ways to make it more interesting, but more importantly because it does not provide a probabilistic model of what happened, and gives at best a bit or two of evidence that looks like a hundred or more bits of evidence.

Some background: The Things They Carried is a book by Tim O'Brien that reads as an autobiography in which he recollects various stories from being a soldier in the Vietnam War. However, O'Brien often repeats himself, writing the same story over again, but with details or entire events changed. It is actually a fictional autobiography; O'Brien was in the Vietnam War, but all the stories are fictional.

In The Things They Carried, Tim O'Brien not only explains how generalization from fictional evidence is bad, but also has his own solution to the problem that actually works, i.e. gives the reader a useful probabilistic model of what happened in such a way that actually interests the reader. He does this by telling his stories many times, changing significant things about them. Literally; he contradicts himself, writing out the same story but with things changed. The best illustration of the principle in the book is the chapter "How to Tell a True War Story," found here (PDF warning, and bad typesetting warning).

Readers are not inclined to read a list of probabilities, but they are inclined to read a bunch of short stories. He talks about this practice a lot in the book itself, writing, "All you can do is tell it one more time, patiently, adding and subtracting, making up a few things to get at the real truth. … You can tell a true war story if you just keep on telling it." He always says war story, but the principle generalizes. At one point, he has a character represent the forces that act on conventional writing, telling a storyteller that he cannot say that he doesn't know what happened, and that he cannot insert any analysis.

O'Brien also writes about a lot of other things I don't want to mention more than briefly here, such as the specific ways in which the model that conventional war stories give of war is wrong, and specific ways in which the audience misinterprets stories. I recommend the book very much, especially if you think writing "tell multiple short stories" fiction is a great idea and want to do it.

I apologize if this post has been made before.

EDIT: Tried to clarify the idea better. I added an example with an excerpt.

EDIT 2: Added a better excerpt.

EDIT 3: Added a paragraph about background.

Comment author: mszegedy 08 April 2014 06:37:28AM *  1 point [-]

I've found that going by significant digits helps.

"If I represented the date that Einstein came to the US with only one significant digit of precision, what would it be? Definitely 2000. What about two? Definitely 1900. What about three? Probably 1900 again; I'm willing to take that bet. But four digits of precision? I'm not sure at all. I'll leave it as 1900."

The answer came out way off, but hopefully it prevented any anchoring, and it also accurately represents my knowledge of Einstein (namely, I know which properties of physics he discovered, and I know that he wrote his most important papers in the earlier half of the 190Xs, which must have also been when he came to the US). In hindsight, I probably should have taken historical context into account (why would Einstein leave for the US in the first place? If I had considered this, my guess would probably have ended up as 1910 or 1920), but that's either hindsight bias or a lesson to be learned.

An improvement to this method might be that I explicitly consider the range of numbers that would make it come out as a significant digit (if the three-significant-digit number is 1900, then he came between 1895 and 1904; does that sound more plausible than him coming sometime between 1905 and 1914?). But this might just make the anchoring effect worse, or introduce some other bias.
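The rounding step in this elicitation can be sketched in code. This is a minimal illustration, not anything from the comment itself: `round_to_sig_digits` is a hypothetical helper name, and 1933, the year Einstein actually moved to the US, stands in for the true answer.

```python
from math import floor, log10

def round_to_sig_digits(year: int, digits: int) -> int:
    """Round a year to the given number of significant digits."""
    magnitude = floor(log10(abs(year)))      # 3 for four-digit years
    factor = 10 ** (magnitude - digits + 1)  # place value of the last kept digit
    return round(year / factor) * factor

# Walk the estimate up through increasing precision, as in the comment:
for d in range(1, 5):
    print(d, round_to_sig_digits(1933, d))
# 1 → 2000, 2 → 1900, 3 → 1930, 4 → 1933
```

At each precision level you ask yourself whether you would bet on the rounded value, and stop refining at the first level where you wouldn't.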

Comment author: Eugine_Nier 07 February 2013 04:11:27AM 2 points [-]

What do you mean by "positive feelings"? For example, would you support wireheading everyone?

Comment author: mszegedy 07 February 2013 04:50:49AM *  0 points [-]

That's exactly what I can't make my mind up about, and forces me to default to nihilism on things like that. Maybe it really is irrelevant where the pleasure comes from? If we did wirehead everyone for eternity, then would it be sad if everyone spontaneously disappeared at some point? Those are questions that I can't answer. My morality is only good for today's society, not tomorrow's. I guess strictly morally, yes, wireheading is a solution, but philosophically, there are arguments to be made against it. (Not from a nihilistic point of view, though, which I am not comfortable with. I guess, philosophically, I can adopt two axioms: "Life requires meaning," and "meaning must be created." And then arises the question, "What is meaning?", at which point I leave it to people with real degrees in philosophy. If you asked me, I'd try to relate it to the entropy of the universe somehow. But I feel that I'm really out of my depth at that point.)

Comment author: V_V 06 February 2013 07:19:41PM 11 points [-]

And that did it. For the rest of the day, I wreaked physical havoc, and emotionally alienated everyone I interacted with. I even seriously contemplated suicide. I wasn't angry at my friend in particular for having said that. For the first time, I was angry at an idea: that belief systems about certain things should not be internally consistent, should not follow logical rules.

This emotional reaction seems abnormal. Seriously, somebody says something confusing and you contemplate suicide?
What are you, a Straw Vulcan computer that can be disabled with a Logic Bomb?

Unless you are making this up, I suggest you consider seeking professional help.

It was extremely difficult to construct an argument against, because all of my arguments had logically consistent bases, and were thus invalid in its face.

Actually, it's rather easy: just tell them that ex falso quodlibet.

Comment author: mszegedy 07 February 2013 02:43:56AM 1 point [-]

True, I swear! I think I can summarize why I was so distraught: external factors; this was a trusted friend, also one of my only friends; and I was offended by related things they had said before. I am seeking help, though.

Comment author: shminux 06 February 2013 05:15:13PM 10 points [-]

I suspect that what frustrated you is not noticing your own confusion. You clearly had a case of lost purposes: "applying a math thing to social justice" is instrumental, not terminal. You discovered a belief "applying math is always a good thing" which is not obviously connected to your terminal goal "social justice is a good thing".

You are rationalizing your belief about applying math in your point 2:

An inconsistent belief system will generate actions that are oriented towards non-constant goals, and interfere destructively with each other, and not make much progress. A consistent belief system will generate many actions oriented towards the same goal, and so will make much progress.

How do you know that? Seems like an argument you have invented on the spot to justify your entrenched position. Your point 3 confirms it:

No matter how offended you are about something, thinking about it will still resolve the issue.

In other words, you resolved your cognitive dissonance by believing the argument you invented, without any updating.

If you feel like thinking about the issue some more, consider connecting your floating belief "math is good" to something grounded, like The Useful Idea of Truth:

True beliefs are more likely than false beliefs to make correct experimental predictions, so if we increase our credence in hypotheses that make correct experimental predictions, our model of reality should become incrementally more true over time.

This is reasonably uncontroversial, so the next step would be to ponder whether in order to be better at this social justice thing one has to be better at modeling reality. If so, you can proceed to the argument that a consistent model is better than an inconsistent one at this task. This may appear self-evident to you, but not necessarily to your "socially progressive" friend. Can you make a convincing case for it? What if s/he comes up with examples where someone following an inconsistent model (like, say, Mother Teresa) contributes more to social justice than those who study the issue for a living? Would you accept their evidence as a falsification of your meta-model "logical consistency is essential"? If not, why not?

Comment author: mszegedy 07 February 2013 02:36:36AM *  2 points [-]

You're completely right. I tried, at first, to look for ways that it could be a true statement that "some areas shouldn't have consistent belief systems attached", but that made me upset or something (wtf, me?), so I abandoned that, and resolved to attack the argument, and accept it if I couldn't find a fault with it. And that's clearly bad practice for a self-proclaimed rationalist! I'm ashamed. Well, I can sort of make the excuse of having experienced emotions, which made me forget my principles, but that's definitely not good enough.

I will be more careful next time.

EDIT: Actually, I'm not sure it's so cut-and-dried. I'll admit that I ended up rationalizing, but it's not as simple as "didn't notice confusion"; I definitely did notice it. When I am presented with an opposing argument, what I'll do is try to figure out at what points it contradicts my own beliefs. Then I'll see whether those beliefs are well-founded. If they aren't, I'll throw them out and attempt to form new ones, adopting the foreign argument in the process. If I find that the beliefs it contradicts are well-founded, then I'll say the argument is wrong because it contradicts those particular beliefs of mine. Then I'll go back to the other person and tell them where it contradicts my beliefs, and this repeats until one of us can't justify our beliefs, or we find that we have contradictory basic assumptions. That is what I did here, too; I just failed to examine my beliefs closely enough, and ended up rationalizing as a result. Is this the wrong way to go about things? There's of course a lot to be said about actual beliefs about reality in terms of prior probability and such, so that can also be taken into account where it applies. But this was a mostly abstract argument, so that didn't apply, until I introduced an epistemological argument instead. So is my whole process flawed? Or did I just misstep?
