For some reason the narrative fallacy does not seem to get as much play as the other major cognitive fallacies. Apart from discussions of "The Black Swan", I never see it mentioned anywhere. Perhaps this is because it's not considered a "real" bias, or because it's an amalgamation of several lower-level biases, or because it's difficult to study in controlled experiments. Regardless, I feel it's one of the more pernicious and damaging fallacies, and as such deserves an internet-indexable discussion.
From Taleb's "The Black Swan":
Essentially, the narrative fallacy is our tendency to turn everything we see into a story - a linear chain of cause and effect, with a beginning and an end. Obviously the real world isn't like this - events are complex and interrelated, direct causation is extremely rare, and outcomes are probabilistic. Verbally, we know this - the hard part, as always, is convincing our brains of the fact.
Our brains are engines designed to analyze the environment, pick out the important parts, and use those to extrapolate into the future. To trot out some theoretical evolutionary support, only extremely basic extrapolation would be required in the ancestral evolutionary environment. Things like [Gather Food -> Eat Food -> Sate Hunger] or [See Tiger -> Run -> Don't Die]. Being able to produce simple chains of cause and effect would confer a significant survival advantage, but you wouldn't need anything more than that. The world was simple enough that we didn't have to deal with complex interactions - linear extrapolation was "good enough". The world is much different and much more complex today, but unfortunately, we're still stuck with the same linear extrapolation hardware.
You can see the results of this 'good enough' solution in the design and function of our brain. Cognitively, it's much cheaper to interpret a group of things as a story - a pattern - than to remember each one of them separately. Simplifying, summarizing, clustering, and chaining ideas together - reducing complex data to a few key factors - lets us get away with, say, an extremely small working memory, or a relatively slow neuron firing speed. Compression of some sort is needed for our brains to function - it'd be impossible to analyze the terabytes of data we receive every second from our senses otherwise. As such, we naturally reduce everything to the simplest pattern possible, and then process the pattern. So we're much better at remembering things as part of a pattern than as a random assortment. The alphabet is first learned as a song to help it stick. Mnemonic devices improve memory by establishing easy-to-remember relationships. Our default tendency, with any information, is to establish links and patterns in it to aid in processing. This by itself isn't a problem - the essence of knowledge is drawing connections and making inferences. The problem is that because our hardware is designed to do it, it insists on finding links and patterns whether they actually exist or not. We're biologically inclined to reduce complex events to a simpler, more palatable, more easily understood pattern - a story.
This tendency can be seen in a variety of lower-level biases. For instance, the availability heuristic causes us to make predictions and inferences based on what most quickly comes to mind - what's most easily remembered. Hindsight bias causes us to interpret past events as obviously and inevitably causing future ones. Consistency bias causes us to reinterpret past events and behaviors to be consistent with new information. Confirmation bias causes us to look only for data that supports the conclusions we've already arrived at. There's also our tendency to engage in rationalization, and create post-hoc explanations for our behavior. They all have the effect of molding, shaping, and simplifying events into a kind of linear narrative, ignoring any contradiction, complexity, and general messiness.
Additionally, there's evidence that forming narratives out of the amalgamated behavior of semi-independent mental modules is one of the primary functions of consciousness. Dennett makes this argument in his paper "The Self as a Center of Narrative Gravity":
Because the brain is a hodgepodge of dirty hacks and disconnected units, smoothing over and reinterpreting their behaviors as parts of a consistent whole is necessary to have a unified 'self'. Drescher makes a somewhat related conjecture in "Good and Real", introducing the idea of consciousness as a 'Cartesian Camcorder' - a mental module which records and plays back, in a continuous stream, perceptions and outputs from other parts of the brain. It's the idea that "I am not the one who thinks my thoughts, I am the one who hears my thoughts", the source of which escapes me. Empirical support for this comes from the experiments of Benjamin Libet, which show that subconscious electrical processes precede conscious actions - implying that consciousness doesn't engage until after an action has already been decided. If this is in fact how we handle internal information - smoothing out the rough edges to provide some appearance of coherence - it shouldn't be surprising that we tend to handle external information in the same manner.
It seems, then, that creating narratives isn't so much a choice as it is a basic feature of the architecture of our minds. From the paper "The Neurology of Narrative" (JSTOR), discussing people with damage to the area of the frontal lobe which processes higher-order input:
You can see the extremes to which our tendency toward narrative can go in people who see themselves as the star or hero of a "movie about their life". These people tend to be severe narcissists (though I've heard some self-help "experts" espouse this as a healthy outlook to adopt), but it's not hard to see why such a view is so appealing. As the star of a movie, the events in your life are all extremely important, and are building to something that will inevitably occur later. You'll face difficulties, but you will ultimately overcome them, and your triumph will be all the greater for it (we seldom imagine our lives as tragedies). You'll fight and conquer your enemies. You'll win over the love interest. It's all immensely appealing to our most basic desires, whether we're narcissists or not.
A good story, then, is a superstimulus. The very structure of our minds is tilted to be vulnerable to it. It appeals to our primitive brains so strongly that it doesn't matter if it resembles the real world or not - we prefer the engaging story. We're designed to produce narratives, whether we like it or not. Fortunately, our minds also come with the ability to build new processes that can overrule the older ones. So how do we beat it? From "The Black Swan":
In other words, concentrate your probability mass. Force your beliefs to be falsifiable. Make them pay rent in anticipated experience. All the things a good rationalist should be doing already.