You can't not think in terms of stories; that is simply how our minds work. All you can do is try to keep that (in the form of "intuition") from preventing the adequate weighing of statistics, probabilities, and explicit evidence that can't easily be fit into narratives.
Added: Even when thinking with images or kinesthetically, a person can only use the images or feelings as isolated "facts" or as part of a consistent sequence, which has all the same problems as verbal stories.
I wonder how many people here besides me lost their appetite for consuming monomyth-structured stories after their naturalistic awakening?
After my naturalistic awakening, I went on a journey, overcame an almost insuperable obstacle and then returned, having achieved a worthy reward.
Seriously, though - what makes you think you've lost your appetite for consuming monomyth-structured stories?
Some things come to mind: Nassim Taleb criticizes this habit of storytelling throughout his books. From the field of biases: scenario thinking (which is a form of mental storytelling). One reason for the planning fallacy is that a plan is essentially a good story we tell ourselves and others, while we neglect all the details that could mess it up.
As a counterpoint, see Dennett's idea of "The Self as a Center of Narrative Gravity" - narrative as an integral part of consciousness.
Consider the normative models against which we evaluate "biased" vs "unbiased" decisions, for instance expected utility. To even begin to apply such a model you'll need to have identified some set of decisions among which you are to choose - should I or shouldn't I eat this ice cream, drink this whiskey, turn down this job, whatever - and relevant consequences which vary in their utility: fit vs...
It's the people who realize they don't know anything at all that end up doing pretty well.
Sounds like a story to me...
I know, 'tis pretty old, but a remark: how about the upsides of stories? I mean... a) we apparently are in a way programmed to find/make up stories, because they help us make sense of the world. Isn't it good to break complicated stuff down into simpler stories, then tell those stories and make the audience want to hear more (or find out more themselves)? b) They stick. If I want to remember something, I make it into a story or try to find its internal story (or I stupidly repeat it over and over again if I really don't get it).
Don't get me wrong - I a...
I got as far as "some things actually are good versus evil, we all know this, right?" at 4:00, and lost all respect for the man. I didn't watch the rest.
Other than how we treat them, what's the difference between a story and a theory or hypothesis?
Edit: I'm guessing from the downvote that I may've been misunderstood. The above question is not rhetorical; it's intended to spark conversation.
Interesting post, but not terribly useful at first glance. It started with what sounded like a good description of how I work, diverged from how I do things at "But we are happy using the word "good" for all of them, and it doesn't feel like we're using the same word in several different ways, the way it does when we use "right" to mean both "correct" and "opposite of left"," and wound up offering a different (though useful for dealing with others) solution to the problem than the very personally efficient one that I've been using for a few years now. I do actually feel the difference between the different meanings of 'good' (I haven't cataloged them - I don't see any personal usefulness in doing so; note that I don't think in words in general - but I estimate at least half a dozen common meanings and several rarer ones), but that's somewhat beside the point.
My fix for the presented problem involves the following heuristic: The farther from neutral my general opinion of a class of things is, the more likely it is to be incorrect in any given case. Generally, a generalized strong positive or negative opinion is a sign that I'm underinformed in some way - I've been getting biased information, or I haven't noticed a type of situation where that kind of thing has a different effect than the one I'm aware of, or I haven't noticed that it's related to another class of things in some important way. The heuristic doesn't disallow strong positive or negative generalized opinions altogether, but it does enforce a higher standard of proof on more extreme ones, and leads me to explore the aspects of things that are counter to my existing opinion in an attempt to reach a more neutral (and complex, which is the real goal) opinion of them. It still allows strong contextual reactions, too, which I haven't yet seen a problem with, and which do appear to be generally useful.
Regarding the concepts of good (in the 'opposite of evil' sense) and evil, my apparent non-neutrality is personal (which is a kind of persistent context) - they're more harmful than helpful in achieving the kinds of goals that I tend to be most interested in, like gaining a comprehensive understanding of real-world conflicts or coming to appropriately-supported useful conclusions about moral questions, and while they seem to be more helpful than harmful in the pursuit of other goals, like manipulating people (which I am neutral on, to a degree that most people I know find disturbing) and creating coherent communities of irrational people, I personally don't consider those things relevant enough to sway my opinion. Disregarding the personal aspects, I think I have a near-neutral opinion of the existence of the concepts, but it's hard to tell; I haven't spent much time thinking about the issue on that scale.
Edit: And I believed that this group had similar-enough interests to generate the same kind of 'personal' context. I may have been wrong, but I thought that they were generally more harmful than helpful in solving the kinds of problems that are considered important here and by the kinds of individuals who participate here. Otherwise, I wouldn't've mentioned the issue at all, like I usually don't.
My reaction in the original comment was contextual, in both the personal sense and with regards to the type of presentation it was, which follows a very different set of heuristics than the ones I use to regulate general opinions, and allows strong reactions much more easily, but limits the effects of those reactions to the context at hand - perhaps in a much stricter way than you (plural) are assuming. I haven't taken the time to note the presenter's name (and I'm moderately faceblind and not good at remembering people by their voices), so even another presentation by the same person on the same topic will be completely unaffected by my reaction to this presentation.
Tyler Cowen argues in a TED talk (~15 min) that stories pervade our mental lives. He thinks they are a major source of cognitive biases and, on the margin, we should be more suspicious of them - especially simple stories. Here's an interesting quote about the meta-level: