This seems like a good explanation of the dynamic underlying “skin in the game” considerations. If you care about literally achieving the stated goal, you should strongly prefer stories from contexts where a story’s prominence has more to do with its entanglement with reality than with marketability.
What is a story?
It seems like it's a sort of compression optimized for human brains. Some elements of a story:
It's all stories. There probably _is_ an underlying physical reality, but no humans experience it directly enough to have goals about it. I don't think your dichotomy is about reality vs stories. From your examples and descriptions, it seems to be about long-term vs short-term stories, or perhaps deep vs shallow stories.
The manager/CEO/board/investor acceptance of stories only lasts a few years. Eventually customers won't agree, and it collapses anyway. Conversely, there are plenty of examples of objectively worse products that did better in the marketplace, because the story is the only thing that matters.
Paper currency is a good example of a story with the weight of reality for a good chunk of humanity.
> I don’t think you can escape stories entirely. I would claim that as soon as you summarize your facts or data, the mere selection of which facts to present or summarize is the crafting of a story. Even dumping all your data and every observation is likely to be biased by which data you collected and what you paid attention to. What you thought were the relevant things to report to another person.
I think we can say something stronger than this: we can't escape stories at all, because stories seem to be another way of talking about ontology (maps), and we literally can't talk about anything without framing it within some ontology.
It's tempting to want to claim direct knowledge of things, even if you are an empiricist, because it would ground your observations in facts. The reality, though, seems to be that everything is mediated by sensory experience at the least (not to mention the other ways in which experience is mediated in things as complex as humans), so we are always stuck with at least the stories that our sensory organs enable (for example, your experience of pressure waves in the air as sound). I'd say it goes even deeper than that, being a fundamental consequence of information transfer via the intentional relationship between subject and object, but we probably don't need to move beyond a pragmatic level in the current discussion.
This is also why I worry, in the context of AI alignment, that Goodharting cannot be eliminated (though maybe we can mitigate it enough to not matter): representationalism (indirect realism) creates the seed of all misalignment between reality and the measurement of it, so we will always be in active effort to work against a gradient that seeks to pull us down towards divergence.
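The divergence-under-pressure dynamic can be sketched numerically (a toy model of my own, not anything from the comment): candidates have a true value, we can only observe a noisy proxy of it, and the harder we select on the proxy, the further the selected candidate tends to fall below the truly best one.

```python
import random

random.seed(0)

def goodhart_gap(n_candidates: int, noise: float = 1.0) -> float:
    """Pick the candidate with the best *measured* score; return how much
    true value was left on the table versus the truly best candidate."""
    true_values = [random.gauss(0, 1) for _ in range(n_candidates)]
    measured = [v + random.gauss(0, noise) for v in true_values]
    picked = max(range(n_candidates), key=lambda i: measured[i])
    return max(true_values) - true_values[picked]

def mean_gap(n_candidates: int, trials: int = 2000) -> float:
    return sum(goodhart_gap(n_candidates) for _ in range(trials)) / trials

# More optimization pressure (more candidates) widens the average gap,
# even though the measurement is honestly correlated with reality.
print(round(mean_gap(5), 2), round(mean_gap(500), 2))
```

Selecting on the proxy still helps, but the residual gap between "best measured" and "best real" grows with selection pressure, which is the seed of misalignment the comment points at.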
I probably didn't emphasize this enough in the main post, but the idea I'm really going for is that there is a difference between optimizing for stories and optimizing for reality. There's a difference in goal and intention. Even if humans never see "rock-bottom reality" itself and everything is mediated through experience, there is still a big difference between a) someone attempting to change an aspect of the underlying reality such that actually different things happen in the world, and b) someone attempting to change the judgments of another person by inputting the right series of bits into them.
Optimizing stories is really a mono-focus on optimizing the specific corners of reality which exist inside human heads.
> Of course, truly effective tea plus a well-conveyed story about its great properties will generate more sales than effective tea or a good story alone.
Sometimes not, I think. It's almost a measure of the efficiency / effectiveness of the given market. If the market is really good at recognizing reality, then you don't need to tell a story. (Basic software libraries are like that: do they compute the right thing? If yes, then it's good.) If the market is not good at recognizing reality, then creating stories is often far cheaper than doing the real thing. (And stories also transfer better across domains.)
I don't like working with software developers who believe that a software package is good just because it computes the right thing.
I care about many attributes of a package, from its documentation and its API to its likely future maintenance.
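To make that concrete, here's a minimal sketch (the function names and scenario are my own invention, not the commenter's): both functions below compute the same correct result, but only one is something you'd want to depend on.

```python
def f(x, m):
    # Correct, but the API communicates nothing: what are x and m?
    return [i for i in x if len(i) <= m]

def filter_by_max_length(names: list[str], max_length: int) -> list[str]:
    """Return only the names that fit within max_length characters.

    Raises ValueError on a negative max_length instead of silently
    returning an empty list.
    """
    if max_length < 0:
        raise ValueError("max_length must be non-negative")
    return [name for name in names if len(name) <= max_length]

# Identical output on valid input; very different to document, review,
# and maintain.
assert f(["ok", "toolong"], 4) == filter_by_max_length(["ok", "toolong"], 4) == ["ok"]
```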
I think it might be interesting to discuss how story analysis differs from signalling analysis, since I expect most people on Less Wrong to be extremely familiar with the latter. One difference is that people are happy to be given a story about you, even an imperfect one, so that they can slot you into a box. Another is that signalling analysis focuses on whether something makes you look good or bad, while story analysis focuses on how engaging a narrative is. Story analysis also focuses more on how cultural tropes shape perspectives, e.g. the romanticisation of bank robbers.
I feel you ignore that stories are central to personal motivation. If you keep the story of why you are doing what you are doing in life separate from yourself, you are likely to suffer from a lot of akrasia.
I don’t understand how this post makes that mistake - what’s an example of something Ruby said that’s inconsistent with this?
Personal motivation isn't in the list of "optimizing for stories", and the post assumes that you can optimize for success at your stated goal without storytelling.
The word akrasia doesn't even appear in the post, even though it's vitally important: many people in this community suffer from akrasia as a result of lacking a good story to tell themselves, one that involves them engaging in the behavior they consider desirable based on factual analysis.
Epistemic status: highly confident in the basic distinction, though it's not at all profound. Further details, models, and advice are somewhat speculative, despite being drawn from varying amounts of observation. The essay is a little rushed, since otherwise I’d be unlikely to publish it.
When setting out on a venture, one faces some choices.
Ideally, optimizing for one would be the same as optimizing for the other. Very often, however, it is not. What’s worse, the two ends compete for the same set of limited resources.
A concrete example might be someone who sets out to develop and sell a medicinal tea which relieves hangover symptoms. Overall success is selling your tea and making money. Investing in optimizing reality would mean investing in experiments to develop and improve the tea. Introducing variants, randomized controlled trials, etc., etc. Optimizing story means having a really good explanation for why your tea is able to do what you claim it does, plus having a great website with good copy, publishing testimonials, publishing your methodology and experimental results, etc.
Success can be attained by both, in combination, but also possibly each on its own. If you have a tea which works well, word will spread and you’ll end up with many customers seeking your genuinely palliative tea. Alternatively, if your marketing materials are persuasive, you might accrue many customers too, even if your tea is no better than placebo. Of course, truly effective tea plus a well-conveyed story about its great properties will generate more sales than effective tea or a good story alone.
Optimizing Reality
The following points are worth noting:
Optimizing Stories
“Story Economies”
I conjecture that what arises in our modern world are “economies” of stories whereby people buy and sell stories often without regard for reality.
Another example: imagine an analyst working at a startup crafts a report which highlights all the ways in which the company is rapidly improving. The analyst's manager isn’t too worried about whether the report is a bit biased towards the positive - they know the CEO will be pleased. The CEO doesn’t mind if the report is a bit biased towards the positive - they know the board will be pleased. The board doesn’t mind if the report is a bit biased - they know that the next round of investors won’t really be able to tell the difference, it will just make the company look good.
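The chain above can be sketched numerically (the numbers are mine and purely illustrative): each layer adds only a small positive spin, so no single handoff is a big lie, yet the picture at the end has drifted well away from reality.

```python
reality = 0.0              # the company's true state, on some arbitrary scale
report = reality
for layer in ["analyst", "manager", "CEO", "board"]:
    report += 0.5          # each handoff is only "a bit biased towards the positive"
    print(f"{layer:>8}: report={report:+.1f}  (reality={reality:+.1f})")

# After four small nudges, the investor sees +2.0 against a reality of 0.0.
drift = report - reality
```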
Here you have a whole chain of people who only care about the story. At the very end there’s someone who cares about the reality, but they’re very often not in a great position to evaluate it themselves. They probably don’t even know the right questions to ask. All they’ve got is the story which has been placed before them.
Some people gravitate more towards stories than others, e.g. salespeople and politicians. Some of them might readily admit that they chiefly deal in stories somewhat tenuously linked to reality, yet I wager that many, if not most, won’t. The most persuasive stories are those you devoutly believe yourself. Hence the vast overconfidence of startup founders. And, in the immortal words of George Costanza: it's not a lie... if you believe it.
Stories about yourself... to yourself and others
If there’s one domain where we’re endlessly crafting and broadcasting stories, it’s the stories we tell about ourselves. I’m this kind of person. I might decide that the story I want to tell is that I’m a "science nerd". So I read science books and science magazines. I have my answer ready when people ask me what I do for fun and I know exactly what to post on social media. My Instagram is full of homemade volcanoes and photos from my personal backyard telescope.
The above fictional example might have the redeeming feature that at least this fictional person is creating a genuine reality to match the story. They are learning a tonne of science genre facts. Still, I wager there’s a tradeoff. Doing science-y things which are easily communicable and demonstrable introduces a constraint. Possibly leveling up as a scientist would mean reading textbooks with facts that are incomprehensible and boring to those not at that level. By trying to have the best story to tell, they’ve handicapped their own excellence. (However, if the story is primarily for oneself, this constraint is avoided. “I know just how science-y I am!”)
If you want to know, I tell myself the story that I’m a person who’s afraid of losing myself to trying to craft myself into someone optimized for impressing others. Though I do it. I’m doing it right now. There may be no escape.
Cf. Elephant in the Brain.
You can’t escape stories
At this point you might be thinking, “gee, stories are awfully deceitful and non-cooperative; I want to be cooperative and honest, and I’m just going to provide direct facts!” and “I really, really don’t want to deceive myself with stories!”
I don’t think you can escape stories entirely. I would claim that as soon as you summarize your facts or data, the mere selection of which facts to present or summarize is the crafting of a story. Even dumping all your data and every observation is likely to be biased by which data you collected and what you paid attention to. What you thought were the relevant things to report to another person.
That said, I think there is storytelling which makes an honest effort to share reality as it is, so that someone else can make an informed judgment. It’s challenging if one’s success is threatened by less scrupulous competitors, but it’s possible to choose domains where measurable feedback favors those who’ve optimized actual reality.
You’re not always doing others a favor if you try to give them raw facts with no biased conclusions. The world is large and messy and confusing, such that people usually like to be handed a story about who you are and how you will behave. They want you to be a nerdy, bookish type, or an outdoorsy type, or a foodie. If you give me a story and promise to act in accordance with it, that makes things simple. It’s clear what to talk about, what to get you for your birthday, etc., etc.
At least for those spending much time out in mainstream culture, it helps to have one or two stories prepared about yourself. “Masks.” They function a bit like APIs, really. People often protest that they don’t like being put in boxes, but those boxes help you relate to people before you’ve spent the many, many hours it takes to absorb the messy reality that any given human is.
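The API analogy can be pushed a little further in code (a playful sketch; all the class names and values here are invented for illustration): a "mask" is a small, stable interface over a person's messy internals, and others interact with the interface rather than the internals.

```python
class Person:
    """The messy underlying reality: far more state than anyone absorbs
    in casual interaction."""

    def __init__(self) -> None:
        self.interests = ["telescopes", "baking", "tax law", "birdsong"]
        self.current_mood = "ambivalent"

class ScienceNerdMask:
    """A story-as-API: a tiny, predictable surface over a Person.

    Others can plan around it (conversation topics, birthday gifts)
    without the many hours needed to learn the Person underneath.
    """

    def __init__(self, person: Person) -> None:
        self._person = person  # internals hidden behind the interface

    def hobby(self) -> str:
        return "backyard astronomy"      # always on-story

    def gift_idea(self) -> str:
        return "a popular science book"  # easy for others to act on

mask = ScienceNerdMask(Person())
print(mask.hobby(), "/", mask.gift_idea())
```

The design choice mirrors the post's point: callers get a stable, simple contract, at the cost that the interface may diverge from what's actually behind it.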
What to do, what to do
Reality on the ground is complex, incentives are messy, things which work in the short run don’t necessarily work in the long run. I can't say “here’s my one simple recipe to determine the right allocation of resources between optimizing story vs optimizing reality.”
I proffer the obvious advice: