I am currently writing fiction that features protagonists who are EAs.
This seems at least related to the EA Infrastructure Fund's goal of presenting EA principles and exposing more people to them.
I think receiving a grant would make me more likely to aggressively pursue options to professionally edit, publish, and publicize the work. That feels kind of selfish and makes me self-conscious, but it also wouldn't require a very large grant. It's hard for me to untangle my feelings about this from the actual public good, so I'm asking here first.
Does this sound like a good use of a grant?
I am reasonably excited about fiction (and am on the Long-Term Future Fund). I have written previously about my thoughts on fiction here:
The track record of fiction
In a general sense, I think that fiction has a pretty strong track record of both being successful at conveying important ideas, and being a good attractor of talent and other resources. I also think that good fiction is often necessary to establish shared norms and shared language.
Here are some examples of communities and institutions that I think used fiction very centrally in their function. Note that after the first example, I am making no claim that the effect was good, I’m just establishing the magnitude of the potential effect size.
- Harry Potter and the Methods of Rationality (HPMOR) was instrumental in the growth and development of both the EA and Rationality communities. It is very likely the single most important recruitment mechanism for productive AI alignment researchers, and has also drawn many other people to work on the broader aims of the EA and Rationality communities.
- Fiction was a core part of the strategy of the neoliberal movement; fiction writers were among the groups referred to by Hayek as "secondhand dealers in ideas." An example of someone whose fiction played a large role in both the rise of neoliberalism and its eventual spread would be Ayn Rand.
- Almost every major religion, culture and nation-state is built on shared myths and stories, usually fictional (though the stories are often held to be true by the groups in question, making this data point a bit more confusing).
- Francis Bacon’s (unfinished) utopian novel “The New Atlantis” is often cited as the primary inspiration for the founding of the Royal Society, which may have been the single institution with the greatest influence on the progress of the scientific revolution.
On a more conceptual level, I think fiction tends to be particularly good at achieving the following aims (compared to non-fiction writing):
- Teaching low-level cognitive patterns by displaying characters that follow those patterns, allowing the reader to learn from very concrete examples set in a fictional world. (Compare Aesop’s Fables to some nonfiction book of moral precepts — it can be much easier to remember good habits when we attach them to characters.)
- Establishing norms, by having stories that display the consequences of not following certain norms, and the rewards of following them in the right way.
- Establishing a common language, by not only explaining concepts, but also showing concepts as they are used, and how they come up in conversational context.
- Establishing common goals, by creating concrete utopian visions of possible futures that motivate people to work towards them together.
- Reaching a broader audience, since we naturally find stories more exciting than abstract descriptions of concepts.
(I wrote in more detail about how this works for HPMOR in the last grant round.)
I've got some partial outlines for what I think are interesting sci-fi stories, which I've wanted to pay to have ghostwritten or turned into a short film. Is this the right place for that?
Maybe, but it really depends on whether you have a good track record, or whether there is some other reason it seems like a good idea to fund from an altruistic perspective.
I largely agree with Habryka's perspective. I personally (not speaking on behalf of the EA Infrastructure Fund) would be particularly interested in such a grant if you had a track record of successful writing, as this would make it more likely you'd actually reach a large audience. E.g., Eliezer did not just write HPMoR but was a successful blogger on Overcoming Bias and wrote the sequences.
I expect EA Funds – and the Long-Term Future Fund in particular – to be of interest to people on LessWrong, so I'm crossposting my EA Forum post with the excerpts that seem most relevant:
Summary
Recent updates
(…)
Long-Term Future Fund
The Long-Term Future Fund aims to positively influence the long-term trajectory of civilization, primarily via making grants that contribute to the mitigation of global catastrophic risks. Historically, we’ve funded a variety of longtermist projects, including:
See our previous grants here. Most of our grants are reported publicly, but we also give applicants the option to receive an anonymous grant, or to be referred to a private donor.
The fund has an intentionally broad remit that encompasses a wide range of potential projects. We strongly encourage anyone who thinks they could use money to benefit the long-term future to apply.
(…)
What types of grants can we fund?
For grants to individuals, all of our funds can likely make the following types of grants:
We can refer applications for for-profit projects (e.g., seed funding for start-ups) to EA-aligned investors. If you are a for-profit, simply apply through the standard application form and indicate your for-profit status in the application.
For legal reasons, we will likely not be able to make the following types of grants:
Please err on the side of applying, as it is likely we will be able to make something work if the fund managers are excited about the project. We look forward to hearing from you.
Apply here.