I work at Open Philanthropy, and I recently let Gavin know that Open Phil is planning to recommend a grant of $5k to Arb for the second project on your list: Overview of AI Safety in 2024 (they had already raised ~$10k by the time we came across it). Thanks for writing this post Austin — it brought the funding opportunity to our attention.
Like other commenters on Manifund, I believe this kind of overview is a valuable reference for the field, especially for newcomers.
I wanted to flag that this project would have been eligible for our RFP for work that builds capacity to address risks from transformative AI. I worry that not all potential applicants are aware of the RFP, so I’ll take this opportunity to mention that its scope is quite broad, including funding for:
More details at the link above. People might also find this page helpful, which lists all currently open application programs at Open Phil.
@Matt Putz thanks for supporting Gavin's work and letting us know; I'm very happy to hear that my post helped you find this!
I also encourage others to check out OP's RFPs. I don't know about Gavin, but I was peripherally aware of this RFP, and it wasn't obvious to me that Gavin should have considered applying, for these reasons:
I'm evidently wrong on all these points, given that OP is going to fund Gavin's project, which is great! So I'm listing them in the spirit of feedback. Some easy wins to encourage smaller projects to apply might be to update the RFP page to 1. list some example grants and grant sizes that were sourced through it, and 2. describe how much time you expect the application form to take (something EA Funds does, which I appreciate, even if I invariably take much more time than they state).
Thanks for the feedback! I’ll forward it to our team.
I think I basically agree with you that from reading the RFP page, this project doesn’t seem like a central example of the projects we’re describing (and indeed, many of the projects we do fund through this RFP are more like the examples given on the RFP page).
Some quick reactions:
We expect to make most funding decisions in 3 months or less (assuming prompt responses to any follow-up questions we may have), and we may or may not be able to accommodate requests for greater time-sensitivity. Applicants asking for over $500K should expect a decision to take the full 3 months (or more, in particularly complex cases), and apply in advance accordingly. We’ll let you know as soon as we can if we anticipate a longer than 3-month decision timeline. [emphasis in original]
Thanks for forwarding my thoughts!
I'm glad your team is equipped to do small, quick grants - from where I am on the outside, it's easy to accidentally think of Open Phil as a single funding monolith, so I'm always grateful for directional updates that help the community better orient to y'all.
I agree that 3 months seems reasonable when $500k+ is at stake! (I think, just skimming the application, I mentally rounded off "3 months or less" to "about 3 months", as a kind of learned heuristic about how orgs relate to the timelines they publish.)
As another data point, from the Survival and Flourishing Fund our turnaround (from application to decision) was about 5 months this year, for an ultimately $90k grant (we were applying for up to $1.2m). I think they were unusually slow this year due to changing over their processes; in past years it's been closer to 2-3 months.
Our own philosophy at Manifund emphasizes "moving money quickly", almost to the point of treating it as sacred. This comes from watching programs like Fast Grants and the Future Fund, and from our own lived experience as grantees. Knowing one month sooner that money is coming often means a grantee can start hiring and executing one month sooner - and the impact of executing even one day sooner can sometimes be immense (see: https://www.1daysooner.org/about/).
What do I mean by “homegrown”? These projects are:
If you’re a small donor or earn to give, consider giving to projects like these:
1. Feature-length documentary on SB 1047
By Michael Trazzi — $16k raised of $55k
Michael has already recorded interviews with the main characters of SB 1047: sponsors like Scott Wiener and Dan Hendrycks, proponents like Zvi Mowshowitz and Holly Elmore, and opponents like Dean Ball and Timothy B. Lee. Now he needs the funding to turn it into a 1-hour feature documentary. This is a rare chance to sponsor a high-quality video narrative and share it beyond our existing ecosystem. I’ve personally donated $10k towards this, and expect Michael to be able to effectively use much more.
More info & donate here: https://manifund.org/projects/finishing-the-sb-1047-documentary-in-6-weeks
2. Overview of AI Safety in 2024
By Gavin Leech — $8k raised of $17.6k
Gavin Leech is a forecaster, researcher, and founder of Arb; he’s proposing to re-run a 2023 survey of AI Safety. The landscape shifts pretty quickly, so I’d love to see what’s changed since last year.
As I was writing this, regrantor Neel Nanda funded it to the minimum $8k ask! Neel adds:
More info & donate here: https://manifund.org/projects/shallow-review-of-ai-safety-2024
3. Podcast series on Effective Altruism’s values
By Elizabeth Van Nostrand — $1.3k raised of $2.6k
Elizabeth & Timothy’s initial podcast was very well received, drawing extensive, thoughtful comments from a variety of folks. I’d be excited to see them continue this series, especially if they bring in people involved with steering the EA community (like Sarah Cheng, who has engaged extensively with their points).
More info & donate here: https://manifund.org/projects/elizabeth-and-timothy-podcast-on-values-in-effective-altruism
4. Sentinel, a foresight and emergency response team
By Nuno Sempere — $16k raised of $90k
Nuno has long been one of our community’s most outspoken forecasters; now he’s working with Rai Sur to spin up an emergency response team (think: Army Reserve Corps, but for responding to existential risks). They’re already putting out a useful weekly report on biosecurity, geopolitics and other such topics.
More info & donate here: https://manifund.org/projects/fund-sentinel-for-q4-2024
5. Research on co-occurrence of sparse autoencoder latents
By Matthew A. Clarke — $0 raised of $6.4k
TBH, I don’t know much about the merits for or against this line of research; I’m highlighting this grant because it’s overseen by Joseph Bloom, a past Manifund grantee who I and others have been very impressed with. If mechanistic interpretability is your jam, check this one out!
More info & donate here: https://manifund.org/projects/salaries-for-sae-co-occurrence-project
What else have we been up to?
It’s been a quiet couple of months, but here’s what Rachel and I have been busy with:
Taxes: Form 990 for 2023. Last year, we raised ~$3m and disbursed ~$2.6m!

Cheers,
Austin
PS: want to make a larger grant? Manifund can facilitate donations via donor-advised funds, crypto, and bank transfers.