It'd be great if y'all could add a regrantor from the Cooperative AI Foundation / FOCAL / CLR / Encultured region of the research/threat-model space. (Epistemic status: conflict of interest, since if you do this I could make a more obvious argument for a project of mine.)
I'm generally interested in having a diverse range of regrantors; if you'd like to suggest names or make intros (either here or privately), please let me know!
How do you feel about people listing projects that are finished but were never funded? I think impact certificates/retroactive grants are better for epistemics, at least for the kind of work I do, and it would be great to have a place for those.
I can't speak for other regrantors, but I'm personally very sympathetic to retroactive grants for impactful work that got less funding than was warranted; we have one example for Vipul Naik's Donations List Website and hope to publish more examples soon!
Manifund is launching a new regranting program! We will allocate ~$2 million over the next six months based on the recommendations of our regrantors. Grantees can apply for funding through our site; we’re also looking for additional regrantors and donors to join.
What is regranting?
Regranting is a funding model where a donor delegates grantmaking budgets to individuals known as “regrantors”, who are then empowered to make grant decisions in line with the original donor's objectives.
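For readers who want the mechanics spelled out, here's a minimal sketch of that delegation flow in Python. The class and field names are purely illustrative assumptions, not Manifund's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class Grant:
    project: str
    amount: float

@dataclass
class Regrantor:
    name: str
    budget: float  # delegated by the original donor
    grants: list[Grant] = field(default_factory=list)

    def regrant(self, project: str, amount: float) -> None:
        """Allocate part of the delegated budget to a project."""
        if amount > self.budget:
            raise ValueError("grant exceeds remaining delegated budget")
        self.budget -= amount
        self.grants.append(Grant(project, amount))

# A donor delegates $50k; the regrantor then makes independent grant decisions.
r = Regrantor(name="example-regrantor", budget=50_000)
r.regrant("AI safety workshop", 20_000)
print(r.budget)  # 30000.0 left to allocate
```

The key property of the model is visible in the sketch: the donor fixes the total budget once, and all subsequent allocation decisions are made by the regrantor.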
This model was pioneered by the FTX Future Fund; in a 2022 retrospective, they judged regranting to be very promising for finding new projects and people to fund. More recently, Will MacAskill cited regranting as one way to diversify EA funding.
What is Manifund?
Manifund is the charitable arm of Manifold Markets. Some of our past work:
How does regranting on Manifund work?
Our website makes the process simple, transparent, and fast:
Differences from the Future Fund’s regranting program
Round 1: Longtermist Regrants
We’re launching with a cohort of 14 regrantors, each given a budget of $50k-$400k to direct to the projects they believe will be most impactful. We chose regrantors who are aligned with our values and prioritize mitigating global catastrophic risks, though ultimately they can choose to fund projects in any cause area.
This round is backed by an anonymous donor’s contribution of $1.5 million, plus smaller grants from EA funders. Round 1 will end after this initial pool is spent, or after six months have passed.
Get involved with Manifund Regrants
For grantees: list your project on our site
If you are working on a longtermist project and looking for funding, you can post the details on our site here. Examples of projects we’ve funded:
We’re interested in proposals across AI safety, AI governance, forecasting, biorisk, and EA meta; we expect to be best positioned to fund individuals and orgs seeking $1k-$200k.
For regrantors: apply for your own regrant budget
We’re accepting applications from people who want to join as regrantors! In some cases, we'll offer to sponsor regrantors and provide budgets; in others, we'll simply list regrantors so they can receive donations from other users, which they can then regrant.
For large donors: designate your own regrantors
We’re interested in anyone who would like to direct $100k+ this year through a regranting program. If that is you, reach out to austin@manifund.org or book a call!
Why might you choose to donate via a regranting program?
For everyone: talk to us!
We welcome feedback of all kinds. Whether you’re a potential grantee, regrantor, or donor, we’d love to hear about your pain points with existing funding systems, and what kinds of projects you find exciting. Hop in our Discord and come chat with us, or comment on specific projects through our site!