People I've talked to or worked with sometimes ask me where they should go for financial support for their research. I haven't developed a standard list of answers to this question. It seems like there are a lot of new orgs recently, and I'm losing track!

If you are looking for such applicants or know someone who is looking, consider replying as an answer (or sending me a PM if that makes more sense for whatever reason).

Answers

mruwnik

There is a Stampy answer to this that should stay up to date here.

LawrenceC

The main funders are LTFF, SFF/Lightspeed/other S-process stuff from Jaan Tallinn, and Open Phil. LTFF is the main one that solicits independent researcher grant applications.

There are a lot of orgs. Off the top of my head, there are Anthropic/OpenAI/GDM as the scaling labs with decent-sized alignment teams, and then a bunch of smaller/independent orgs:

  • Alignment Research Center
  • Apollo Research
  • CAIS
  • CLR
  • Conjecture
  • FAR
  • Orthogonal
  • Redwood Research

And there's always academia.

(I'm sure I'm missing a few though!)

(EDIT: added in RR and CLR)

Redwood Research?

LawrenceC
I don't think they're hiring, but added. 

Center on Long-term Risk (CLR)

In France, EffiSciences is looking for new members and interns.

Chipmonk

40

Very surprised that you don't have a regranting budget! I don't know which funder I would expect to do that, but I would've expected this to be more common.

I guess Jaan Tallinn does this, and Manifund does too. Hmm.

stavros

Depending on the kind of support they're looking for, https://ceealar.org could be an option. At any one time there are a handful of people staying there working independently on AI safety stuff.