This is a linkpost for https://grants.futureoflife.org/

Epistemic status: I am helping administer the fellowships described here.

Edit 2021-10-04: Modified to reflect changed eligibility+stipend conditions.

The Future of Life Institute is launching new PhD and postdoctoral fellowships to study AI existential safety. This covers research that analyzes the most probable ways in which AI technology could cause an existential catastrophe, and which types of research could minimize existential risk, as well as technical research which could, if successful, assist humanity in reducing to extremely low levels the existential risk posed by highly impactful AI technology.

The Vitalik Buterin PhD Fellowship in AI Existential Safety is targeted at students applying to start their PhD in 2022, or existing PhD students who would not otherwise have funding to work on AI existential safety research. Quoting from the page:

At universities in the US, UK, or Canada, annual funding will cover tuition, fees, and the stipend of the student's PhD program up to $40,000, as well as a fund of $10,000 that can be used for research-related expenses such as travel and computing. At universities not in the US, UK or Canada, the stipend amount will be adjusted to match local conditions. Fellows will also be invited to workshops where they will be able to interact with other researchers in the field.

In addition, applicants who are short-listed for the Fellowship will be reimbursed for application fees for up to 5 PhD programs, and will be invited to an information session about research groups that can serve as good homes for AI existential safety research.

Applications for the PhD fellowship close on October 29th.

The Vitalik Buterin Postdoctoral Fellowship in AI Existential Safety is for postdoctoral appointments starting in fall 2022. Quoting from the page:

For host institutions in the US, UK, or Canada, the Fellowship includes an annual $80,000 stipend and a fund of up to $10,000 that can be used for research-related expenses such as travel and computing. At universities not in the US, UK or Canada, the fellowship amount will be adjusted to match local conditions.

Applications for the postdoctoral fellowship close on November 5th.

You can apply at grants.futureoflife.org, and if you know people who may be good fits, please help spread the word!

Comments

Are astronomical suffering risks (s-risk) considered a subset of existential risks (x-risk) because they "drastically curtail humanity’s potential"? Or is this concern not taken into account for this research program?

I suppose that the types of s-risk that do drastically curtail humanity's potential would count, but s-risks that don't (e.g. humanity decides to suffer massively, but still retains the potential to do lots of other things) would not.