Series: How to Purchase AI Risk Reduction
Here is yet another way to purchase AI risk reduction...
Much of the work needed for Friendly AI and improved algorithmic decision theories requires researchers to invent new math. That's why the Singularity Institute's recruiting efforts have been aimed at talent in math and computer science. Specifically, we're looking for young talent in math and compsci, because young talent is (1) more open to considering radical ideas like AI risk, (2) not yet entrenched in careers and status games, and (3) better at inventing new math (mathematical ability tends to decline with age).
So how can the Singularity Institute reach out to young math/compsci talent? Perhaps surprisingly, Harry Potter and the Methods of Rationality is one of the best tools we have for this. It is read by a surprisingly large proportion of people in math and CS departments. Here are some other projects we have in the works:
- Run SPARC, a summer program on rationality for high school students with exceptional math ability. Cost: roughly $30,000. (There won't be classes on x-risk at SPARC, but it will attract young talent toward effective altruism in general.)
- Print copies of the first few chapters of HPMoR cheaply in Taiwan, ship them here, distribute them to leading math and compsci departments. Cost estimate in progress.
- Send copies of Global Catastrophic Risks to lists of bright young students. Cost estimate in progress.
Here are some things we could be doing if we had sufficient funding:
- Sponsor and be present at events where young math/compsci talent gathers, e.g. TopCoder High School and the International Math Olympiad. Cost estimate in progress.
- Cultivate a network of x-risk reducers with high mathematical ability, build a database of conversations for them to have with strategically important young math/compsci talent, schedule those conversations and develop a pipeline so that interested prospects have a "next person" to talk to. Cost estimate in progress.
- Write Open Problems in Friendly AI and send it to interested parties, so that even those who don't think AI risk is important will at least think, "Ooh, look at these sexy, interesting problems I could work on!"
That's cool and a good intro, but you could also put together a list of weaker suggestions more than ten times that size, to show people what sorts of advanced mathematics (and so on) might or might not end up being relevant. For example: a summary paper from the literature on abstract machines, or even very young, developing subfields such as quantum algorithmic information theory, which teach relevant cognitive-mathematical skills even if they're not quite fundamental to decision theory. This would also be a sly way to interest people from diverse advanced disciplines. Is opportunity cost the reason such a list doesn't exist? My apologies if this question misses the point of the discussion, and I'm sorry it's only somewhat related to the post, which is itself an important topic.