Today, the Center for AI Safety released the AI Extinction Statement, a one-sentence statement jointly signed by a historic coalition of AI experts, professors, and tech leaders.
Geoffrey Hinton and Yoshua Bengio have signed, as have the CEOs of the major AGI labs (Sam Altman, Demis Hassabis, and Dario Amodei), as well as executives from Microsoft and Google (but notably not Meta).
The statement reads: “Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”
We hope this statement will bring AI x-risk further into the Overton window and open up discussion of AI’s most severe risks. Given the growing number of experts and public figures who take risks from advanced AI seriously, we hope to improve epistemics by encouraging discussion and focusing public and international attention on this issue.
Disclaimer: I've never been to an academic conference.
EDIT: Also, I'm just thinking out loud here. I'm not saying I want to start a conference; I'm just thinking about what could make researching alignment feel normal to academics.
Those are some big names. I wonder if arranging a big AI safety conference with these people would make worrying about alignment feel more socially acceptable to many researchers. It seems to me that a big part of making thinking about alignment socially acceptable is to visibly think about alignment in socially acceptable ways. As I imagine it, academia holds conferences on its important problems.
You talk about the topic there with colleagues and impressive people. You also go there to catch up with friends and have a good time. You network. You listen to big names talking about X, and you wonder which of your other colleagues will also talk about X in the open. Dismissing X no longer feels like something that will go uncontested. Maybe you should take care when talking about X? Maybe you even wonder whether it could be true.
Or, on the flip side, you wonder if you can talk about X without your colleagues laughing at you. Maybe other people will back you up when you say X is important. At least, you can imply that the big names will. Oh look, a big-name X-thinker is coming round the corner. Maybe you can strike up a conversation with them in the open.