Today the Center for AI Safety released the AI Extinction Statement, a one-sentence statement jointly signed by a historic coalition of AI experts, professors, and tech leaders.
Geoffrey Hinton and Yoshua Bengio have signed, as have the CEOs of the major AGI labs (Sam Altman, Demis Hassabis, and Dario Amodei) and executives from Microsoft and Google (but notably not Meta).
The statement reads: “Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”
We hope this statement will bring AI x-risk further into the Overton window and open up discussion of AI’s most severe risks. Given the growing number of experts and public figures who take risks from advanced AI seriously, we hope to improve epistemics by encouraging discussion and focusing public and international attention on this issue.
I disagree that it's that easy. It's not a long trajectory of inevitability; as with evolution, there are constraints. Each step generally has to be aligned, on its own, with the economic incentives of the time. See how, for example, steam power was first developed to drive pumps removing water from coal mines; the engines were so inefficient that they were only cost effective if you didn't also need to transport the coal. Now that we've used up all the surface coal and oil, not to mention screwed up the climate quite a bit for the next few millennia, conditions are different. I think technology is less a uniform progression and more a mix of "easy" and "hard" events (as in the grabby aliens paper, if you've read it), and by exhausting those resources we've made things harder. I don't think climbing back up would be guaranteed.
This, IMO, even if it were possible, would solve nothing while potentially causing an inordinate amount of suffering. It's also one of those super long term investments that align with almost no short-term incentive. I say it solves nothing because intelligence wouldn't be the bottleneck: if they had any books left lying around, they'd have a road map to tech, and I really don't think we've missed some obvious low-tech trick that would be relevant to them. The problem is having the materials to do those things and having immediate returns.