This article discusses the "Proliferation by Default" dynamic that the falling price of weapons-capable AI systems will create. It surveys dangerous capabilities that future systems could provide and draws comparisons to historical technologies. In the following parts of this series, we will explore how the price of these offensive AI systems could fall, their strategic implications, and policy solutions.
Summary of key points:
- In a number of domains, even human-level AI assistance could allow non-state actors to acquire WMDs, especially in the development of biological, chemical, and cyber weapons.
- Narrowly superhuman AI technologies (such as those specialized in biological or robotics engineering) could contribute to the development of cheap superweapons, such as mirror life.
Similarly, it's worth being careful with arguments that lean heavily on longtermism or that support concentration of power, because those frames can be used to justify almost anything. That doesn't mean we should dismiss them outright (arguments for accumulating power and for long-term thinking are convincing for a reason), but you should double-check whether the author holds strong principles, has a plausible path to the stated goal, and is explicit about what it trades off against.
Re: Vitalik Buterin on galaxy brain resistance.