This post is a linkpost for my substack article.
Substack link: https://coordination.substack.com/p/alignment-is-not-enough

I think alignment is necessary, but not enough for AI to go well.
By alignment, I mean being able to get an AI to do what we want it to do, without it pursuing things basically nobody would want, such as amassing power to prevent its creators from turning it off. By AGI, I mean a system that can do any economically valuable task doable through a computer, such as scientific research, about as well as a human or better. The necessity of alignment won't be my focus here, so I will take it as a given.