The Regulatory Option: A response to near 0% survival odds
This is inspired by Eliezer’s “Death with Dignity” post. Simply put, AI Alignment has failed. Given the lack of Alignment technology AND a short timeline to AGI takeoff, the chances of human survival have dropped to near 0%. This bleak outlook considers only one variable (the science) as a lever for...
Apr 11, 2022