Consider calling the NY governor about the RAISE Act
Summary

If you live in New York, you can contact the governor to help the RAISE Act pass without being amended to parity with SB 53. Contact methods are listed at the end of the post.

What is the RAISE Act?

Previous discussion of the bill:

* RTFB: The RAISE Act. This dives into the bill in detail, which I mostly omit from this post.
* Consider donating to Alex Bores, author of the RAISE Act: this post doesn't explain the substance of the bill more than the RTFB does, but it does add some color around the passage of the bill.

You can read the bill yourself: S6953B. It is dry legalese, but it is short, with the PDF clocking in at around 4 pages. If you really don't have time, a one-sentence summary: the bill requires leading AI labs to explain their safety standards, and to notify the government when one of their AIs does something bad, with the death of 100 people being definitely bad.

I assume that we're on the same page that this is excessively basic AI regulation that should have been passed years ago.

We have SB 53, why would we want the RAISE Act?

The RAISE Act has several notable differences from California's similar SB 53 which seem better for AI safety:

* It plainly states "a Large Developer shall not deploy a Frontier Model if doing so would create an unreasonable risk of Critical Harm." SB 53 only penalizes violations of the developer's own framework[1].
* It focuses its criteria on the compute costs going into models, where SB 53 also takes revenue into account. This can cover an AI lab like Safe Superintelligence Inc., which does not plan to generate revenue in the near future but could plausibly train a frontier model[2].
  * Keep in mind that if SSI never actually deploys their hypothetical frontier model, the RAISE Act doesn't actually come into effect.
* It contains provisions for distillations, so large distillations of Frontier Models are definitely still considered Frontier Models.
* It explicitly calls out IP transfers as transferring Large Devel