Posting something about a current issue that I think many people here would be interested in. See also the related EA Forum post.
California Governor Gavin Newsom has until September 30 to decide the fate of SB 1047 - one of the most hotly debated AI bills in the world. The Center for AI Safety Action Fund, where I work, is a co-sponsor of the bill. I’d like to share how you can help support the bill if you want to.
About SB 1047 and why it is important
SB 1047 is an AI bill in the state of California. It would require the developers of the largest AI models, those costing over $100 million to train, to test their models for the potential to cause or enable severe harm, such as cyberattacks on critical infrastructure or the creation of biological weapons resulting in mass casualties or $500 million in damages. AI developers must have a safety and security protocol that details how they will take reasonable care to prevent these harms, and they must publish a copy of that protocol. Companies that fail to perform their duty under the act are liable for resulting harm. SB 1047 also lays the groundwork for a public cloud computing resource to make AI research more accessible to academic researchers and startups, and it establishes whistleblower protections for employees at large AI companies.
So far, AI policy has relied on government reporting requirements and voluntary promises from AI developers to behave responsibly. But if you think voluntary commitments are insufficient, you will probably think we need a bill like SB 1047.
If SB 1047 is vetoed, it’s plausible that no comparable legal protection will exist in the next couple of years, as Congress does not appear likely to pass anything like this any time soon.
The bill’s text can be found here. A summary of the bill can be found here. Longer summaries can be found here and here, and a debate on the bill is here. SB 1047 is supported by many academic researchers (including Turing Award winners Yoshua Bengio and Geoffrey Hinton), employees at major AI companies, and organizations like Imbue and Notion. It is opposed by OpenAI, Google, Meta, and the venture capital firm A16z, as well as some other academic researchers and organizations. After a recent round of amendments, Anthropic said “we believe its benefits likely outweigh its costs.”
SB 1047 recently passed the California legislature, and Governor Gavin Newsom has until September 30th to sign or veto it. Newsom has not yet said whether he will sign it or not, but he is being lobbied hard to veto it. The Governor needs to hear from you.
How you can help
If you want to help this bill pass, there are some pretty simple steps you can take to increase that probability, many of which are detailed on the SB 1047 website.
The most useful thing you can do is write a custom letter. To do this:
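Make a letter addressed to Governor Newsom using the template here. For convenience, here is the template:

September [DATE], 2024

The Honorable Gavin Newsom
Governor, State of California
State Capitol, Suite 1173
Sacramento, CA 95814

Via leg.unit@gov.ca.gov

Re: SB 1047 (Wiener) – Safe and Secure Innovation for Frontier Artificial Intelligence Models Act – Request for Signature

Dear Governor Newsom,

[CUSTOM LETTER BODY GOES HERE. Consider mentioning:
- Where you live (this is useful even if you don’t live in California)
- Why you care about SB 1047
- What it would mean to you if Governor Newsom signed SB 1047
SAVE THIS DOCUMENT AS A PDF AND EMAIL TO leg.unit@gov.ca.gov]

Sincerely,
[YOUR NAME]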
Once you’ve written your own custom letter, you can also think of 5 family members or friends who might be willing to write one. Supporters from California are especially helpful, as are parents and people who don’t typically engage on tech issues. Then help them write it!
Organize an event! Consider hosting a letter-writing event to help get even more letters. Please email thomas@safe.ai if you are interested.