It is much less important that we proceed with caution - choosing our words carefully, avoiding antagonistic reporters - than that we just keep getting media coverage.
I strongly prefer speaking the truth to getting coverage, so I disagree with at least the first half of this advice.
If someone is talking to you about current events, giving you new information, and inviting you to comment on it, and there are strong pressures on you to say the particular things that powerful forces want you to say and to play the social roles the media wants you to play (e.g. "them vs. us" against the AI labs, or endorsing radical and extreme versions of your position), then in my model of the world it takes a bunch of cognitive effort to avoid falling into those roles and saying things you later regret or didn't even believe at the time, while still making important and valid points.
The broad advice to care much less about choosing your words carefully sounded to me like it pushed against this, and against one of the "core competencies" of rationalists, so to speak. My favorite world where rationalists are in the Eye of Sauron involves rationalists speaking like rationalists, making thoughtful arguments well, and very obviously not sounding like political hacks with a single political goal who are just saying stuff to make drama. When your issue is "hot", that is not the time to fundamentally change how you talk about it! Eliezer's essay on the TIME site does not sound different from how Eliezer normally sounds, and that's IMO a key element of why people respect him; had he spoken far more carelessly than usual, it would have been worse written and people would have respected it less.
I disagree with the advice your post gives. I don't think the advice is good, and worse, you didn't argue for your points much or acknowledge literally any counterarguments. I don't think attention has generically been good for things: people get cancelled, global warming gets politicized, etc. You didn't mention the obvious considerations of "How would this get politicized if it gets politicized, and how could that be avoided?" or "What adversarial forces will try to co-opt the discussion of extinction risk?" or "How could this backfire and damage the extinction-prevention efforts?" I also think the advice to feel comfortable talking to adversarial reporters is pretty bad. It's pretty easy for someone outside the establishment to be targeted by adversarial journalists as someone "to be ostracized and ousted" via a series of terrible questions; that's something you want to actively avoid, instead focusing on talking with figures who you think have basic respect for the argumentative process.
If you have a policy proposal that you'd like to make as a top-level post, please make arguments and consider counterarguments; and if you aren't going to do that, then please avoid writing things that seem to encourage dropping basic discourse norms in order to grab attention and power.
No, it does not say that either. I'm assuming you're referring to "choose our words carefully", but stating something imprecisely is a far cry from not telling the truth.
Pardon me. I wrote a one-line reply, and then immediately edited it to make sure I was saying true sentences, and it ended up being much longer.
(Perhaps my mistake here is reflective of the disagreement in the post about speaking carelessly.)
Great interview with Stuart Russell this past week on CNN's Smerconish: https://edition.cnn.com/videos/tech/2023/04/01/smr-experts-demand-pause-on-ai.cnn
Crossposted from the EA Forum
AI Safety is hot right now.
The FLI letter was the catalyst for most of this, but even before that there was Ezra Klein's op-ed in the NY Times. (Also, a general shoutout to Ezra for helping bring EA ideas into the mainstream - he's great!)
Since the FLI letter, there was this CBS interview with Geoffrey Hinton. There was this WSJ op-ed. Eliezer's TIME op-ed and Lex Fridman interview led to Bezos following him on Twitter. Most remarkably to me, Fox News reporter Peter Doocy asked a question in the White House press briefing, which got a serious (albeit vague) response. The president of the United States, in all likelihood, has heard of AI Safety.
This is amazing. I think it's the biggest positive development in AI Safety thus far. On the safety research side, the more people hear about AI safety, the more tech investors/philanthropists start to fund research and the more researchers want to start doing safety work. On the capabilities side, companies taking AI risks more seriously will lead to more care being taken when developing and deploying AI systems. On the policy side, politicians taking AI risk seriously and developing regulations would be hugely helpful.
Now, I keep up with news... obsessively. These types of news cycles aren't all that uncommon. What is uncommon is keeping attention for an extended period of time. The best way to do this is just to say yes to any media coverage. AI Safety communicators should be going on any news outlet that will have them: interviews, debates, short segments on cable news, whatever. It is much less important that we proceed with caution - choosing our words carefully, avoiding antagonistic reporters - than that we just keep getting media coverage. This was notably Pete Buttigieg's strategy in the 2020 Democratic Primary (and still is, with his constant Fox News cameos), which took him from small-town mayor to household name and US Secretary of Transportation.
I think there's a mindset among people in AI Safety right now that nobody cares, nobody is prepared, and our only chance is if we're lucky and alignment isn't as hard as Eliezer makes it out to be. This is our chance to change that. Never underestimate the power of truckloads of media coverage, whether to elevate a businessman into the White House or to push a fringe idea into the mainstream. It's not going to come naturally, though - we must keep working at it.