Couldn't we privately propose to Sam Altman: “I would do X if Dario and Demis also commit to the same thing”?
Seems like the obvious thing one might like to do if people are stuck in a race and cannot coordinate.
X could be implementing some mitigation measures, supporting some piece of regulation, or just coordinating to tell the president that the situation is dangerous and we really do need to do something.
What do you think?
It seems like conditional commitments have already been useful in other industries - Claude
Regarding whether similar private "if-then" conditional commitments have worked in other industries:
Yes, conditional commitments have been used successfully in various contexts.
The effectiveness depends on verification mechanisms, trust between parties, and sometimes third-party enforcement. In high-stakes competitive industries like AI, coordination challenges would be significant but not necessarily insurmountable with the right structure and incentives.
(Note: this is different from the “if-then” commitments proposed by Holden, which are of the form “if we cross capability X, then we need mitigation Y.”)
Even if this strategy would work in principle among particularly honorable humans, surely Sam Altman in particular has already conclusively proven that he cannot be trusted to honor any important agreements? See: the OpenAI board drama; the attempt to turn OpenAI's nonprofit into a for-profit; etc.
X could also be agreeing to sign a public statement about the need to do something or whatever.
Altman has already signed the CAIS Statement on AI Risk ("Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war."), but OpenAI's actions almost exclusively exacerbate extinction risk, and nowadays Altman and OpenAI even downplay the very existence of this risk.
I generally agree. But I don't think this invalidates the whole strategy: the call to action in that statement was particularly vague, and there is ample room for much more precise statements.
My point was that Altman doesn't adhere even to vague statements, and he's a known liar and manipulator, so there's no reason to believe his word would be worth any more in concrete ones.
I think he would lie, or be deceptive in a way that's not technically lying but brings him the same benefits, if not more.