This also continues the trend of OAI adding to the board highly credentialed people who notably do not have technical AI/ML knowledge.
This fact will be especially important insofar as a situation arises where, e.g., some engineers at the company think that the latest system isn't safe. The board won't be able to engage with the arguments or evidence; it'll all come down to whom they defer to.
Are board members working full-time on being board members at OpenAI? If so, I would expect that they could take action to alleviate their lack of technical expertise by spending 15h/week getting up to speed on the technical side, reading papers and maybe learning to train LMs themselves. It naively seems like AI is sufficiently shallow that 15h/week is enough to get most of the expertise you need within a year.
They almost certainly are not. In none of the OA Form 990s up to ~2022 or so are board members listed as working more than an hour or two per week as board members (obviously board members who were also working for the OA corporation, like Sutskever/Altman/Brockman, presumably are working full-time there). For example, in the 2022 Form 990, Zillis, Hurd, et al. list 3 hours per week. (And judging from descriptions of the board's activity in 2022, even this is probably fictional: I have no idea how they supposedly all spent >156 hours, i.e. 3 hours × 52 weeks, on OA in 2022...) This is also standard for non-profits; offhand, I can't think of any non-profit where non-employee board members work 40 hours a week as board members.
So it would be quite a change if any non-employee board members were working full-time now. And in Nakasone's case, he appears to have plenty on his plate already:
On May 8th, 2024, Nakasone was named Founding Director of Vanderbilt University's new Institute for National Defense and Global Security. Nakasone will also hold a Research Professorship within Vanderbilt's School of Engineering, as well as serving as special advisor to the chancellor.[36] In addition, on May 10th, 2024, Nakasone was elected to the board of trustees of Saint John's University, his alma mater.[37]
That’s true for for-profit boards too; to pick a random example, here’s the Microsoft board: pretty much everybody seems to have a super intense day job and/or to serve on 4+ company boards simultaneously.
It could indicate the importance of security, which is safe, or of escalation in a military arms race, which is unsafe.
I may just be cynical, but this looks a lot more like a way to secure US military and intelligence agency contracts for OpenAI's products and services, as opposed to competitors', than an actual effort to make OAI more security-focused.
This is only a few months after the change regarding military usage: https://theintercept.com/2024/01/12/open-ai-military-ban-chatgpt/
Now suddenly the recently retired head of the world's largest data siphoning operation is appointed to the board for the largest data processing initiative in history?
Yeah, sure, it's to help advise on securing OAI against APTs. 🙄
I thought this too, until someone in finance told me to google "Theranos Board of Directors", so I did, and it looked a lot like OpenAI's new board.
This provides an alternate hypothesis: that it signals nothing substantial. Perhaps it's empty credentialism, or empty PR, or a cheap attempt to win military contracts.
This post probably should have mentioned that Paul Nakasone is a former NSA director.
A more cynical interpretation of this news is that it represents a deal that gives OpenAI favorable access to US government contracts and some protection from safety-related regulation in exchange for ensuring the NSA will have access to user data going forward.
One of my fascinations is when/how the Department of Defense starts using language models, and I can't help but read significance into this from that perspective. If OpenAI wants that sweet sweet defense money, having a general on your board is a good way to make inroads.
Whether this was influenced by Aschenbrenner's Situational Awareness or not, it's welcome to see OpenAI emphasizing the importance of security. It's unclear how much this is a gesture vs reflective of deeper changes.