Good finds!
I think they are headed in the right direction, but I'm skeptical of the usefulness of their work on complexity. The metrics ignore the computational complexity of the model, and assume all the variance can be modeled from sources like historical data and expert opinion. They are also not much use unless we can fully characterize the components of the system, which isn't usually viable.
It also seems to ignore the (to my mind critical) difference between "we know this is uniformly distributed over the range 0-1" and "we have no idea what the distribution is over the range 0-1." But I may be asking too much of a complexity metric.
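To make the distinction concrete, here is a minimal sketch (my own illustration, not anything from the paper; `system_response` is a hypothetical component model). With a known uniform distribution you can propagate it and quote a mean or a percentile; with only bounds and no distribution, the honest summary is just an interval.

```python
import random

def system_response(x):
    # Hypothetical component model: output grows nonlinearly with the input parameter.
    return 1.0 + 3.0 * x ** 2

# Case 1: we *know* the parameter is uniformly distributed on [0, 1].
# A Monte Carlo sweep gives a full picture of the output distribution.
samples = [system_response(random.uniform(0.0, 1.0)) for _ in range(100_000)]
mean = sum(samples) / len(samples)
p95 = sorted(samples)[int(0.95 * len(samples))]
print(f"known uniform:  mean ~ {mean:.2f}, 95th percentile ~ {p95:.2f}")

# Case 2: we only know the parameter lies *somewhere* in [0, 1], distribution unknown.
# We cannot honestly quote a mean or a percentile, only worst-case bounds.
lo = min(system_response(x / 100) for x in range(101))
hi = max(system_response(x / 100) for x in range(101))
print(f"unknown dist.:  output is somewhere in [{lo:.2f}, {hi:.2f}]")
```

A metric that treats these two situations the same is, to my mind, missing something important.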
My default assumption is that the metrics themselves are useless for AI purposes, but I think the intuitions behind their development might be fruitful.
I also observe that the software component of this process is things like complicated avionics software, which is routinely tested under adversarial conditions. It seems likely to me that if a dangerous AI were built using modern techniques like machine learning, it would probably be assembled in a process broadly similar to this one.
I periodically look for information on systems engineering. This time I came across a PowerPoint presentation from the MIT OpenCourseWare course Fundamentals of Systems Engineering. Professor de Weck, who taught the course, had done some research on state-of-the-art methods developed as part of DARPA's META Program.
A few years ago DARPA wrapped up the program, which was designed to speed up delivery of cyber-electro-mechanical systems (war machines) by 5x. Since the parent program, Adaptive Vehicle Make, seems to have concluded without producing a vehicle, I infer the META Program lost its funding at the same time.
The work it produced appears to be adjacent to our interests along several dimensions, though, so I thought I would bring it to the community's attention. The pitch for the program, taken from the abstract of de Weck's paper:
Which is to say they very carefully architect the system, use known-to-be-good components, and employ formal verification to catch problems early. In the paper, a simulation of the META workflow achieved a 4.4x development speedup compared to the same project's actual development using traditional methods.
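The "catch problems early" part amounts to checking component interfaces against their specifications at design time rather than discovering mismatches during integration testing. A toy sketch of that idea (my own illustration, not the META toolchain; `PortSpec`, `Component`, and `check_connection` are invented names):

```python
from dataclasses import dataclass

@dataclass
class PortSpec:
    """Guaranteed range of a signal on a component interface (hypothetical simplification)."""
    lo: float
    hi: float

@dataclass
class Component:
    name: str
    accepts: PortSpec   # input range the component is verified for
    produces: PortSpec  # output range it guarantees over that input range

def check_connection(upstream: Component, downstream: Component) -> None:
    """Fail at design time if the upstream's guaranteed outputs exceed what
    the downstream component was verified to accept."""
    out, acc = upstream.produces, downstream.accepts
    if out.lo < acc.lo or out.hi > acc.hi:
        raise ValueError(
            f"{upstream.name} -> {downstream.name}: output range "
            f"[{out.lo}, {out.hi}] not contained in accepted range [{acc.lo}, {acc.hi}]"
        )

# A pump whose verified output pressure can exceed what the valve is rated for:
pump = Component("pump", PortSpec(0, 10), PortSpec(0, 120))
valve = Component("valve", PortSpec(0, 100), PortSpec(0, 100))
check_connection(pump, valve)  # raises before anything is built or simulated
```

The real program does this with much richer component models and formal methods, but the compositional flavor is the same.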
A number of the individual directions explored are of interest. Some that struck me were:
It looks like the repository for the program's outputs is here. A missile engineer trying to apply the method to healthcare is here. Reducing healthcare costs is rocket science!
In a nutshell, it packages up a whole bunch of things we have long discussed, and now there are a few people out and about trying to get parts of it into practice.