How are "global computer failures" an existential risk? Sure, it would suck, but it wouldn't be the end of the world.
And what are "physics threats"?
I would also like to see a column with strategies for mitigating each threat, beyond "requires global coordination". For example, the response to bioweapons would be regulation and perhaps countermeasure research, whereas against supernovae there isn't much we can do.
> How are "global computer failures" an existential risk? Sure, it would suck, but it wouldn't be the end of the world.
Global trade depends on computers these days, and the human population depends on global trade for food, medicine, building materials, technology parts, etc. Even if a global computer failure did not instantly kill everyone, it could stall or stop expansion.
> And what are "physics threats"?
A vacuum metastability event, for instance?
Due to my colleague, Anders Sandberg: