twanvl comments on Breakdown of existential risks - Less Wrong Discussion

Post author: Stuart_Armstrong, 23 November 2012 02:12PM (16 points)

Comments (21)

Comment author: twanvl 24 November 2012 11:17:43AM 0 points

How are "global computer failures" an existential risk? Sure, it would suck, but it wouldn't be the end of the world.

And what are "physics threats"?

I would also like to see a column with strategies for mitigating each threat, beyond "requires global coordination". For example, the solution against bioweapons would be regulation and perhaps countermeasure research, while against supernovae there isn't much we can do.

Comment author: fubarobfusco 24 November 2012 02:02:49PM 2 points

> How are "global computer failures" an existential risk? Sure, it would suck, but it wouldn't be the end of the world.

Global trade depends on computers these days, and the human population depends on global trade for food, medicine, building materials, technology parts, etc. Even if a global computer failure did not instantly kill all humans, it could stall or permanently halt humanity's expansion.

> And what are "physics threats"?

Vacuum metastability event, for instance?

Comment author: evand 25 November 2012 10:02:57PM 1 point

I can see a global computer catastrophe rising to the level of a civilization-ending event with a 90-99% fatality rate, if I squint hard enough. The fatality rate could be even higher if it happens farther in the future. But I'm having trouble seeing it as an existential risk, one that literally kills so many people that no viable population remains anywhere. Even in the case of a computer catastrophe as a malicious event, I'm having trouble envisioning an existential risk that doesn't also involve one of the other options.

Are there papers that make the case for computer catastrophe as an x-risk?

Comment author: fubarobfusco 26 November 2012 01:48:57AM 3 points

Rather than considering it in terms of fatality rate, consider it in terms of curtailing humanity's possible expansion into the universe. The Industrial Revolution was possible because of abundant coal, and the 20th century's expansion of technology was possible because of petroleum. The easy-access coal and oil are used up; the resources being extracted today would not be accessible to a preindustrial or newly industrial civilization. So if our civilization falls and humanity reverts to preindustrial conditions, it stays there.