The 21st century presents a dangerous collection of existential risks and risks of major civilization-level catastrophes, combined with a distinct lack of agencies whose job it is to mitigate such risks. This suggests that the world is in something of an emergency at the moment. Firstly, what do we mean by risks? Bostrom has a paper on existential risks, in which he lists the following risks as being "most likely":
- Deliberate misuse of nanotechnology,
- Nuclear holocaust,
- Badly programmed superintelligence,
- Genetically engineered biological agent,
- Accidental misuse of nanotechnology (“gray goo”),
- Physics disasters,
- Naturally occurring disease,
- Asteroid or comet impact,
- Runaway global warming,
- Resource depletion or ecological destruction,
- Misguided world government or another static social equilibrium stops technological progress,
- “Dysgenic” pressures (we might evolve into a less brainy but more fertile species, homo philoprogenitus, “lover of many offspring”),
- Our potential or even our core values are eroded by evolutionary development,
- Technological arrest,
- Take-over by a transcending upload,
- Flawed superintelligence,
- [Stable] Repressive totalitarian global regime,
- Hanson's cosmic locusts scenario [Added by author]
To which I would add various possibilities for major civilization-level disasters that aren't existential risks, such as milder versions of all of the above, or the following:
- convergence of computer viruses and cults/religions,
- advanced personal weapons or surveillance devices such as nanotech, micro-UAV bugs (cyberpunk dystopia),
- erosion of privacy and freedom through massively oppressive government,
- highly effective meta-religions such as Scientology or a much more virulent version of modern evangelical Christianity
This collection is daunting, especially given that the human race doesn't have any official agency dedicated to mitigating risks to its own medium-to-long-term survival. We face a long list of challenges, and we aren't even formally trying to mitigate many of them in advance. In many past cases, mitigation of risks occurred on a last-minute, ad-hoc basis, such as individuals during the cold war deciding not to initiate a nuclear exchange, particularly during the Cuban missile crisis.
So, a small group of people have realized that the likely outcome of a large and dangerous collection of risks combined with a haphazard, informal methodology for dealing with risks (driven by the efforts of individuals, charities and public opinion) is that one of these potential risks will actually be realized - killing many or all of us or radically reducing our quality of life. This coming disaster is ultimately not the result of any one particular risk, but the result of the lack of a powerful defence against risks.
One could argue that I [and Bostrom, Rees, etc.] am blowing the issue out of proportion. We have survived so far, right? (Wrong, actually - anthropic considerations indicate that our survival so far is not evidence that we will survive much longer, and technological progress indicates that the risks of the future are worse than the risks of the past.) Major civilizational disasters have already happened many, many times over.
Most ecosystems that ever existed were wiped out by natural means, almost all species that have ever existed have gone extinct, and without human intervention most existing ecosystems will probably be wiped out within a 100-million-year timescale. Most civilizations that ever existed collapsed. Some went really badly wrong, like communist Russia. Complex, homeostatic objects that don't have extremely effective self-preservation systems empirically tend to get wiped out by the churning of the universe.
Our western civilization lacks an effective long-term (on the order of 50 years or more) self-preservation system. Hence we should reasonably expect either to build one or to get wiped out, because we observe that complex systems similar to today's societies - namely, past societies - collapsed.
And even though our society does have short-term survival mechanisms such as governments and philanthropists, they often behave in spectacularly irrational, myopic or late-responding ways. The response to the global warming problem (late, weak, still failing to overcome co-ordination problems) and the invasion of Iraq (plainly irrational) are cases in point from recent history, and there are numerous examples from the past, such as close calls in the cold war, and the spectacular chain of failures that led from world war I to world war II and the rise of Hitler.
This article could be summarized as follows:
The systems we have for preserving the values and existence of our western society, and of the human race as a whole, are weak, and the challenges of the 21st and 22nd centuries seem likely to overwhelm them.
I originally wanted to write an article about ways to mitigate existential risks and major civilization-level catastrophes, but I decided to first establish that there are actually such things as serious existential risks and major civilization-level catastrophes, and that we haven't got them handled yet. My next post will be about ways to mitigate existential risks.
We're much safer against even very rare natural disasters like Toba (and others that act through climate) than we were historically. The kind of disaster that could wipe us out gets less and less probable every decade. I'm not even sure the kind of asteroid that wiped out the dinosaurs would be enough to wipe out humanity now, given a few years of prior warning (well, it would kill most people, but that's not even close to eliminating all of humanity).
I seriously dispute the idea that we were very close to nuclear war. I even more seriously dispute the idea that it would have had any long-term effects on human civilization if it had happened. Even in the middle of WW2, people's life expectancy was far higher than historically typical, violent death rates were far lower, and I'd even guess that average personal freedoms compared quite well with the historical record.
Whether those catastrophes could destroy present-day humanity wasn't the point; the point was whether near misses in potential extinction events have ever occurred in our past.
Consider it this way: under your assumption that our world is more robust nowadays, what would count as a near miss today would certainly have wiped out the frailer humanity of the past; conversely, what counted as a near miss back then would not be nearly that bad nowadays. By constraining the definition of a "near miss" in that way, this basically means that it is...