
Seven Apocalypses

2 scarcegreengrass 20 September 2016 02:59AM

0: Recoverable Catastrophe

For the purposes of this scale, an apocalypse is an event that permanently damages the world; anything less severe counts as a recoverable catastrophe, and the scale is reserved for scenarios much worse than any normal disaster. Even if 100 million people die in a war, the rest of the world can eventually rebuild and keep going.


1: Economic Apocalypse

The human carrying capacity of the planet depends on the world's systems of industry, shipping, agriculture, and organization. If the planet's economic and infrastructural systems were destroyed, we would have to rely on more local farming, and we could not support as high a population or standard of living. In addition, rebuilding the world economy could be very difficult if the Earth's mineral and fossil fuel resources are already depleted.


2: Communications Apocalypse

If large regions of the Earth become depopulated, or if sufficiently many humans die in the catastrophe, it's possible that regions and continents could be isolated from one another. In this scenario, globalization is reversed by obstacles to long-distance communication and travel. Telecommunications, the internet, and air travel are no longer common. Humans are reduced to multiple, isolated communities.


3: Knowledge Apocalypse

If the loss of human population and institutions is so extreme that a large portion of human cultural or technological knowledge is lost, it could reverse one of the most reliable trends in modern history. Some innovations and scientific models can take millennia to develop from scratch.


4: Human Apocalypse

Even if the human population were to be violently reduced by 90%, it's easy to imagine the survivors slowly resettling the planet, given the resources and opportunity. But a sufficiently extreme transformation of the Earth could drive the human species completely extinct. To many people, this is the worst possible outcome, and any further developments are irrelevant next to the end of human history.


5: Biosphere Apocalypse

In some scenarios (such as the physical destruction of the Earth), one can imagine the extinction not just of humans, but of all known life. Only astrophysical and geological phenomena would be left in this region of the universe. In this timeline we are unlikely to be succeeded by any familiar life forms.


6: Galactic Apocalypse

A rare few scenarios have the potential to wipe out not just Earth, but also all nearby space. This usually comes up in discussions of hostile artificial superintelligence, or very destructive chain reactions of exotic matter. However, the nature of cosmic inflation and extraterrestrial intelligence is still unknown, so it's possible that some phenomenon will ultimately interfere with the destruction.


7: Universal Apocalypse

This form of destruction is thankfully exotic. People discuss the loss of all of existence as an effect of topics like false vacuum bubbles, simulationist termination, solipsistic or anthropic observer effects, Boltzmann brain fluctuations, time travel, or religious eschatology.


The goal of this scale is to give a little more resolution to a speculative, unfamiliar space, in the same sense that the Kardashev Scale provides terminology for the distant topic of interstellar civilizations. In x-risk conversations it can be important to distinguish between recoverable disasters and truly worst-case scenarios. Even if some of these scenarios are unlikely or impossible, they are nevertheless discussed, and shared terminology can facilitate the conversation.

Mini advent calendar of Xrisks: nuclear war

5 Stuart_Armstrong 04 December 2012 11:13AM

The FHI's mini advent calendar: counting down through the big five existential risks. The first one is an old favourite, forgotten but not gone: nuclear war.

Nuclear War
Current understanding: medium-high
Most worrying aspect: the missiles and bombs are already out there
It was a great fear during the fifties and sixties; but the weapons that could destroy our species lie dormant, not destroyed. 

There has been some recent progress: the sizes of the arsenals have been diminishing, fissile material is under tighter control than it used to be, and there is more geopolitical peace and cooperation.

But nuclear weapons still remain the easiest method for our species to destroy itself. Recent modelling has confirmed the old idea of nuclear winter: soot rising from burning cities destroyed by nuclear weapons could envelop the world in a dark cloud, disrupting agriculture and food supplies and causing mass starvation and death far beyond the areas directly hit. Meanwhile, creeping proliferation has spread these weapons to smaller states in unstable areas of the world, increasing the probability that nuclear weapons will be used, with the potential for escalation.

The risks are not new: several times (the Cuban missile crisis, the Petrov incident) our species has been saved from annihilation by the slimmest of margins. And yet the risk seems to have slipped off the radar for many governments: emergency food and fuel reserves are diminishing, and we have few “refuges” designed to ensure that the human species could endure a major nuclear conflict.

[LINK] Nuclear winter: a reminder

4 Stuart_Armstrong 19 March 2012 11:48AM

Just a reminder that some of the old threats are still around (and hence that AI is not only something that can go hideously badly, but also something that could help us with the other existential risks as well):

http://blog.practicalethics.ox.ac.uk/2012/03/old-threats-never-die-they-fade-away-from-our-minds-nuclear-winter/

EDIT: as should have been made clear in that post (but wasn't!), the existential risk doesn't come from full-fledged nuclear winter directly, but from the collapse of human society and the fragmentation of the species into small, vulnerable subgroups, with no guarantee that they'd survive or ever climb back to a technological society.

Future of Humanity?

-17 RickJS 24 May 2011 09:46PM

I first attempted to post this in 2009, but bounced off the karma wall.  Since then, my forgetfulness and procrastination have been its nemeses.

I invite you to listen (read) in an unusual way. "Consider it": think WITH this idea for a while. There will be plenty of time to refute it later. I find that, if I START with, "That's so wrong!", I really weaken my ability to "pan for the gold".

Remember the Swamp!

http://en.wiktionary.org/wiki/when_you're_up_to_your_neck_in_alligators,_it's_easy_to_forget_that_the_initial_objective_was_to_drain_the_swamp

I looked over the tag cloud and didn't see:

  • Existential Risk
  • War
  • Aggression
  • Competitiveness
  • Territorialism
  • Nuclear arsenals
