Levels of global catastrophes: from mild to extinction
It is important to build a bridge between existential risks and other possible risks. If we say that existential risks are infinitely more important than all other risks, we put them outside the scope of policymakers (who cannot work with infinities). We can reach policymakers by presenting x-risks as extreme cases of smaller risks. This can be done for most risks (AI and accelerator catastrophes being notable exceptions).
Smaller catastrophes play a complex role in estimating the probability of x-risks. A chain of smaller catastrophes may result in extinction, while a single small catastrophe could postpone bigger risks (though that is not a good solution). The following table presents different levels of global catastrophes depending on their size. The numbers are mostly arbitrary and serve as placeholders for future updates.
http://immortality-roadmap.com/degradlev.pdf
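To make the compounding logic concrete, here is a minimal Python sketch. All numbers in it are arbitrary placeholders (like the numbers in the table), not estimates: it only shows how a chain of individually survivable catastrophes multiplies into a high overall extinction probability.

```python
# Hypothetical illustration: if civilization must survive a chain of
# smaller catastrophes, the per-event survival probabilities multiply,
# so even individually survivable events compound toward extinction.
# Both numbers below are arbitrary placeholders, not estimates.

survival_per_event = 0.9   # assumed chance of surviving one catastrophe
events = 10                # assumed length of the chain

p_survive_chain = survival_per_event ** events   # independence assumed
p_extinction_via_chain = 1 - p_survive_chain

print(f"P(surviving all {events} events) = {p_survive_chain:.3f}")        # ~0.349
print(f"P(extinction via the chain)      = {p_extinction_via_chain:.3f}") # ~0.651
```

The independence assumption here is itself optimistic: if one catastrophe degrades the capacity to handle the next, the per-event survival probability falls along the chain, and extinction via the chain becomes more likely still.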
This Wikipedia article has many links: https://en.wikipedia.org/wiki/Runaway_climate_change. To clarify my position: I am not a climatologist and cannot independently evaluate these claims at the mathematical level, but I understand their logic, and I think that while runaway climate change is a low-probability event, we should do more to prevent it. An interesting point is the similarity between two communities: the people who think that self-improving AI is possible and is an x-risk, and the people who think that runaway warming is possible and is an x-risk. Most interesting of all, the two communities choose to ignore each other, even though the mechanism of risk (positive feedback) and the timeframe (2030) are almost the same.
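As a toy illustration of the shared mechanism, here is a short Python sketch of a positive-feedback loop. It is not a model of climate or of self-improving AI; the gain values are arbitrary assumptions, and the point is only that a quantity amplified in proportion to its own size diverges, while weak feedback stays tame.

```python
# Toy positive-feedback loop (not a climate or AI model): at each step
# the quantity x is amplified in proportion to its current size.
# The gain values below are arbitrary assumptions for illustration.

def run_feedback(x0: float, gain: float, steps: int) -> float:
    """Iterate x <- x * (1 + gain) and return the final value."""
    x = x0
    for _ in range(steps):
        x *= 1 + gain
    return x

weak = run_feedback(x0=1.0, gain=0.01, steps=50)    # mild amplification
strong = run_feedback(x0=1.0, gain=0.30, steps=50)  # runaway growth

print(f"weak feedback after 50 steps:   {weak:,.2f}")    # ~1.64
print(f"strong feedback after 50 steps: {strong:,.2f}")  # ~497,929
```

This is the structure both communities point to: once the feedback gain is positive and large enough, the process outruns any gradual response.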