Can you elaborate a bit on what exactly is your intention?
Specifically, is this meant to be a scale of severity categories with one example for each, or is it meant as an exhaustive list of all relevant apocalyptic scenarios put into a ranking?
A scale of roughly-ordered tiers. It's a shorthand for expressing the level of devastation in far-future failure modes.
0: Recoverable Catastrophe
An apocalypse is an event that permanently damages the world. This scale is for scenarios that are much worse than any normal disaster. Even if 100 million people die in a war, the rest of the world can eventually rebuild and keep going.
1: Economic Apocalypse
The human carrying capacity of the planet depends on the world's systems of industry, shipping, agriculture, and organizations. If the planet's economic and infrastructural systems were destroyed, then we would have to rely on more local farming, and we could not support as high a population or standard of living. In addition, rebuilding the world economy could be very difficult if the Earth's mineral and fossil fuel resources are already depleted.
2: Communications Apocalypse
If large regions of the Earth become depopulated, or if sufficiently many humans die in the catastrophe, it's possible that regions and continents could be isolated from one another. In this scenario, globalization is reversed by obstacles to long-distance communication and travel. Telecommunications, the internet, and air travel are no longer common. Humans are reduced to multiple, isolated communities.
3: Knowledge Apocalypse
If the loss of human population and institutions is so extreme that a large portion of human cultural or technological knowledge is lost, it could reverse one of the most reliable trends in modern history: the steady accumulation of knowledge. Some innovations and scientific models can take millennia to develop from scratch.
4: Human Apocalypse
Even if the human population were to be violently reduced by 90%, it's easy to imagine the survivors slowly resettling the planet, given the resources and opportunity. But a sufficiently extreme transformation of the Earth could drive the human species completely extinct. To many people, this is the worst possible outcome, and any further developments are irrelevant next to the end of human history.
5: Biosphere Apocalypse
In some scenarios (such as the physical destruction of the Earth), one can imagine the extinction not just of humans, but of all known life. Only astrophysical and geological phenomena would be left in this region of the universe. In this timeline we are unlikely to be succeeded by any familiar life forms.
6: Galactic Apocalypse
A rare few scenarios have the potential to wipe out not just Earth, but also all nearby space. This usually comes up in discussions of hostile artificial superintelligence, or very destructive chain reactions of exotic matter. However, the nature of cosmic inflation and extraterrestrial intelligence is still unknown, so it's possible that some phenomenon will ultimately interfere with the destruction.
7: Universal Apocalypse
This form of destruction is thankfully exotic. People discuss the loss of all of existence as a consequence of scenarios like false vacuum bubbles, simulationist termination, solipsistic or anthropic observer effects, Boltzmann brain fluctuations, time travel, or religious eschatology.
The goal of this scale is to give a little more resolution to a speculative, unfamiliar space, in the same sense that the Kardashev Scale provides terminology for talking about the distant topic of interstellar civilizations. It can be important in x-risk conversations to distinguish between disasters and truly worst-case scenarios. Even if some of these scenarios are unlikely or impossible, they are nevertheless discussed, and terminology can be useful to facilitate conversation.
As a cryonicist, I'm drafting out a text describing my revival preferences and requests, to be stored along with my other paperwork. (Oddly enough, this isn't a standard practice.) The current draft is here. I'm currently seeking suggestions for improvement, and a lot of the people around here seem to have good heads on their shoulders, so I thought I'd ask for comments here. Any thoughts?
Interesting idea! I guess you could add a 'when in doubt' for whether you'd rather be revived in an early period (e.g., if resurrection is possible with an 80% success rate) or be downprioritized until resurrection is very mature and safe.
I don't think the term "weird" is very conducive to having a healthy self-esteem.
'eccentric'?
turchin, I just want to say that I really like these idea-catalog infographics.
[Meta]
Update: I've received feedback, and I won't be posting links to TFP in this thread, or others, on LW.
Would it be below the bar for no-politics to post one or more links in this thread from The Future Primaeval (TFP)? Some of their posts are more overtly political or controversial than others, and the only posts from the site I'd link here are ones which make more direct reference to, e.g., the rationality community, metacognition, strategic thinking, etc., rather than having anything to do with sociopolitics. Note: I'd prefer if those hostile to TFP links on LW would reply to this comment rather than downvoting it, but, that stated, downvotes without clarification will be treated as a negative response to my above question.
Thank you for asking.
Something should definitely be done about downvotes: it seems like the average comment score in many threads is below zero.
I should probably add that I'm looking for a positive and mutually-supporting LW-style community. But I'm sure other people would prefer a more brutally honest community. That's fine, and ideally we'll all find sites that suit us in the end.
Libertarianism is an irrational, politically extremist position?
I think passive_fist was saying that they considered certain comments irrational, and that those fell into the (broad) category of libertarianism. That c is an element of both set I and set L, not that L is a subset of I.
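To spell out the distinction in set notation (my formalization, not the original commenter's — I and L here stand for "irrational comments" and "libertarian comments" respectively):

```latex
% The claim made about a particular comment c is joint membership:
c \in I \;\land\; c \in L
% not the much stronger universal claim that libertarianism implies irrationality:
L \subseteq I \quad\text{i.e.}\quad \forall x\, (x \in L \rightarrow x \in I)
```

The first statement is consistent with most elements of L lying outside I; the second is what the objection above was reading into it.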
"A Disneyland with no children" apocalypse where optimization competition eliminates any pleasure we get from life.
A hell apocalypse where a large number of sentient lifeforms are condemned to very-long-term suffering, possibly in a computer simulation.
Yeah, I was thinking about the latter (like Pascal's Mugging), but I think it might be too exotic to fit into a linear scale.