For example, a substantial share of existential risk comes from global catastrophic risks from which civilization never fully recovers. (This GiveWell article sums things up nicely: http://blog.givewell.org/2015/08/13/the-long-term-significance-of-reducing-global-catastrophic-risks/ )
The article at your link defines a GCR as
Level 1 event: A continuous chain of events involving the deaths of hundreds of millions of people, such as an extreme pandemic.
I can't imagine how the death of, say, 10% of the global population is something from which civilization never (!) fully recovers.
Yeah, that claim definitely requires some explanation. Let me try to show you where I'm coming from.
Our current global economy is structured around growth, with many large feedback mechanisms in place to promote growth. Not all economies work that way.
The Romans, for example, were excellent at recognizing and adopting technology and ideas from other civilizations. However, just about the only actual invention they made themselves was concrete. After Rome fell, there was a dark-age period in which much of the knowledge the Romans had gathered was lost.
While thinking about my own next career steps, I've been writing down some of my thoughts about what makes for an impactful career.
In the process, I wrote an introductory report on what seem to me to be practical approaches to problems in catastrophic risks. It's intended to complement the analysis that 80,000 Hours provides by thinking about what general roles we ought to perform, rather than analyzing specific careers and jobs, and by focusing specifically on existential risks.
I'm happy to receive feedback on it, both positive and negative.
Here it is: Reducing Catastrophic Risks, A Practical Introduction.