An existential risk (or x-risk) is a risk that threatens astronomically large negative consequences for humanity, such as human extinction or permanent global totalitarianism.
Nick Bostrom introduced the term "existential risk" in his 2002 paper "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards."[1] In the paper, Bostrom defined an existential risk as:
...One where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential.