Epictetus comments on Existential Risk and Existential Hope: Definitions - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
A good place to start, but I'm not sure about the heavy emphasis on expectation. The problems caused by skewed distributions are ever-present: an event with a small probability but very high value will dominate the expected value. If a second event occurred that rendered the first impossible, we'd lose a lot of expected value. I'm not sure I'd call that a catastrophe, though.
This seems like exactly the set-up Bostrom has in mind when he talks about existential risks. We have a small chance of colonising the galaxy and beyond, but this carries a lot of our expected value. An event which prevents that would be a catastrophe.
Of course, many of the catastrophes that are discussed (e.g. most life being wiped out by a comet striking the Earth) also drastically reduce the observed value in the short term. But we normally want to include getting stuck on a trajectory that blocks further progress, even if it is a future involving good lives for billions of people.
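The expected-value skew discussed above can be sketched numerically. The figures below are purely illustrative assumptions (not from the comment or from Bostrom): a small probability of an astronomically valuable outcome, compared with an ordinary future.

```python
# Illustrative numbers only: a low-probability, very high-value outcome
# dominates the expected value of the future.
p_galactic = 0.01        # assumed small chance of the galactic-colonisation future
v_galactic = 10**12      # assumed value of that future (arbitrary units)
v_ordinary = 10**6       # assumed value of the ordinary future (same units)

# Expected value while the galactic future is still possible
ev_with = p_galactic * v_galactic + (1 - p_galactic) * v_ordinary

# Expected value after an event forecloses the galactic future
ev_without = v_ordinary

print(ev_with)                 # dominated by the 1% galactic term
print(ev_with / ev_without)    # the foreclosing event destroys most expected value
```

On these assumptions, the event that merely removes a 1% possibility, while leaving short-term observed value untouched, cuts expected value by a factor of about ten thousand; that is the sense in which it counts as a catastrophe under the expectation-based definition.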