_rpd comments on Open Thread Feb 16 - Feb 23, 2016 - Less Wrong

Post author: Elo 15 February 2016 02:12AM 5 points

Comment author: G0W51 20 February 2016 07:25:54PM 3 points

Is there a term for a generalization of existential risk that includes the extinction of alien intelligences or the drastic decrease of their potential? Existential risk in the usual sense, that is, the extinction of Earth-originating intelligent life or the drastic decrease of its potential, does not sound nearly as harmful if sufficiently advanced alien civilizations would take the place of Earth-originating life. An existential risk sounds far more harmful if it compromises all intelligent life in the universe, or if there is no other intelligent life in the universe to begin with. Perhaps this distinction makes physics experiments more concerning than other existential risks: even if their chance of causing the extinction of Earth-originating life is much smaller, their chance of eliminating all life in the universe may be higher.
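A toy expected-loss calculation makes the comparison concrete. Every number below is a hypothetical placeholder chosen only for illustration, not an actual risk estimate:

```python
# Expected loss in "civilization-equivalents": the probability of the
# catastrophe times the number of civilizations whose potential it destroys.
def expected_loss(probability: float, civilizations_destroyed: float) -> float:
    return probability * civilizations_destroyed

N = 1e6  # hypothetical number of intelligent civilizations in the universe

# An Earth-only existential risk: comparatively likely, but it destroys
# only one civilization's potential.
earth_only = expected_loss(probability=1e-2, civilizations_destroyed=1)

# A universe-wide risk (e.g. a hypothetical physics accident): far less
# likely, but it destroys every civilization's potential.
universal = expected_loss(probability=1e-4, civilizations_destroyed=N)

print(f"Earth-only: {earth_only:g} civilization-equivalents")  # 0.01
print(f"Universal:  {universal:g} civilization-equivalents")   # 100
```

With these made-up numbers the universe-wide risk dominates despite being a hundred times less probable, because its scope is a million times larger.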

Comment author: _rpd 23 February 2016 06:36:55AM 1 point

I really like this distinction. The closest I've seen is discussion of existential risk from a non-anthropocentric perspective. I suppose the neologism would be panexistential risk.

Comment author: G0W51 06 March 2016 04:25:24PM 0 points

"Panexistential risk" is a good, intuitive name.