G0W51 comments on Open Thread Feb 16 - Feb 23, 2016 - Less Wrong Discussion

5 Post author: Elo 15 February 2016 02:12AM

Comment author: G0W51 20 February 2016 07:25:54PM 3 points [-]

Is there a term for a generalization of existential risk that includes the extinction of alien intelligences or the drastic decrease of their potential? Existential risk, that is, the extinction of Earth-originating intelligent life or the drastic decrease of its potential, does not sound nearly as harmful if there are alien civilizations that could become sufficiently advanced in its place. However, an existential risk sounds far more harmful if it compromises all intelligent life in the universe, or if there is no other intelligent life in the universe to begin with. Perhaps this would make physics experiments more concerning than other existential risks: even if their chance of causing the extinction of Earth-originating life is much smaller than that of other existential risks, their chance of eliminating all life in the universe may be higher.

Comment author: _rpd 23 February 2016 06:36:55AM 1 point [-]

I really like this distinction. The closest I've seen is discussion of existential risk from a non-anthropocentric perspective. I suppose the neologism would be panexistential risk.

Comment author: G0W51 06 March 2016 04:25:24PM 0 points [-]

Panexistential risk is a good, intuitive name.

Comment author: polymathwannabe 22 February 2016 01:44:14PM 0 points [-]

a generalization of existential risk that includes the extinction of alien intelligences or the drastic decrease of their potential

I think the term is Great Filter.

Comment author: philh 23 February 2016 01:56:10AM 0 points [-]

G0W51 is talking about universal x-risk versus local x-risk. Global thermonuclear war would be relevant for the Great Filter, but doesn't endanger anyone else in the universe. Whereas if Earth creates UFAI, that's bad for everyone in our light cone.

Comment author: G0W51 23 February 2016 05:32:12AM 0 points [-]

True. Also, the Great Filter is more akin to an existential catastrophe than to an existential risk, that is, the risk of such a catastrophe.