philh comments on Open Thread Feb 16 - Feb 23, 2016 - Less Wrong

5 Post author: Elo 15 February 2016 02:12AM




Comment author: polymathwannabe 22 February 2016 01:44:14PM 0 points

a generalization of existential risk that includes the extinction of alien intelligences or the drastic decrease of their potential

I think the term is Great Filter.

Comment author: philh 23 February 2016 01:56:10AM 0 points

G0W51 is talking about universal x-risk versus local x-risk. Global thermonuclear war would be relevant to the Great Filter, but it doesn't endanger anyone else in the universe. Whereas if Earth creates UFAI, that's bad for everyone in our light cone.

Comment author: G0W51 23 February 2016 05:32:12AM 0 points

True. Also, the Great Filter is more akin to an existential catastrophe than to existential risk, which is the risk of such a catastrophe occurring.