John_Maxwell_IV comments on UFAI cannot be the Great Filter - Less Wrong

Post author: Thrasymachus 22 December 2012 11:26AM 35 points


Comment author: John_Maxwell_IV 23 December 2012 01:15:58AM 3 points

Really, it seems like any kind of superintelligent AI, friendly or unfriendly, would result in intelligence expanding throughout the universe. So perhaps a better statement would be: "If you believe the Great Filter is ahead of us, that implies that most civilizations get wiped out before achieving any kind of superintelligent AI, meaning that either superintelligent AI is very hard to build, or civilizations tend to get wiped out relatively early." (It seems possible that we already got lucky with the Cold War: http://www.guardian.co.uk/commentisfree/2012/oct/27/vasili-arkhipov-stopped-nuclear-war)

Comment author: Eliezer_Yudkowsky 23 December 2012 04:48:06AM 6 points

Unless intelligent life is already almost extremely rare, that's not nearly enough 'luck' to explain why everyone else is dead, including aliens who happen to be better at solving coordination problems (imagine SF insectoid aliens).

Comment author: John_Maxwell_IV 23 December 2012 05:52:22AM 1 point

Yeah, of course.