
CarlShulman comments on UFAI cannot be the Great Filter - Less Wrong Discussion

Post author: Thrasymachus 22 December 2012 11:26AM


Comment author: CarlShulman 22 December 2012 04:38:44PM

Robin's use of the Great Filter argument relies on the SIA, which (if one buys it) lets one rule out a priori the possibility that the development of beings like us is very rare. Absent that, if one's prior for the development of life is flatter than one's prior for things like nuclear war (it would be much less surprising for fewer than one in 10^100 planets to evolve intelligent life than for fewer than 1 in 10^100 civilizations like ours to avoid self-destruction with advanced technology), then you get much less of an update in favor of future filters.
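To make the SIA step concrete, here's a minimal numerical sketch (my own illustrative numbers, not from Robin's argument): hypotheses H_k say a fraction 10^-k of planets evolve observers like us, with a uniform prior over k. SIA reweights each hypothesis by how many observers it predicts, which concentrates nearly all posterior mass on "observers are common" and so rules out early filters a priori.

```python
# Hedged sketch of the SIA reweighting step, with an assumed
# uniform prior over hypotheses H_k: "a fraction 10^-k of planets
# evolve intelligent observers", for k = 0..100.
ks = range(0, 101)
prior = {k: 1 / 101 for k in ks}

# SIA multiplies each hypothesis's prior by the (relative) number of
# observers it predicts, proportional to 10^-k, then renormalizes.
weights = {k: prior[k] * 10.0**-k for k in ks}
total = sum(weights.values())
posterior = {k: w / total for k, w in weights.items()}

# Nearly all posterior mass lands on k = 0 (observers common),
# so "intelligent life is astronomically rare" is ruled out a priori
# and any filter must lie in our future.
print(posterior[0])  # ≈ 0.9
```

With a flatter prior over observer-counts and no SIA reweighting, the small-k hypotheses keep their mass, which is why the update toward future filters is much weaker.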

OTOH, the SIA also strongly supports the possibility that we're in a simulation (if we assign a 1 in 1 million probability to sims being billions of times more numerous, then we should assign more credence to that than to being in the basement), which warps the Great Filter argument into something almost unrecognizable. See this paper for a discussion of the interactions with SIA.