edbarbar comments on Bloggingheads: Yudkowsky and Horgan - Less Wrong

4 Post author: Eliezer_Yudkowsky 07 June 2008 10:09PM


Comment author: edbarbar 08 June 2008 06:42:50AM 0 points

Hopefully Anonymous: There are strong warnings against posting too much here, but my personal suspicion is that the next generation of AI will not colonize other planets, convert stars, or do any of the things we see as huge and important, but will instead go in the opposite direction and become smaller and smaller. At least, that is, should it decide that survival is ethical and desirable.

But whether we end up as sand, as worms, or simply as irrelevant, the result is the same. We shouldn't be worried that our children consume us: it's the nature of life, and that will continue even with the next superintelligent beings. To evolve, everything must die or be rendered insignificant, and there is no escape from death even for stagnant species. I think that will hold true for many generations.