I appear today on Bloggingheads.tv, in "Science Saturday: Singularity Edition", speaking with John Horgan about the Singularity. I talked too much. This episode needed to be around two hours longer.
One question I fumbled at 62:30 was "What's the strongest opposition you've seen to Singularity ideas?" The basic problem is that nearly everyone who attacks the Singularity is either completely unacquainted with the existing thinking, or attacks only Kurzweil; in any case it's more a collection of disconnected broadsides (often mostly ad hominem) than a coherent criticism. There's no equivalent in Singularity studies of Richard Jones's critique of nanotechnology - which I don't agree with, but at least Jones has read Drexler. People who don't buy the Singularity don't put in the time and hard work to criticize it properly.
What I should have done, though, was interpret the question more charitably as "What's the strongest opposition to strong AI or transhumanism?" - in which case there's Sir Roger Penrose, Jaron Lanier, Leon Kass, and many others. None of these are good arguments - or I would have to accept them! - but at least they are painstakingly crafted arguments, and something like organized opposition.
Hopefully anonymous: There are strong warnings against posting too much, but my personal suspicion is that the next generation of AI will not colonize other planets, convert stars, or do any of the things we see as huge and important, but will go in the opposite direction and become smaller and smaller. At least, it will if the thing decides that survival is ethical and desirable.
But whether we end up as sand, as worms, or simply as irrelevant, the result is the same. We shouldn't be worried that our children consume us: it's the nature of life, and that will continue even with the next superintelligent beings. To evolve, everything must die or be rendered insignificant, and there is no escape from death even for stagnant species. I think that will hold true for many generations.