I appear today on Bloggingheads.tv, in "Science Saturday: Singularity Edition", speaking with John Horgan about the Singularity. I talked too much. This episode needed to be around two hours longer.
One question I fumbled at 62:30 was "What's the strongest opposition you've seen to Singularity ideas?" The basic problem is that nearly everyone who attacks the Singularity is either completely unacquainted with the existing thinking or attacks only Kurzweil, and in any case the criticism is more a collection of disconnected broadsides (often mostly ad hominem) than a coherent critique. There's no equivalent in Singularity studies of Richard Jones's critique of nanotechnology - which I don't agree with, but at least Jones has read Drexler. People who don't buy the Singularity don't put in the time and hard work to criticize it properly.
What I should have done, though, was interpret the question more charitably as "What's the strongest opposition to strong AI or transhumanism?" - in which case there's Sir Roger Penrose, Jaron Lanier, Leon Kass, and many others. None of these are good arguments - or I would have to accept them! - but at least they are painstakingly crafted arguments, and something like organized opposition.
Yudkowsky wasted 90% of the interview discussing how to be rational rather than addressing the implications of AGI being possible.
How does Yudkowsky's authority change our view of the feasibility of AGI being developed quickly, when most experts clearly disagree? We need to move from claims that the elders' techniques are wrong to an actual path to AGI.
And what about the claim that a billion-dollar project isn't needed? SingInst thinks it can do it alone, on a modest budget of a few million dollars? Isn't this a political position?
I am glad Yudkowsky is trying so hard, but it seems he is doing more politics and philosophy than research. Perhaps in the long term this will be more effective, since the goal is to win, not to be right.