Come on, let's be honest: Eliezer got his ego handed to him in that debate. If things don't fit into nice logical categories, he doesn't have a clue how to deal with them. The issue is that most of the big arguments people wait for from Eliezer rest on imagined evidence, such as the possibility of superhuman AGI. Belief in that is tantamount to religious fanaticism, since the evidence for it does not exist and nobody knows what it will look like. Until superhuman AGI is actually created, the arguments are hand-waving. Further, Eliezer does not have the math or science background to deal with Lanier. You can't hope to build AI with philosophy and some basic Bayesian statistics. AGI is an engineering problem, and unfortunately engineering requires math, and from what I can see, having read all of Eliezer's papers online, he can't make the cut.
My Bloggingheads.tv interview with Jaron Lanier is up. Reductionism, zombies, and questions that you're not allowed to answer:
This ended up being more me interviewing Lanier than a dialogue, I'm afraid. I was a little too reluctant to interrupt. But you at least get a chance to see the probes I use, and Lanier's replies to them.
If there are any BHTV heads out there who read Overcoming Bias and have something they'd like to talk to me about, do let me or our kindly producers know.