Come on, let's be honest: Eliezer got his ego handed to him in that debate; if things don't fit into nice logical categories, he doesn't have a clue how to deal with them. The issue here is that most of Eliezer's big arguments, the ones people wait for, rest on imagined evidence such as the possibility of superhuman AGI. Belief in that possibility is tantamount to religious fanaticism, since no evidence for it exists and nobody knows what it would look like. Until superhuman AGI is actually created, the current arguments are hand-waving. Furthermore, Eliezer doesn't have the math or science background to deal with Lanier. You can't hope to build AI with philosophy and some basic Bayesian statistics. AGI is an engineering problem, and engineering unfortunately requires math, and from what I can see, having read all of Eliezer's papers online, he can't make the cut.