Will_Newsome comments on Greg Egan and the Incomprehensible - Less Wrong

Post author: XiXiDu 19 May 2011 10:38AM


Comment author: Will_Newsome 19 May 2011 12:06:43PM 4 points

I think that your impressions are at least implicitly inaccurate, unless your quote marks are indicating quotes I haven't seen. (If not, perhaps you should paraphrase in a way that doesn't look like direct quoting?) Greg Egan thinks that AIs are not a problem even considering (and dismissing as impossible?) their speed advantage, as far as I can tell. So, practically speaking, he thinks this uFAI alarmism is wrong and maybe contemptible, again as far as I can tell. Eliezer's impression might be that there are things humans can never understand, but if so, that's probably because the word 'human' typically refers to a structure that is defined in many ways by its boundedness. That is, maybe a human could follow a superintelligent argument if the human were upgraded with a Jupiter brain, but calling such a human a human might be stretching definitions. But maybe Eliezer does in fact have deeper objections; I'm not sure.