nazgulnarsil comments on Greg Egan and the Incomprehensible - Less Wrong

16 points | Post author: XiXiDu 19 May 2011 10:38AM


Comment author: nazgulnarsil 19 May 2011 10:55:05AM 2 points
Comment author: MixedNuts 19 May 2011 11:25:00AM 5 points

No - the aliens are slower (think slowly, progress slowly because they're stupider), but can understand as much as humans given enough time. This is the whole point!

Comment author: nazgulnarsil 19 May 2011 11:30:27AM * 3 points

I don't follow what you're trying to communicate. The story isn't about aliens and humans, it's about an AI in a box.

The point is that aliens or an AI don't need to be qualitatively different to be incomprehensible. One Einstein is incomprehensible to most people at 1x human speed. Thousands of Einsteins at 1000x speed would be even more so.

Comment author: MixedNuts 19 May 2011 11:35:57AM * 1 point

Edit: Turns out I misunderstood Greg Egan, and probably Eliezer Yudkowsky. What I thought was Egan's position is Aaronson's unless I misunderstood him too.

Paraphrase of Greg Egan's position (if I and XiXiDu understand correctly): "Given enough time, humans can understand anything. In practice we still get squashed by AIs, since they're much faster, but slow them down and we're equals."

Paraphrase of Eliezer Yudkowsky's position (same disclaimer): "There are things that humans simply cannot understand, ever, no matter how long it takes, but that other minds can understand." (I'm not sure what happens if you brute-force insightspace.)

Comment author: nazgulnarsil 19 May 2011 11:45:10AM 5 points

Arguments about the human mindspace in toto are silly at this juncture in our understanding.

Comment author: Will_Newsome 19 May 2011 12:06:43PM 4 points

I think that your impressions are at least implicitly inaccurate, unless your quote marks are actually indicating quotes I haven't seen. (If not, perhaps you should paraphrase in a way that doesn't look like direct quoting?) Greg Egan thinks that AIs are not a problem even considering (and dismissing as impossible?) their speed advantage, as far as I can tell. So, practically speaking, he thinks this uFAI alarmism is wrong and maybe contemptible, again as far as I can tell. Eliezer's impression might be that there are things humans can never understand, but if so that's probably because the word 'human' typically refers to a structure that is defined in many ways by its boundedness. That is, maybe a human could follow a superintelligent argument if the human was upgraded with a Jupiter brain, but calling such a human a human might be stretching definitions. But maybe Eliezer does in fact have deeper objections, I'm not sure.

Comment author: Risto_Saarelma 19 May 2011 12:18:23PM 0 points

I don't see anything in the story which I'd expect Egan to disagree with, so I'm not quite sure how it's relevant here.

Comment author: nazgulnarsil 19 May 2011 01:33:45PM 2 points

The OP asks what it means for something to be incomprehensible. My point was that we don't need to resort to mysterious, unanswerable hypotheticals about rifts in mindspace to answer the question.