Gray_Area comments on Science as Attire - Less Wrong

48 Post author: Eliezer_Yudkowsky 23 August 2007 05:10AM

Comment author: Gray_Area 23 August 2007 10:35:12AM 4 points [-]

Eliezer said: "I encounter people who are quite willing to entertain the notion of dumber-than-human Artificial Intelligence, or even mildly smarter-than-human Artificial Intelligence. Introduce the notion of strongly superhuman Artificial Intelligence, and they'll suddenly decide it's "pseudoscience"."

It may be that the notion of strongly superhuman AI runs into preconceptions people aren't willing to give up (possibly of religious origin). But I wonder if the 'Singularians' aren't suffering from a bias of their own. Our current understanding of science and intelligence is compatible with many non-Singularity outcomes:

(a) 'Human-level' intelligence is, for various physical reasons, an approximate upper bound on intelligence.

(b) Scaling past 'human-level' intelligence is possible but difficult due to extremely poor returns (e.g., logarithmic rather than exponential growth past a certain point).

(c) Scaling past 'human-level' intelligence is possible and not difficult, but runs into an inherent 'glass ceiling' far below 'incomprehensibility' of the resulting intelligence.

and so on
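The gap between scenario (b)'s diminishing returns and the exponential growth a Singularity scenario assumes can be sketched with a toy model. All names and parameters here are illustrative assumptions, not anything from the comment:

```python
import math

def capability_log(resources, c=1.0):
    # Toy model of scenario (b): logarithmic returns —
    # doubling the resources adds only a constant increment of capability.
    return c * math.log(resources)

def capability_exp(resources, r=0.1):
    # Contrast: exponential returns, the intuition behind a 'hard takeoff' —
    # each added unit of resources multiplies capability.
    return math.exp(r * resources)

# Effect of doubling resources from 100 to 200 under each model:
print(capability_log(200) - capability_log(100))  # constant gain: log(2) ≈ 0.693
print(capability_exp(200) / capability_exp(100))  # multiplicative gain: e^10 ≈ 22026
```

Under the logarithmic model, every further doubling buys the same fixed increment, so capability flattens out no matter how much hardware is thrown at it; only something like the exponential model produces the runaway trajectory the Singularity picture relies on.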

Many of these scenarios seem as interesting to me as a true Singularity outcome, but my perception is they aren't being given equal time. Singularity is certainly more 'vivid,' but is it more likely?