timtyler comments on Existential Risk and Public Relations - Less Wrong

36 Post author: multifoliaterose 15 August 2010 07:16AM


Comment author: timtyler 20 November 2010 12:58:29PM * 1 point

...and here's a quote from I.J. Good, from 1965:

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an 'intelligence explosion,' and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make.

He didn't coin the term "Seed AI" either.

Comment author: XiXiDu 20 November 2010 01:01:37PM * 0 points

Yes, but I believe it is a bit weird for a Wikipedia article to state that someone is the originator of the Seed AI theory when he merely coined the term. I wasn't disputing anything, just trying to figure out whether Yudkowsky actually came up with the concept in the first place.

Comment author: timtyler 20 November 2010 01:15:56PM * -1 points

Not the concept - the term.

"Seed AI theory" probably refers to something or other in here - which did indeed originate with Yu'El.

Presumably http://en.wikipedia.org/wiki/Seed_AI should be considered to be largely SIAI marketing material.