Eliezer_Yudkowsky comments on Reply to Holden on 'Tool AI' - Less Wrong

94 Post author: Eliezer_Yudkowsky 12 June 2012 06:00PM




Comment author: Eliezer_Yudkowsky 12 June 2012 06:11:46PM 8 points

I wouldn't endorse their significance the same way, and would stand by my statement that although the AGI field as a whole has perceptible risk, no individual project that I know of has perceptible risk. Shane and Demis are cool, but they ain't that cool.

Comment author: lukeprog 12 June 2012 06:21:11PM 8 points

Right. I should have clarified that by "one of the most significant AGI projects I know of" I meant "has a very tiny probability of FOOMing in the next 15 years, which is greater than the totally negligible probability of FOOMing in the next 15 years posed by Juergen Schmidhuber."