
lukeprog comments on AGI Quotes - Less Wrong Discussion

Post author: lukeprog 02 November 2011 08:25AM



Comment author: lukeprog 02 November 2011 08:46:50AM, 3 points

The survival of man may depend on the early construction of an ultraintelligent machine — or the ultraintelligent machine may take over and render the human race redundant or develop another form of life. The prospect that a merely intelligent man could ever attempt to predict the impact of an ultraintelligent device is of course unlikely but the temptation to speculate seems irresistible.

Julius Lukasiewicz (1974)

Comment author: [deleted] 02 November 2011 11:29:28PM, 4 points

You have a lot of quotes to share.