Eliezer_Yudkowsky comments on Above-Average AI Scientists - Less Wrong

Post author: Eliezer_Yudkowsky 28 September 2008 11:04AM

Comment author: Eliezer_Yudkowsky 30 September 2008 03:11:00AM 7 points

Scott, if the question you're asking is "Can they learn something by doing this?" and not "Can they build AGI?" or "Can they build FAI?" a whole different standard applies. You can also learn something by trying to take apart an alarm clock.

Much of life consists of holding yourself to a high enough standard that you actually make an effort. If you're going to learn, just learn - get a textbook, try problems at the appropriate difficult-but-not-too-hard level. If you're going to set out to accomplish something, don't bait-and-switch to the "Oh, but I'll learn something even if I fail" excuse when it looks like you might fail. Yoda was right: if you're going to do something, set out to do it; don't set out to try.