
hankx7787 comments on AGI Quotes - Less Wrong Discussion

Post author: lukeprog | 02 November 2011 08:25AM



Comment author: hankx7787 | 04 November 2011 11:49:32AM | 14 points

"Sorry Arthur, but I'd guess that there is an implicit rule about announcement of an AI-driven singularity: the announcement must come from the AI, not the programmer. I personally would expect the announcement in some unmistakable form such as a message in letters of fire written on the face of the moon." - Dan Clemmensen, SL4

Comment author: MichaelAnissimov | 18 November 2011 08:04:22PM | 1 point

This is one of the earliest quotes I read that made it click for me that nothing I could do with my life would have a greater impact than pursuing superintelligence.