shokwave comments on Ben Goertzel: The Singularity Institute's Scary Idea (and Why I Don't Buy It) - Less Wrong

Post author: ciphergoth 30 October 2010 09:31AM




Comment author: shokwave 01 November 2010 07:25:46AM 4 points

The scale is all off.

earthworm --three orders of magnitude--> small lizard --three orders of magnitude--> dog --three orders of magnitude--> human --thirty orders of magnitude--> weakly superhuman AGI --several thousand orders of magnitude--> strong AI
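For concreteness, here is a minimal sketch (my own illustration, not from the comment) of what this ladder implies if each "order of magnitude" is read as a factor of 10 on some single capability metric, arbitrarily taking "several thousand" as 3,000:

```python
# Hypothetical illustration: treat each step in the ladder as a
# multiplicative gap of 10^k on an assumed capability metric.
# Step sizes come from the ladder above; "several thousand" is
# taken as 3000 purely for the sake of the arithmetic.
steps = [
    ("earthworm -> small lizard", 3),
    ("small lizard -> dog", 3),
    ("dog -> human", 3),
    ("human -> weakly superhuman AGI", 30),
    ("weakly superhuman AGI -> strong AI", 3000),
]

cumulative = 0
for label, k in steps:
    cumulative += k
    print(f"{label}: x10^{k} (cumulative x10^{cumulative})")
```

The arithmetic makes the point: the human-to-AGI gap (10^30) already dwarfs the entire earthworm-to-human range (10^9), so a self-improving process that halts within "pet-owner" distance of us is stopping at an oddly narrow target.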

If a recursively self-improving process stopped just far enough above us to consider us pets, and then did so, I would seriously question whether it was genuinely recursive, or whether its gains came merely from debugging and streamlining human thought processes. That is, I could see a self-modifying transhuman acting in the manner you describe, but not an artificial intelligence, not unless it was very carefully designed.