shokwave comments on Ben Goertzel: The Singularity Institute's Scary Idea (and Why I Don't Buy It) - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
The scale is all off.
earthworm --three orders of magnitude--> small lizard --three orders of magnitude--> dog --three orders of magnitude--> human --thirty orders of magnitude--> weakly superhuman AGI --several thousand orders of magnitude--> strong AI
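Taking the comment's figures literally (the exponents are illustrative rhetoric, not measured values, and "several thousand" is pinned at 3000 purely for the sake of arithmetic), the gaps can be tallied up:

```python
# Hypothetical capability gaps from the comment, in orders of magnitude.
# All numbers are rhetorical, not empirical; "several thousand" is taken as 3000.
gaps = [
    ("earthworm -> small lizard", 3),
    ("small lizard -> dog", 3),
    ("dog -> human", 3),
    ("human -> weakly superhuman AGI", 30),
    ("weakly superhuman AGI -> strong AI", 3000),
]

# Cumulative scale relative to an earthworm.
total = 0
for step, exponent in gaps:
    total += exponent
    print(f"{step}: 10^{exponent} (cumulative 10^{total})")

# The entire biological span is dwarfed by the post-human span.
biological_span = 3 + 3 + 3      # earthworm to human: 9 orders of magnitude
post_human_span = 30 + 3000      # human to strong AI: 3030 orders of magnitude
print(post_human_span / biological_span)
```

The point of the arithmetic: the earthworm-to-human range, which covers all of biological cognition, is a rounding error next to the human-to-strong-AI range the comment sketches.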
If a recursively self-improving process stopped just far enough above us to consider us pets, and did so, I would seriously question whether it was genuinely recursive, or whether its gains came merely from debugging and streamlining the human thought process. That is, I could see a self-modifying transhuman acting in the manner you describe, but not an artificial intelligence, unless it was very carefully designed.