Bugmaster comments on So You Want to Save the World - Less Wrong

41 Post author: lukeprog 01 January 2012 07:39AM




Comment author: Bugmaster 07 January 2012 03:31:56AM 1 point

Amplified human intelligence is no match for recursively self-improved AI, which is inevitable if science continues.

Just to clarify, when you say "recursively self-improved", do you also imply something like "unbounded" or "with an unimaginably high upper bound"? If the AI managed to improve itself to, say, regular human genius level and then stopped, it wouldn't really be that big of a deal.

Comment author: lukeprog 07 January 2012 04:11:50AM 0 points

Right; with a high upper bound. There is plenty of room above us.