DanArmak comments on David Chalmers' "The Singularity: A Philosophical Analysis" - Less Wrong

Post author: lukeprog 29 January 2011 02:52AM


Comment author: DanArmak 29 January 2011 04:17:08PM 6 points [-]

If the paperclipper happens to be the first AI++, and arrives before humanity goes interstellar, then it can probably wipe out all of humanity quite quickly without reasoning with it. And if it can do that, it definitely will - no point in compromising when you've got the upper hand.

Comment author: wedrifid 29 January 2011 04:34:17PM 5 points [-]

no point in compromising when you've got the upper hand.

Well, at least not when the lower hand is more use disassembled to build more cosmic commons burning spore ships.