Randaly comments on Wanted: backup plans for "seed AI turns out to be easy" - Less Wrong

Post author: Wei_Dai 28 September 2011 09:54PM


Comment author: Randaly 29 September 2011 01:05:32AM 0 points

Put in place some stopgap safety measures (the Three Laws, regular check-ins by human controllers, preventing it from FOOMing too large, etc.), then tell the AI to upload one or more humans. Shut down the AI, have the uploaded humans FOOM safely, then upload the rest of humanity.

Comment author: FeepingCreature 29 September 2011 09:14:38AM 3 points

The Three Laws are most decidedly not safe, and in fact should be discarded and discredited. The First Law in particular, "do not allow, through inaction, a human to come to harm", can be trivially interpreted in various bad-end ways. Read The Metamorphosis of Prime Intellect for a fictional sample.

Comment author: Eneasz 29 September 2011 07:24:26PM 10 points

I never read the original source, but wasn't the very story that introduced the Three Laws an exercise in discrediting those laws? If so, how the heck does everyone keep coming to the opposite conclusion? It seems similar to using 1984 as an example of why we should have ubiquitous surveillance.

Comment author: FeepingCreature 29 September 2011 08:58:16PM 4 points

Talk about Streisand Effect.

Comment author: lessdazed 29 September 2011 10:53:21PM 1 point

Not just the original story: literally hundreds of other stories went on to make the same point. The Three Laws fail in hundreds of unique ways, depending upon the situation.

Comment author: ArisKatsaris 29 September 2011 10:56:09PM 2 points

But in the universe Asimov was portraying, these failures were still mostly the exceptions, and the vast majority of robots were safe because of the Three Laws. So his stories weren't really "discrediting those laws" at all.

Comment author: lessdazed 29 September 2011 11:15:16PM 3 points

In multiple cases it was a newly advanced robot that was different in kind from the others. Toasters work fine under the Three Laws; even in Terminator the humans are shown with obedient guns and didn't insist on fighting bare-handed.

In other cases, the robot was the same model as well-behaved ones, but had an error that made it conscious, or something like that.

You're right that the stories can't all be characterized the way I characterized them. There was a lot of variety; he made a career of them, and didn't do it by writing the same story again and again.