JamesAndrix comments on Morality as Fixed Computation - Less Wrong

Post author: Eliezer_Yudkowsky 08 August 2008 01:00AM


Comment author: JamesAndrix 08 August 2008 03:14:38PM 1 point

If you try to rule out a specific class of ways the AI could modify the programmer, the AI has a motive to superintelligently seek out loopholes and ways to modify the programmer indirectly.

There's a sci-fi story in there somewhere: an AI that always follows orders manipulates everyone into ordering it to do something horrible.

Of course, the humans would realize this at the last minute and stop themselves.

Comment author: David_Gerard 10 February 2011 12:49:27PM -1 points

Were you thinking of "All the Troubles of the World" by Isaac Asimov?