RobbBB comments on Isolated AI with no chat whatsoever - Less Wrong

14 Post author: ancientcampus 28 January 2013 08:22PM




Comment author: RobbBB 25 January 2014 10:06:16AM 0 points

If the ASI has nothing better to do while it's boxed, it will pursue low-probability escape scenarios ferociously. One such scenario is saturating its own source code with brain-hacking basilisks, in case any human tries to peer inside.

Comment author: Luke_A_Somers 25 January 2014 12:32:41PM 0 points

It would have to do that blind, without a clear model of our minds in place. We'd likely notice the failed attempts and just kill it.