
Stuart_Armstrong comments on An example of deadly non-general AI - Less Wrong Discussion

13 points · Post author: Stuart_Armstrong · 21 August 2014 02:15PM




Comment author: Stuart_Armstrong 21 August 2014 02:43:15PM 1 point

> who promptly changes the criteria for success and tries again.

Until they stumble upon an AI that lies, possibly inadvertently, and then we're dead...

But I do agree that general intelligence is more dangerous; it's just that narrow intelligence isn't harmless.

Comment author: [deleted] 21 August 2014 06:07:41PM 1 point

How do you convincingly lie without having the capability to think up a convincing lie?

Comment author: Stuart_Armstrong 22 August 2014 10:05:44AM 2 points

Every statement an AI tells us will be a lie to some extent, simply in terms of being a simplification so that we can understand it. If we end up selecting against simplifications that reveal nefarious plans...

But the narrow AI I described above might not even be capable of lying. It might simply spit out the drug design, with a list of estimated improvements according to the criteria it's been given, without anyone ever realising that "reduced mortality" was code for "everyone's dead already".

Comment author: Luke_A_Somers 22 August 2014 11:23:40AM 1 point

> Every statement an AI tells us will be a lie to some extent, simply in terms of being a simplification so that we can understand it.

Not so. You can definitely ask questions about complicated things that have simple answers.

Comment author: Stuart_Armstrong 22 August 2014 12:08:24PM 2 points

Yes, that was an exaggeration; I was thinking of most real-world questions.

Comment author: Luke_A_Somers 22 August 2014 06:51:53PM * 3 points

I was thinking of most real-world questions that aren't of the form 'Why X?' or 'How do I X?'.

"How much/many X?" -> number

"When will X?" -> number

"Is X?" -> boolean

"What are the chances of X if I Y?" -> number

Also, any answer that simplifies isn't a lie if its simplified status is made clear.

Comment author: VAuroch 22 August 2014 12:41:10AM 2 points

Think you're telling the truth.

Comment author: Nornagest 22 August 2014 01:00:44AM * 4 points

Or be telling the truth, but be misinterpreted.