Robot Programmed To Love Goes Too Far (link)

-5 Post author: Alexei 28 April 2012 01:21AM

http://www.muckflash.com/?p=200

Might be a nice story to point out to people who think "friendly" is easy.


Comments (11)

Comment author: Rain 28 April 2012 02:00:40AM *  4 points [-]

Maybe it was based on Hugbot.

Comment author: shminux 28 April 2012 01:28:47AM *  -1 points [-]

An aspiring rationalist ought to be less gullible and able to spot a hoax like that. Did you feel any sense of confusion when reading it and looking at the picture?

Comment author: Raemon 28 April 2012 03:06:20AM 1 point [-]

I first read that article a while ago, and was taken in until I read some other articles on the site. It does sound plausible to me that such a robot might be built someday and go wrong in a similar fashion.

I'm neither upvoting nor downvoting the OP because I think the article is interesting a) as a test of rationality/skepticism, and b) as a simple expression of Friendly AI concepts that are worth discussing, as a thought experiment at least.

Comment author: pedanterrific 28 April 2012 03:12:45AM 0 points [-]

Since it's not made obvious in the title or body of the post, maybe this comment would benefit from rot13? Just a suggestion.

Comment author: shminux 28 April 2012 04:04:32AM 0 points [-]

What would be the reason for obfuscating?

Comment author: pedanterrific 28 April 2012 04:07:26AM 1 point [-]

I read your comment before I clicked on the link, so I didn't get the chance to spot the hoax for myself.

Comment author: shminux 28 April 2012 05:15:34AM 0 points [-]

Ah, makes sense. Thanks.

Comment author: vi21maobk9vp 28 April 2012 07:07:11AM 1 point [-]

What about actually applying rot13? Here, this is what you need to copy-paste:

Na nfcvevat engvbanyvfg bhtug gb or yrff thyyvoyr naq noyr gb fcbg n ubnk yvxr gung. Qvq lbh srry nal frafr bs pbashfvba jura ernqvat vg naq ybbxvat ng gur cvpgher?