DanielLC comments on Open Thread, May 19 - 25, 2014 - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Asking "Would an AI experience emotions?" is akin to asking "Would a robot have toenails?"
There is little functional reason for either to have them, but they would if someone designed them that way.
Edit: the background for this comment - I'm frustrated by the way AI is represented in (non-rationalist) fiction.
Define "emotion".
I find it highly unlikely that robots would have anything corresponding to any given human emotion. But if you just look at the general area in thingspace that emotions occupy, and you're perfectly okay with the idea of finding a new one, then it would be perfectly reasonable for robots to have emotions. For one thing, general negative and positive emotions would be pretty important for learning.
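The learning point has a concrete analogue in reinforcement learning, where a bare scalar reward (a "positive/negative" signal and nothing more) is enough to drive learning. A minimal sketch, assuming an epsilon-greedy bandit learner; all names and parameters here are illustrative, not from the comment:

```python
import random

def learn_bandit(true_means, steps=5000, epsilon=0.1, seed=0):
    """Epsilon-greedy bandit learner: the only feedback is a noisy
    scalar reward, yet it suffices to find the best arm."""
    rng = random.Random(seed)
    estimates = [0.0] * len(true_means)  # running reward estimates
    counts = [0] * len(true_means)       # pulls per arm
    for _ in range(steps):
        if rng.random() < epsilon:
            # explore: pick a random arm
            arm = rng.randrange(len(true_means))
        else:
            # exploit: pick the arm with the best estimate so far
            arm = max(range(len(true_means)), key=lambda a: estimates[a])
        reward = true_means[arm] + rng.gauss(0, 1)  # noisy scalar signal
        counts[arm] += 1
        # incremental sample-average update
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return estimates

estimates = learn_bandit([0.1, 0.9, 0.4])
best_arm = max(range(3), key=lambda a: estimates[a])
```

Nothing in this loop resembles a human emotion, which is the point: the functional role (a valence signal shaping behavior) can be filled without anything from the human region of thingspace.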