niceguyanon comments on Yet More "Stupid" Questions - Less Wrong Discussion

4 Post author: NancyLebovitz 08 September 2013 02:18PM

Comment author: niceguyanon 08 September 2013 10:35:35PM * 5 points

The closest thing we have in real life to the 'rational agent' concept of game theory and artificial intelligence is the psychopath. Taking this idea further, it's easy to see why a rational superintelligence would become a UFAI: it is a psychopath.

This doesn't quite seem right, and here is why: my utility function considers other people's utility functions, so acting rationally and maximizing my utility still leaves room for empathy toward others. You only get psychopathy if the rational agent's utility function is itself psychopathic, and most people's utility functions are not.
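The point can be made concrete with a toy model (a sketch of my own, not something from the thread): a one-shot Prisoner's Dilemma where each agent maximizes its own payoff plus an empathy weight `alpha` times the other player's payoff. The payoff matrix and the `alpha` parameter are illustrative assumptions; `alpha = 0` is the purely selfish ("psychopathic") utility from standard game-theory setups, while `alpha > 0` builds empathy directly into the utility function.

```python
# Standard one-shot Prisoner's Dilemma payoffs:
# (my_move, their_move) -> (my_payoff, their_payoff)
PAYOFFS = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def best_response(their_move, alpha):
    """Return the move maximizing own payoff + alpha * other's payoff."""
    def utility(my_move):
        mine, theirs = PAYOFFS[(my_move, their_move)]
        return mine + alpha * theirs
    return max(("C", "D"), key=utility)

# Against a cooperator, a purely selfish agent defects (5 > 3),
# but an empathic agent with alpha = 1 cooperates (3 + 3 > 5 + 0).
print(best_response("C", alpha=0.0))  # -> D
print(best_response("C", alpha=1.0))  # -> C
```

Both agents are maximizing a utility function perfectly rationally; the behavior we'd call psychopathic comes entirely from the `alpha = 0` utility, not from the rationality.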

Comment author: passive_fist 08 September 2013 10:59:55PM 0 points

This doesn't quite seem right, and here is why: my utility function considers other people's utility functions, so acting rationally and maximizing my utility still leaves room for empathy toward others.

Yes, this is what I said. Usually in game-theory setups, though, empathy is not included in the utility function. That's what I meant; sorry if it was unclear. You're right that an agent can be rational and empathic at the same time.

Comment author: Luke_A_Somers 09 September 2013 10:55:29AM 4 points

Then the game theory experiments leaving no room for empathy are straw Vulcans.