Eliezer_Yudkowsky comments on Sayeth the Girl - Less Wrong

Post author: Alicorn 19 July 2009 10:24PM


Comment author: SoullessAutomaton 19 July 2009 11:59:45PM 16 points

The more significant issue is the lack of respect for autonomy and the other individual's goals. It is, shall we say, "unFriendly".

It's perfectly possible to have excellent models of other people's psyches but no respect for their autonomy; in fact, that combination is a useful skill in sales and marketing. In the pathological extreme, it's popularly called "sociopathy".

Comment author: Eliezer_Yudkowsky 20 July 2009 01:44:45AM 7 points

"UnFriendly" is supposed to be a technical term covering a tremendous range of AIs. What do you mean by it in this context? Flawed fun theory? Disregard for volition?

Comment author: SoullessAutomaton 20 July 2009 02:51:29AM 8 points

In this specific case, the disregard for volition. In the more general sense, I'm stretching the term by analogy to describe any behavior by an agent with a significant power advantage that wouldn't be called "Friendly" if done by an AI with a comparable power advantage over humans.

The implicit step here, I think, is that whatever value system an FAI would have would also make a pretty good value system for any agent in a position of power, allowing for limitations of cognitive potential.

Comment author: JulianMorrison 20 July 2009 01:53:18AM -2 points

Mostly disregard for volition, but also satisficing too early on fun.