ciphergoth comments on The curse of identity - Less Wrong

121 Post author: Kaj_Sotala 17 November 2011 07:28PM


Comment author: [deleted] 17 November 2011 01:48:17PM · 2 points

I don't understand why you call this a problem. If I understand you correctly, you are proposing that people constantly and strongly optimize to obtain signalling advantages. They do so without becoming directly aware of it, which further increases their efficiency. So we have a situation where people want something and choose an efficient way to get it. Isn't that good?

More directly, I'm confused how you can look at an organism, see that it uses its optimization power in a goal-oriented and efficient way (status gains in this case), and call that problematic, merely because some of these organisms disagree that this is their actual goal. What would you want them to do: be honest and thus handicap their status-seeking?

Say you play many games of Diplomacy against an AI, and the AI often promised to stay loyal to you, but backstabbed you many times to its advantage. You look at the AI's source code and find out that it has backstabbing as a major goal, but the part that talks to people isn't aware of that, so that it can lie better. Would you say that the AI is faulty? That it is wrong and should make the talking module aware of its goals, even though this would cause it to make more mistakes and thus lose more often? If not, why do you think humans are broken?

Comment author: ciphergoth 17 November 2011 02:15:05PM · 4 points

It's a problem from the point of view of that part of me that actually wants to achieve large scale strategic goals.

Comment author: [deleted] 17 November 2011 02:25:08PM · 2 points

Honest question: how do you know you have these goals? Presumably they don't manifest in actual behavior, or you wouldn't have a problem. If Kaj's analysis is right, shouldn't you assume that the belief that you have these goals is itself part of your (working) strategy to gain a certain kind of status? Would you accept the same argument if Bruce made it?

Comment author: ciphergoth 17 November 2011 02:28:00PM · 10 points

Put it this way: if there were a pill that I believed would cause me to effectively have that goal, in a way compatible with a livable life, I would take it.