muflax comments on The curse of identity - Less Wrong
I don't understand why you call this a problem. If I understand you correctly, you are proposing that people constantly and strongly optimize to obtain signalling advantages. They do so without becoming directly aware of it, which further increases their efficiency. So we have a situation where people want something and choose an efficient way to get it. Isn't that good?
More directly, I'm confused as to how you can look at an organism, see that it uses its optimization power in a goal-oriented and efficient way (status gains in this case), and call that problematic, merely because some of these organisms disagree that this is their actual goal. What would you want them to do - be honest and thus handicap their status seeking?
Say you play many games of Diplomacy against an AI, and the AI often promised to be loyal to you, but backstabbed you many times to its advantage. You look at the AI's source code and find out that it has backstabbing as a major goal, but the part that talks to people isn't aware of that, so that it can lie better. Would you say that the AI is faulty? That it is wrong and should make the talking module aware of its goals, even though this causes it to make more mistakes and thus lose more? If not, why do you think humans are broken?
Yes. It might be doing exactly what it was designed to do, but its designer was clearly stupid or cruel and had different goals than I'd prefer the AI to have.
Extrapolate this to humans. Humans wouldn't care so much about status if it weren't for flaws like scope insensitivity, self-serving bias, etc., as well as simply poor design "goals".
Where are you getting your goals from? What are you, except your design? You are what Azathoth built. There is no ideal you that you should've become, but which Azathoth failed to make.
Azathoth designed me with conflicting goals. Subconsciously, I value status, but if I were offered a pill that made me care entirely about making the world better and nothing else, I would take it. Just because "evolution" built that into me doesn't make it bad, but it definitely did not give me a coherent volition. I have determined for myself which parts of humanity's design are counterproductive, based on the thousand shards of desire.
Would you sign up to be tortured so that others don't suffer dust specks?
("If we are here to make others happy, what are the others here for?")
A better analogy would be asking about a pill that caused pain asymbolia.
Yes.