Would you say that the AI is faulty?
Yes. It might be doing exactly what it was designed to do, but its designer was clearly stupid or cruel and had different goals than I'd prefer the AI to have.
Extrapolate this to humans. Humans wouldn't care so much about status if it weren't for flaws like scope insensitivity, self-serving bias, etc., as well as simply poor design "goals".
Where are you getting your goals from? What are you, except your design? You are what Azathoth built. There is no ideal you that you should've become, but which Azathoth failed to make.
I have become convinced that problems of this kind are the number one problem humanity has. I'm also pretty sure that most people here, no matter how much they've been reading about signaling, still fail to appreciate the magnitude of the problem.
Here are two major screw-ups and one narrowly averted screw-up that I've been guilty of. See if you can find the pattern.
It may not be immediately obvious, but all three examples have something in common. In each case, I thought I was working for a particular goal (become capable of doing useful Singularity work, advance the cause of a political party, do useful Singularity work). But as soon as I set that goal, my brain automatically and invisibly re-interpreted it as the goal of doing something that gave the impression of doing prestigious work for a cause (spending all my waking time working, being the spokesman of a political party, writing papers or doing something else few others could do). "Prestigious work" could also be translated as "work that really convinces others that you are doing something valuable for a cause".
We run on corrupted hardware: our minds are composed of many modules, and the modules that evolved to make us seem impressive and gather allies also evolved to subvert the ones holding our conscious beliefs. Even when we believe that we are working on something that may ultimately determine the fate of humanity, our signaling modules may hijack our goals so as to optimize for persuading outsiders that we are working on the goal, instead of optimizing for actually achieving it!
You can see this all the time, everywhere:
There's an additional caveat to be aware of: it is actually possible to fall prey to this problem while purposefully attempting to avoid it. You might realize that you have a tendency to only want to do particularly prestigious work for a cause... so you decide to only do the least prestigious work available, in order to prove that you are the kind of person who doesn't care about the prestige of the task! But you are still optimizing your actions on the basis of expected prestige and being able to tell yourself and outsiders an impressive story, not on the basis of your marginal impact.