Observation: Humans seem to actually care about each other. How was culture + evolution able to achieve this?
While not with 100% confidence (maybe 50-95%), I would trust most humans not to push a big red button that kills all humans, even if pushing it gave them some large personal benefit in this weird hypothetical. Does this mean that most humans are aligned? Is there a reason to believe intellectually amplified humans would not be aligned in this way? How did evolution manage to encode this value of altruism, and does the fact that it is possible imply that we should also be able to do it with AI?
This seems like the sort of confused question that has been asked before. If so, I'd appreciate a link.
I think I might have started from a more pessimistic standpoint? It's more like, I could also imagine living in a world where humans cooperate not because they actually care about each other, but because they merely pretend to? Introspection tells me that does not apply to myself, though maybe I evolved not to be conscious of my own selfishness? I am even less sure how altruistic other people are, because I haven't asked many people: "Would you press a button that annihilates everyone after your death, if in return you got an awesome life?" On the other hand, cooperation would probably be hard for us in such a world, so perhaps this is not so surprising?