Ah ok. I was assuming that if a singularity occurred it'd be beyond our control, and that our fate would be determined by how the AI was originally programmed. But my reason for assuming this is based on very limited information, so I don't really know. If it turns out that people with political power control AI, then I think you're very right.
But if you're right and we live in a society where ASI-level power is controlled by people with political power... that really scares me. My intuition is that it'd only be a matter of time before someone screws up. I'm not sure what to think of this...
Our beliefs aren't just cargo that we carry around. They become part of our personal identity, so much so that we feel hurt when we see someone attacking our beliefs, even if the attacker isn't addressing us individually. These "beliefs" are not necessarily grand things like moral frameworks and political doctrines; they can be as inconsequential as an opinion about a song.
This post is for discussing times when you actually changed your mind about something, detaching yourself from a belief that had wrapped itself around your identity.
Relevant reading: The Importance of Saying "Oops", Making Beliefs Pay Rent