I guess there’s maybe a 10–20% chance of AI causing human extinction in the coming decades, but I feel more distressed about it than even that suggests. I think that’s because, in the case where it doesn’t cause human extinction, I find it hard to imagine life not going kind of off the rails. So many things I like about the world seem likely to be over or badly disrupted by superhuman AI (writing, explaining things to people, friendships where you can be of any use to one another, taking pride in skills, thinking, learning, figuring out how to achieve things, making things, easily tracking what is and isn’t conscious), and I don’t trust that the replacements will be actually good, or good for us, or that anything will be reversible.
Even if we don’t die, it still feels like everything is coming to an end.
Did you miss transhumanism? If being useful is truly important to you, alignment would mean that a superintelligence will find a way to lift you up and give you a role.
I suppose there might be a period during which we've figured out existential security but the FASI (friendly artificial superintelligence) hasn't figured out human augmentation beyond the high-priority stuff like curing aging. I wouldn't expect that period to be long.