I guess there’s maybe a 10-20% chance of AI causing human extinction in the coming decades, but I feel more distressed about it than even that suggests—I think because in the case where it doesn’t cause human extinction, I find it hard to imagine life not going kind of off the rails. So many things I like about the world seem likely to be over or badly disrupted with superhuman AI (writing, explaining things to people, friendships where you can be of any use to one another, taking pride in skills, thinking, learning, figuring out how to achieve things, making things, easy tracking of what is and isn’t conscious), and I don’t trust that the replacements will be actually good, or good for us, or that anything will be reversible.
Even if we don’t die, it still feels like everything is coming to an end.
How distressed would you be if the "good ending" were opt-in and existed somewhere far away from you? I've explored the future and found one version that I think would satisfy your desire, but I'm asking to get your perspective. Does it matter whether there are superintelligent AIs if they leave our existing civilization alone, create a new one out on the fringes (the Arctic, Antarctica, or out in space), and invite any humans to join them without coercion? If you need more details, they're available at the Opt-In Revolution, in narrative form.