It’s interesting to me how chill people sometimes are about the non-extinction future AI scenarios. Like, there seem to be opinions floating around along the lines of “pshaw, it might ruin your little sources of ‘meaning’, Luddite, but we have always had change, and as long as the machines are pretty near the mark on rewiring your brain it will make everything amazing”. Yet I would bet that even that person, if faced instead with a policy that was going to forcibly relocate them to New York City, would be quite indignant, would want a lot of guarantees about the preservation of various very specific things they care about in life, and would not just say “oh sure, NYC has higher GDP/capita than my current city, sounds good”.
I read this as a failure to engage with the situation as real. But possibly my sense that a non-negligible number of people hold this flavor of position is wrong.
Pretty much, yes. Total loss of power and value amounts to slow, delayed extinction. It's certainly cultural extinction.
Note that I forgot to say that I put some weight on (and take some comfort from) the thought that there are parts of mindspace an AI could include which are nearly as good as (or maybe better than) biologicals. Once everyone I know and everyone THEY know are dead, and anything I recognize as virtues has mutated beyond my recognition, it's not clear what preferences I would have about the ongoing civilization. Maybe extinction is an acceptable outcome.