It’s interesting to me how chill people sometimes are about the non-extinction future AI scenarios. Like, there seem to be opinions floating around along the lines of “pshaw, it might ruin your little sources of ‘meaning’, Luddite, but we have always had change, and as long as the machines are pretty near the mark on rewiring your brain it will make everything amazing”. Yet I would bet that even that person, if faced instead with a policy that was going to forcibly relocate them to New York City, would be quite indignant, would want a lot of guarantees about the preservation of various very specific things they care about in life, and would not just say “oh sure, NYC has higher GDP/capita than my current city, sounds good”.
I read this as a failure to engage with the situation as real. But possibly my sense that a non-negligible number of people hold this flavor of position is wrong.
What does "value" mean here? I seriously don't know what you mean by "total loss of value". Is this tied to your use of "economically important"?
I personally don't give a damn about anybody else depending on me as the source of anything they value, at least not with respect to anything that's traditionally spoken of as "economic". In fact I would prefer that they could get whatever they wanted without involving me, and I could get whatever I wanted without involving them.
And power over what? Most people right this minute have no significant power over the wide-scale course of anything.
I thought "extinction", whether for a species or a culture, had a pretty clear meaning: It doesn't exist any more. I can't see how that's connected to anything you're talking about.
I do agree with you about human extinction not necessarily being the end of the world, depending on how it happens and what comes afterwards... but I can't see how loss of control, or value, or whatever, is connected to anything that fits the word "extinction". Not physical, not cultural, not any kind.