It’s interesting to me how chill people sometimes are about the non-extinction future AI scenarios. Like, there seem to be opinions floating around along the lines of: “pshaw, it might ruin your little sources of ‘meaning’, Luddite, but we have always had change, and as long as the machines are pretty near the mark on rewiring your brain, it will make everything amazing.” Yet I would bet that even that person, if faced instead with a policy that was going to forcibly relocate them to New York City, would be quite indignant, would want a lot of guarantees about the preservation of various very specific things they care about in life, and would not just be like “oh sure, NYC has higher GDP/capita than my current city, sounds good.”
I read this as a failure to engage with the situation as real. But it’s possible my sense that a non-negligible number of people hold this flavor of position is wrong.
I think the OP’s point is that while YOU might think NYC is a great place, not everybody does. One of the nice things about the current model is that you can move to NYC if you want to, but you don’t have to. In the hypothetical All-AGI All Around The World future, you get moved there whether or not you like it. Some people will like it, but it’s worth thinking about the people who won’t, and considering what you might do to make that future better for them as well.