Let's assume computationalism and the feasibility of brain scanning and mind upload. And let's suppose one is a person with a large compute budget.
In this post I'll attempt to answer these questions: How should one spend one's compute budget? How many uploads of oneself should one create? Should one terminate one's biological self? What will one's uploaded existence be like?
First, let's establish the correct frame in which to explore the questions relating to the act of upload. One is considering creating copies of oneself. So, what happens, subjectively, when one spins up a new copy of oneself? The copy is computationally identical to the original, and consciousness is computation, so each is equally oneself. But one does not experience being both. This means that creating a copy can be treated as a gamble: there is a 50% chance of finding oneself in each of the two continuations.
What matters, then, is the average quality of one's continuations. Creating many great continuations is no better than creating just one, because one can never experience more than one of them.
Therefore, one should spend all of one's resources on creating the single best possible continuation. And one should terminate one's biological self: the real world is strictly worse than the ASI-curated personal utopia one's upload will experience, so terminating the biological self raises the average quality of one's continuations. To make this point clear: if one did not terminate the biological self, one would subjectively have a 50% chance of ending up in the less favourable biological continuation.
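The arithmetic behind this argument can be made explicit. Here is a minimal sketch of the post's framing, in which subjective experience is a uniform gamble over continuations and only the mean quality matters. The quality scores are made-up illustrative numbers, not anything from the post.

```python
def expected_quality(continuations):
    """Mean quality across continuations (uniform subjective credence)."""
    return sum(continuations) / len(continuations)

utopia = 100      # hypothetical quality of the ASI-curated upload's life
biological = 10   # hypothetical quality of continued biological life

# Keeping the biological self alongside one upload halves the odds of
# finding oneself in the utopian continuation, dragging down the mean:
both = expected_quality([utopia, biological])   # 55.0
upload_only = expected_quality([utopia])        # 100.0

# And many copies of the same utopia add nothing over a single copy:
many_copies = expected_quality([utopia] * 10)   # 100.0
```

Under this toy model, the two conclusions above fall out directly: duplicating the best continuation leaves the expectation unchanged, while retaining any worse continuation lowers it.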
But we can figure out more about this best continuation. One wants to make sure one is spending one's compute selfishly (to whatever extent one is altruistic, one should allocate a proportionate share of one's compute budget to ASI). Value drift is a danger here: one's life expectancy will be absurdly high, maybe undecillions of years of personal utopia, so by default one's beliefs will evolve, one's pre-ASI memories will be forgotten, and every aspect of oneself will change.
Before even a small fraction of one's life has played out, one's copy will bear no relation to oneself. To spend one's compute on this person, effectively a stranger, is just altruism. One would be better off donating the compute to ASI.
So, one should have the ASI give one a more capacious memory and a more rigid personality, and place one in a world more deeply rooted in one's pre-ASI history than would strictly maximise value. This way, one's identity can be retained.
What other questions about the person's post-ASI life can we answer? What will the utopia one's upload inhabits be like? Again, let's establish a frame from which to explore the question. The world will be made by advanced ASI to be the best possible world for one.
The ASI curator of one's world won't want one's experience to be held back by moral considerations for the other inhabitants of one's world. If it were obligated to create a perfect world in which everyone was a standard, human moral patient, it would run into the usual paradoxes of utopia. I am sure that ASI could resolve these paradoxes, but the result would not be the best possible world for one. So, the other inhabitants of one's world will be non-sentient.
However, the awareness of this fact might cheapen one's experience, so the ASI curator would ensure that one was not aware of the non-sentience of the other inhabitants of one's world.
To get some indication of what this world will be like, one can think of a superlative version of one's pre-ASI life. The stakes of the world will be higher, the emotional connections one makes will be stronger, and one will accomplish greater things. One will experience love, passion, and the rest much more fully. So I imagine one's world as a place of vast galactic empires, inexhaustible lore, and immense beauty, and one's life as full of grand struggle and triumph.
Speaking as the biological continuation, I would not want to be terminated.