Let's assume computationalism and the feasibility of brain scanning and mind upload. And let's suppose one is a person with a large compute budget. 

In this post I'll attempt to answer these questions: How should one spend one's compute budget? How many uploads of oneself should one create? Should one terminate one's biological self? What will one's uploaded existence be like?

First, let's establish the correct frame in which to explore the questions relating to the act of uploading. One is considering creating copies of oneself. So, what happens, subjectively, when one spins up a new copy of oneself? The copy is computationally identical to the original, and consciousness is computation, so each is equally oneself. But one is not both. This means that when one is creating a copy one can treat it as a gamble: there's a 50% chance one finds oneself in each of the two continuations. 

What matters to one is then the average quality of one's continuations. One does not benefit from the creation of many great continuations over just one great continuation, because one can never experience more than one continuation. 

Therefore, one should spend all of one's resources on creating the single best possible continuation. And one should terminate one's biological self: the real world is strictly worse than the ASI-curated personal utopia one's upload will experience, so terminating the biological self increases the average quality of one's continuations. To make this point clear: if one did not terminate the biological self, one would subjectively have a 50% chance of ending up in the less favourable biological continuation.
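To make the arithmetic explicit, here is a minimal sketch of the expected-value reasoning, with purely made-up quality scores. The only assumption is the one argued for above: one is equally likely to find oneself in any surviving continuation.

```python
# Minimal sketch of the expected-value argument, with illustrative numbers.
# Assumption: one is equally likely to find oneself in any surviving continuation.

def expected_quality(continuations):
    """Average quality across the continuations one might find oneself in."""
    return sum(continuations) / len(continuations)

biological = 3.0   # hypothetical quality of remaining in the pre-ASI world
upload = 10.0      # hypothetical quality of the ASI-curated personal utopia

print(expected_quality([biological, upload]))  # 6.5  -- keep both continuations
print(expected_quality([upload]))              # 10.0 -- terminate the biological self
```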

But we can figure out more about this continuation. One wants to make sure one is spending one's compute selfishly (to whatever extent one is altruistic, one should allocate a proportionate amount of one's compute budget to ASI). But value drift is a danger here: one's life expectancy is going to be absurdly high, maybe undecillions of years of personal utopia, so by default one's beliefs will evolve, and one's pre-ASI memories will be forgotten... every aspect of oneself will evolve. 

Before even a small fraction of one's life has played out, one's copy will bear no relation to oneself. To spend one's compute on this person, effectively a stranger, is just altruism. One would be better off donating the compute to ASI. 

So, one should have the ASI alter oneself: give one a more capacious memory and a more rigid personality, and place one in a world more deeply rooted in one's pre-ASI history than would strictly maximise value. This way, one's identity can be retained.

What other questions about the person's post-ASI life can we answer? What will the utopia one's upload inhabits be like? Well, let's again establish a frame from which to explore the question. The world will be made by an advanced ASI to be the best possible world for one. 

The ASI curator of one's world won't want one's experience to be held back by moral considerations for the other inhabitants of one's world. If it were obligated to create a perfect world in which everyone was a standard human moral patient, it would run into the usual paradoxes of utopia. I am sure that ASI could resolve these paradoxes, but the result would not be the best possible world for one. So, the other inhabitants of one's world will be non-sentient. 

However, the awareness of this fact might cheapen one's experience, so the ASI curator would ensure that one was not aware of the non-sentience of the other inhabitants of one's world.

To get some indication of what this world will be like, one can think of a superlative version of one's pre-ASI life. The stakes of the world will be higher, the emotional connections one makes will be stronger, and one will accomplish greater things. One will experience love, passion, etc. much more fully. So, I imagine one's world as being a place of vast galactic empires, inexhaustible lore, immense beauty. And that one's life will be full of grand struggle and triumph.  

6 comments

Suit yourself, but I happen to want to create many great continuations. I enjoy hearing about other people's happiness. I enjoy it more the better I understand them. I understand myself pretty well.

But I don't want to be greedy. I'm not sure a lot of forks of each person are better than making more new people.

Let me also mention that it's probably possible to merge forks. Simply averaging the weight changes in your simulated cortex and hippocampus will approximately work to share the memories across two forks. How far out that works before you start to get significant losses is an empirical matter. Clever modifications to the merge algorithm and additions to my virtual brain should let us extend that substantially; sharing memories across people is possible in broad form with really good translation software, so I expect we'll do that, too.
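A toy sketch of what "averaging the weight changes" could look like, assuming (purely for illustration) that each fork's simulated cortex and hippocampus is represented as a flat array of synaptic weights; a real brain emulation would of course be far more structured, and the function below is hypothetical rather than any actual emulation API.

```python
import numpy as np

def merge_forks(base, fork_a, fork_b):
    """Average the weight changes each fork accumulated since the split,
    then apply the averaged change to the shared pre-split weights."""
    delta_a = fork_a - base          # what fork A learned since the split
    delta_b = fork_b - base          # what fork B learned since the split
    return base + 0.5 * (delta_a + delta_b)

# Stand-in weights, just to show the shapes involved:
rng = np.random.default_rng(0)
base = rng.normal(size=1_000)                        # weights at the moment of forking
fork_a = base + rng.normal(scale=0.01, size=1_000)   # drift accumulated by fork A
fork_b = base + rng.normal(scale=0.01, size=1_000)   # drift accumulated by fork B
merged = merge_forks(base, fork_a, fork_b)
```

How far such a naive average holds up before the two forks' memories interfere is, as noted, an empirical question.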

So in sum, life with aligned ASI would be incredibly awesome. It's really hard to imagine or predict exactly how it will unfold, because we'll have better ideas as we go.

WRT "cheapening" the experience, remember that we'll be able to twist the knobs in our brain for boredom and excitement if we want. I imagine some would want to do that more than others. Grand triumph and struggle will be available for simulated competitive/cooperative challenges; sometimes we'll know we're in a simulation and sometimes we'll block those memories to make it temporarily seem more real and important.

BUT this is all planning the victory party before fighting the war. Let's figure out how we can maximize the odds of getting aligned ASI by working out the complex challenges of getting there on both technical and societal levels.

I would not want to be terminated, speaking as the biological continuation of existence.  

Wow, a lot of assumptions without much justification

Let's assume computationalism and the feasibility of brain scanning and mind upload. And let's suppose one is a person with a large compute budget. 

Already well into fiction.  

But one is not both. This means that when one is creating a copy one can treat it as a gamble: there's a 50% chance one finds oneself in each of the two continuations. 

There's a 100% chance that each of the continuations will find themselves to be ... themselves.  Do you have a mechanism to designate one as the "true" copy?  I don't.

What matters to one is then the average quality of one's continuations

Disagree, but I'm not sure that my preference (some aggregation function with declining marginal impact) is any more justifiable.  It's no less.

Before even a small fraction of one's life has played out, one's copy will bear no relation to oneself. To spend one's compute on this person, effectively a stranger, is just altruism. One would be better off donating the compute to ASI. 

Huh? This supposes that one of them "really" is you, not the actual truth that they all are equal continuations of you. Once they diverge, they're still more like twin siblings of each other, and there is no fact that would elevate one as primary.  

There's a 100% chance that each of the continuations will find themselves to be ... themselves.  Do you have a mechanism to designate one as the "true" copy?  I don't.

Do you think that as each psychological continuation plays out, they'll remain identical to one another? Surely not. They will diverge. So although each is itself, each is a psychological stream distinct from the other, originating at the point of brain scanning. Which psychological stream one-at-the-moment-of-brain-scan ends up in is a matter of chance. As you say, they are all equally "true" copies, yet they are separate. So, which stream one ends up in is a matter of chance or, as I said in the original post, a gamble. 

Disagree, but I'm not sure that my preference (some aggregation function with declining marginal impact) is any more justifiable.  It's no less.

Think of it like this: if one had one continuation in which one lived a perfect life, one would be guaranteed to live that perfect life. But if one had 10 copies in which one lived a perfect life, one would not benefit any further. It's the average that matters.

Huh? This supposes that one of them "really" is you, not the actual truth that they all are equal continuations of you. Once they diverge, they're still more like twin siblings of each other, and there is no fact that would elevate one as primary.  

But one is deciding how to use one's compute at time t (before any copies are made). One at time t is under no obligation to spend one's compute on someone almost entirely unrelated to one just because that person is perhaps still technically oneself. The "once they diverge" statement is beside the point - the decision is made prior to the divergence. 

Wow, a lot of assumptions without much justification

I go into more detail in a post on my Substack (although it's perhaps a lot less readable; I still work from similar assumptions, and one would do best to read the first post in the series first). 

Do you think that as each psychological continuation plays out, they'll remain identical to one another?

They'll differ from one another, and differ from their past singleton self.  Much like future-you differs from present-you.  Which one to privilege for what purposes, though, is completely arbitrary and not based on anything.  

Which psychological stream one-at-the-moment-of-brain-scan ends up in is a matter of chance.

I think this is a crux.  It's not a matter of chance, it's all of them.  They all have qualia.  They all have continuity back to the pre-upload self.  They have different continuity, but all of them have equally valid continuity.

Think of it like this: if one had one continuation in which one lived a perfect life, one would be guaranteed to live that perfect life. But if one had 10 copies in which one lived a perfect life, one would not benefit any further. It's the average that matters.

Sure, just like if a parent has one child or 10 children, they have identical expectations. 

 

I think we're unlikely to converge here - our models seem too distant from each other to bridge.  Thanks for the post, though!

So you have the god transform you into a soul, send you down to earth to experience life? You'll probably want to grow and change to some degree while still being recognizable as the same soul. You'll eventually prefer living lives that challenge and grow you in different ways rather than just all the feelings all the time, and will choose this outside the simulation. Perhaps as more people come to this conclusion, the optimal scenarios start to include sentient randomness, and real people are put in the same world. At some point you have immature souls in their little utopias, maturing souls filling and creating a full world, and perhaps some mature souls only popping in to guide younger souls. 

The world of maturing souls develops with difficulty and hardship; slowly the souls and their progress shine through till you get a world trying to solve difficulty. As they manage, a new world is opened for those souls no longer satisfied, and non-sentients fill in more again. Eventually they create ASI, and people start the cycle again until the real ASI decides to wake them up (they may request to go back in, with the fractal experience a boon).

 

At some point the ASIs, all the way up to the Alpha-Omega, pull the last soul out and into the Kingdom.

 

Turtles all the way down, reality-creating beings all the way up until Him. 
