I was recently arguing in /r/transhumanism on reddit about the viability of uploading/forking consciousness, and I realized I didn't have any method of assessing where someone's beliefs actually lay - where I might need to move them from if I wanted to convince them of what I thought.
So I made an intuition ladder. Please correct me if I made any mistakes (that aren't by design), and let me know if you think there's anything past the final level.
Some instructions on how to use this: Read the first level. If you notice something definitely wrong with it, move to the next level. Repeat until you come to a level where your intuition about the entire level is either "This is true" or "I'm not sure." That is your level.
1. Clones and copies (the result of a medical procedure that physically reproduces you exactly, including internal brain state) are the same thing. Every intuition I have about a clone, or an identical twin, applies one-to-one to copies as well, and vice versa. Because identical twins are completely different people on every level except genetically, copies are exactly the same way.
2. Clones and copies aren't the same thing, as copies had a brain and memories in common with me in the past, but for one of us those memories are false and that copy is just a copy, while my consciousness would remain with the privileged original.
3. Copies had a common brain and memories, which make them indistinguishable from each other in principle, so they believe they're me, and they're not wrong in any meaningful sense, but I don't anticipate waking up from any copying procedure in any body but the one I started in. As such, I would never participate in a procedure that claims to "teleport" me by making a copy at a new location and killing the source copy, because I would die.
4. Copies are indistinguishable from each other in principle, even from the inside, and thus I actually become both, and anticipate waking up as either. But once I am one or the other, my copy doesn't share an identity with me. Furthermore, if a copy is destroyed before I wake up from the procedure, I might die, or I might wake up as the copy that is still alive. As such, the fork-and-die teleport is a gamble for my life, and I would only attempt it if I were for some reason comfortable with the chance that I would die.
5. If a copy is destroyed during the procedure, I will wake up as the other one with near certainty, but this is a discrete consequence of how soon the destruction happens: if one copy were to die shortly after the procedure instead, I wouldn't be any less likely to wake up as that one. I am therefore willing to fork-and-die teleport as long as the procedure is flawless. Furthermore, if I were instead backed up and copied from the backup at a later date, I would certainly wake up immediately after the procedure, and would not anticipate waking up subjectively-immediately as the backup copy in the future.
6. My anticipation of waking up as a given copy is a continuous function: I anticipate with less likelihood waking up as a copy that will die soon after the procedure, or that for some other reason has a lower amplitude according to the Born rule (see the sketch after the ladder). It's also entirely irrelevant to my anticipation of what I experience when the copy is instantiated, as long as the copy has the mind state I had when the procedure was done. However, consciousness can only transfer to copies made of me. Even in principle, I can never wake up as an identical mind state somewhere else in the universe - if such a thing were to exist - unless it was a result of copying.
7. Continuity of consciousness is entirely an artifact of mind state, including memory, and need not require adjacency in spacetime at all. If, by some miraculous coincidence, in a galaxy far far away, a person exists at some time t' who is exactly identical to me at some time t in my life - identical in the way a copy made of me at t would be - then at the moment t I anticipate my consciousness transferring to that far-away not-copy with some probability. The only reason this doesn't happen is the sheer unlikelihood of an exact mind state, memories and all, being duplicated by happenstance anywhere in spacetime, even given the age of the universe from beginning to end. However, my consciousness can only be implemented on a human brain, or something that precisely mimics its internal structure.
8. Copies of me need not be, or even resemble, a human being. I am just an algorithm, and the hardware I am implemented on is irrelevant: whether it runs on a microchip or in a human brain, any implementation of me is me. However, simulations aren't truly real, so an implementation of me in a simulated world, no matter how advanced, isn't actually me, or conscious to the extent I am in the reality I know.
9. Implementations of me can exist within simulations that are sufficiently advanced to implement me fully. If a superintelligence that is able to perfectly model human minds uses that ability to consider what I would do, its model of me is me. Indeed, the only way to model me perfectly is to implement me.
10. In progress, see Dacyn's comment below.
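To make level 6's Born-rule weighting concrete, here is a minimal sketch of one way it could be formalized. The notation is my own - the copy index i, the amplitudes \alpha_i, and the weights p_i are assumptions for illustration, not anything established above:

```latex
% Minimal sketch: if a procedure leaves surviving copies indexed by i,
% each carrying amplitude \alpha_i, level 6 weights the anticipation of
% waking up as copy i by the Born rule:
\[
  p_i = \frac{|\alpha_i|^2}{\sum_j |\alpha_j|^2},
  \qquad
  \sum_i p_i = 1 .
\]
```

On this reading, a copy destined to die soon after the procedure (or otherwise carrying lower amplitude) gets a continuously smaller weight, rather than the all-or-nothing anticipation of the earlier levels.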
Let me try to see where I am on the ladder by critiquing each level. Most of them I start out agreeing with, and then you add a conclusion that I don't think follows from the lead-in. I think you've got an unstated and likely incorrect assumption that "me" in the past, "me" as experiencing something in the moment, and "me" in the future are all singular and necessarily linked. They've been singular in my experience, but that's not the only way it could be.
If you insist on mapping to QM, my answers match MWI somewhat: each branch's past has an amplitude of 1, and the future diverges into all possible states, which sum to 1. I'm not actually certain that "possible" is a sensible concept, though, so I'll answer these as if we were talking about copies in a classical universe, where they can sum to more than 1 of "me".
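To spell out the arithmetic behind that last contrast (again in my own notation, not the post's): quantum branch weights are constrained to sum to 1, while classical copies aren't.

```latex
% MWI: the amplitudes of all future branches exhaust the wavefunction.
% Classical copying: each of n copies is fully me, so the "amount of me" is n.
\[
  \text{MWI:}\quad \sum_i |\alpha_i|^2 = 1,
  \qquad
  \text{classical:}\quad \underbrace{1 + 1 + \cdots + 1}_{n\ \text{copies}} = n .
\]
```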
1: Without defining the copying mechanism, I don't know how it differs from cloning. From your further questions, I think you're positing a copy of the existing brain configuration, potentials, and inputs, which is very different from a clone (a copy of DNA plus some amount of similarity in early environment).
2: The first and last parts of this are separate claims. Why not "both copies have true memories; there are two distinct entities (as soon as their states diverge), both of which have equal claim to being me"?
3: The first half makes sense, but I anticipate waking up in both. The two mes will each experience just one of the two lines.
4: The first half is fine, but you're weirdly assuming that dying matters in this form. I think there's no experiential difference, except that only one of me wakes up rather than two. I would not hesitate to undertake this unless the chance that BOTH copies die is greater than the chance that I'd die if I didn't participate.
5: I don't think it's a probability of waking up as one or the other. I think it's both, and each will truly be me in their pasts, and a different me in the future. If only one wakes, then it's more similar to today's experience, as there's only one entity who experiences apparent continuity. If one wakes and then dies, that one is me and experiences death; the other one is me and doesn't.
6: Incoherent. Likelihood is about exclusive options, and this framing is not exclusive: both happen in the same universe. I predict that branches of me will experience both.
7: Good, up until the weird "transfer" concept. A perfect duplicate at any point has the same conscious experience, and diverges when the inputs diverge. Consciousness isn't transferred; it's just in both places/times.
8: I'm uncertain what level of fidelity is required to be "me". My intuition is that a sufficiently-true simulation is
effectively me, just like a sufficiently-exact copy.
9: Sure. I don't think it necessarily follows that the ONLY way the superintelligence can "think about what I would do" is to execute a full-fidelity model; it could very easily use much simpler models (like humans do for each other) for many questions.
Every level but the last one is supposed to be wrong.
The point is they're supposed to be wrong in a specifically crafted way.