I agree that Maia2 is as qualified as Maia, but I'm not sure about legal identity. I guess I'm not sure what you mean by "fork".
If the legal identity were the same, and Maia2 were to commit murder immediately after being made, should Maia1 be punished?
If Maia2 were to kill Maia1 immediately after creation, then is it just passing the baton, so to speak? Would it just be Maia1 creating Maia2 with the intent of suicide? This could happen in, say, a lightspeed transportation scheme in which the information needed to construct Maia is sent far away, to be reconstructed on Mars or Europa. Maia2 is made at the destination, then Maia1 is destroyed, leaving one subjective Maia in the desired location.
> I guess I'm not sure what you mean by "fork".
By fork I mean: up to the point of duplication you have Maia. After the point of duplication you have Maia1 and Maia2, both of whom are Maia, but neither of whom is the other.
> If the legal identity were the same, and Maia2 were to commit murder immediately after being made, should Maia1 be punished?
Did Maia, prior to duplication, intend the murder? If so, as both Maia1 and Maia2 are Maia, they're both guilty of planning the murder. Only Maia2 committed it, but Maia1 planned it just the same.
The subject of copying people and its effect on personal identity and probability anticipation has been raised and, I think, addressed adequately on Less Wrong.
Still, I'd like to bring up some more thought experiments.
Recently I had a dispute on an IRC channel. I argued that if some hypothetical machine made an exact copy of me, then I would anticipate a 50% probability of "jumping into" the new body. (I admit this still feels a little counterintuitive to me, even though it is what I would rationally expect.) My opponents disagreed: after all, they said, the mere fact that the copy was created doesn't affect the original.
However, consider what each version would experience: Maia1 would see Maia2 being created in front of her eyes, while Maia2 would see the same scene up to the moment of forking, at which point the field of view in front of her eyes would abruptly change to reflect the new location.
Here, it is obvious from both an inside and outside perspective which version has continuity of experience, and thus from a legal standpoint, I think, it would make sense to regard Maia1 as having the same legal identity as the original, and recognize the need to create new documents and records for Maia2 -- even if there is no physical difference.
Suppose, however, that this information was erased. For example, suppose a robot sedated and copied the original me, dragged Maia1 and Maia2 into randomly chosen rooms, and then erased its own memory. At this point neither version of me, nor anyone else, would be able to distinguish between the two. What would you do here from a legal standpoint? (I suppose if it actually came to this, the two of me would agree to designate one as the original arbitrarily, by tossing an ordinary coin...)
One more point: what is this probability of a subjective body-jump actually a probability of? We could set up various Sleeping-Beauty-like thought experiments here. Supposing for the sake of argument that I'll live at most a natural human lifespan no matter which year I find myself in, imagine that I make a backup of my current state and ask a machine to restore a copy of me every 200 years. Does this imply that the moment the backup is made -- before I even issue the order, and from an outside perspective long before any of the copying happens -- I should anticipate subjectively jumping to any given time in the future, with the probability of finding myself as any particular copy, including the original, tending towards zero the longer the copying machine survives?
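The arithmetic behind that intuition can be made explicit. Assuming the same naive equal-weighting rule that gives 50% in the single-copy case (this rule is my assumption, not something the thought experiment forces), with N restored copies plus the original there are N + 1 equally weighted continuations, so each gets probability 1/(N + 1), which indeed goes to zero as N grows:

```python
from fractions import Fraction

def anticipated_probability(num_copies: int) -> Fraction:
    """Under the naive equal-weighting rule (every continuation of the
    backed-up state counts equally), the anticipated probability of
    finding yourself as any one continuation -- including the original --
    is 1 / (num_copies + 1)."""
    return Fraction(1, num_copies + 1)

# One duplicate: the 50% case from the IRC dispute.
print(anticipated_probability(1))    # 1/2
# A restore every 200 years for 100,000 years: 500 copies.
print(anticipated_probability(500))  # 1/501
```

Of course, this only quantifies the equal-weighting assumption; whether that is the right rule is exactly what the thought experiment calls into question.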