The problem with allowing such wills (and especially wills which allow specific reference to "the original") is that there will be people who believe they are guaranteed to continue on solely as the original; and who will thus be willing to create copies who're, well, slightly screwed.
You need a lot of protections for the copies. And laws regarding when copying is legal.
Why would such people have a motive for making copies in the first place?
This doesn't need to be a hypothetical question. There are people who read this site, who subscribe to the thread theory of identity. If any of them are reading this comment: would you want to start copying yourself? If so, why? And what percentage of your assets would you allocate to the copy, and why?
I can see one possible answer: make a backup to start running only if I die. The backup wouldn't be me, but would be able to take my place for the benefit of my family and friends, some...
The subject of copying people and its effect on personal identity and probability anticipation has been raised and, I think, addressed adequately on Less Wrong.
Still, I'd like to bring up some more thought experiments.
Recently I had a dispute on an IRC channel. I argued that if some hypothetical machine made an exact copy of me, then I would anticipate a 50% probability of jumping into the new body. (I admit that this still feels a little counterintuitive to me, even though it is what I would rationally expect.) My opponents maintained that I would be certain to remain the original: after all, they said, the mere fact that a copy was created doesn't affect the original. Call the version with bodily continuity Maia1 and the freshly created copy Maia2.
However, from an outside perspective, Maia1 would see Maia2 being created in front of her eyes, and Maia2 would see the same scene up to the moment of forking, at which point the field of view in front of her eyes would abruptly change to reflect the new location.
Here, it is obvious from both an inside and outside perspective which version has continuity of experience, and thus from a legal standpoint, I think, it would make sense to regard Maia1 as having the same legal identity as the original, and recognize the need to create new documents and records for Maia2 -- even if there is no physical difference.
Suppose, however, that this information were erased. For example, suppose a robot sedated and copied the original me, dragged Maia1 and Maia2 to randomly chosen rooms, and then erased its own memory. At that point, neither version of me, nor anyone else, would be able to distinguish between the two. What would you do here from a legal standpoint? (I suppose if it actually came to this, the two of us would agree to designate one as the original arbitrarily, by tossing an ordinary coin...)
One more point. What is this probability of a subjective body-jump actually a probability of? We could set up various Sleeping Beauty-like thought experiments here. Suppose, for the sake of argument, that I'll live at most a natural human lifespan no matter which year I find myself in. Now imagine that I make a backup of my current state and ask a machine to restore a copy of me from it every 200 years. Does this imply that the moment the backup is made -- before I even issue the order, and, from an outside perspective, long before any of this copying happens -- I should anticipate subjectively jumping to any given time in the future, with the probability of finding myself as any particular copy, including the original, tending toward zero the longer the copying machine survives?
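The arithmetic behind both claims (the 50% case and the tends-to-zero case) can be made explicit. The sketch below assumes the premise of the thought experiment -- that subjective anticipation is split uniformly among all qualifying copies, including the original -- which is itself the contested point, not an established fact:

```python
# Toy model: if anticipation is split uniformly among all running copies
# (the contested assumption of the thought experiment), the probability
# of finding yourself as any particular one is 1 / (number of copies).
def anticipation_probability(num_copies: int) -> float:
    """Probability of being any one copy, the original included."""
    return 1.0 / num_copies

# One original + one copy: the 50% case from the IRC dispute.
print(anticipation_probability(2))  # 0.5

# One original + n restored backups: per-copy probability shrinks.
for n in (1, 10, 100, 1000):
    print(n + 1, anticipation_probability(n + 1))
```

Under this model, as the machine keeps restoring backups indefinitely, the probability of finding yourself as the original (or any fixed copy) goes to zero, which is exactly the counterintuitive consequence the question points at.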