I am a libertarian in general, and if you want your assets owned jointly between you and your copies, I certainly have no objection to that. Presumably something indivisible like your vote in elections would then be used by consensus between all copies?
The only caveat I would suggest is that whichever way you want to do it, it's best to make the decision and sign the appropriate documentation before you step into the copying machine. As with divorces and inheritance in our own time, the last thing you want is to run into a dispute after the fact.
Requiring consensus seems unnecessary. If we get one vote between us, then we get to vote once; that's all the legal system has to concern itself with. Everything else is our own problem.
The courts have no interest in whether we agreed on a result, or whether one of us is currently chained to the wall in our basement, or whatever. (Well, the courts may have an interest in the latter for other reasons, but not as it applies to voting.)
I agree with your suggestion; I'm just saying this isn't a legal complication, just a bit of good personal advice. (That said, my husband and I didn't sign a prenuptial agreement when we got married, so my agreement with this advice is clearly relatively superficial.)
The subject of copying people and its effect on personal identity and probability anticipation has been raised and, I think, addressed adequately on Less Wrong.
Still, I'd like to bring up some more thought experiments.
Recently I had a dispute on an IRC channel. I argued that if some hypothetical machine made an exact copy of me, then I would anticipate a 50% probability of jumping into the new body. (I admit that this still feels a little counterintuitive to me, even though it's what I would rationally expect.) My opponents disagreed: after all, they said, the mere fact that a copy was created doesn't affect the original.
However, the two resulting perspectives would differ: Maia1 would see Maia2 being created in front of her eyes, while Maia2 would experience the same scene up to the moment of forking, at which point the field of view in front of her eyes would abruptly change to reflect the new location.
Here, it is obvious from both an inside and outside perspective which version has continuity of experience, and thus from a legal standpoint, I think, it would make sense to regard Maia1 as having the same legal identity as the original, and recognize the need to create new documents and records for Maia2 -- even if there is no physical difference.
Suppose, however, that this information was erased. For example, suppose a robot sedated and copied the original me, then dragged Maia1 and Maia2 to randomly chosen rooms and erased its own memory. At this point, neither of me, nor anyone else, would be able to distinguish between the two. What would you do here from a legal standpoint? (I suppose if it actually came to this, the two of me would agree to arbitrarily designate one as the original by tossing an ordinary coin...)
One more point: what is this probability of a subjective body-jump actually a probability *of*? We could set up various Sleeping Beauty-like thought experiments here. Supposing for the sake of argument that I'll live at most a natural human lifespan no matter which year I find myself in, imagine that I make a backup of my current state and ask a machine to restore a copy of me every 200 years. Does this imply that the moment the backup is made -- before I even issue the order, and from an outside perspective, long before any of this copying happens -- I should anticipate subjectively jumping to any given point in the future, with the probability of finding myself as any particular copy, including the original, tending towards zero the longer the copying machine survives?
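Under the assumption the thought experiment makes -- that subjective anticipation is split uniformly across every instance that will ever be instantiated from the backup -- the arithmetic is simple: with N future restorations there are N+1 instances, so each gets probability 1/(N+1). A minimal sketch of that (the function name is mine, not anything from the discussion):

```python
def anticipation_probability(restorations: int) -> float:
    """Probability of 'finding yourself' as any one particular
    instance (the original or a single restored copy), assuming a
    uniform split over all restorations plus the original."""
    return 1.0 / (restorations + 1)

# One restoration every 200 years, for various machine lifespans:
for years in (200, 2_000, 20_000):
    n = years // 200
    p = anticipation_probability(n)
    print(f"{years:>6} years -> {n:>3} restorations, p = {p:.4f} each")
```

The point of the worry above falls out directly: as the machine's lifespan (and hence N) grows without bound, each instance's probability -- including the original's -- shrinks towards zero.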