In the future, it may be possible for you to scan your own brain and create copies of yourself. With the power of a controllable superintelligent AI, it may even be possible to create very accurate instances of your past self (and you could take action today or in the near future to make this easier by using lifelogging tools such as these glasses).
So I ask Less Wrong: how valuable do you think creating extra identical, non-interacting copies of yourself is? (each copy existing in its own computational world, which is identical to yours with no copy-copy or world-world interaction)
For example, would you endure a day's hard labor to create an extra self-copy? A month? A year? Consider the hard labor to be digging a trench with a pickaxe, with a harsh taskmaster who can punish you if you slack off.
Do you think having 10 copies of yourself made in the future is 10 times as good as having 1 copy made? Or does your utility grow sub-linearly in the number of copies?
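To make the question concrete, here is a toy illustration (the particular functional forms are my own assumptions, not anything canonical): a linear utility says 10 copies are worth 10 times as much as 1, while a logarithmic utility gives sharply diminishing returns on each extra copy.

```python
import math

# Two toy utility curves over the number of copies n.
# These functional forms are illustrative assumptions only.
def linear_utility(n):
    return n                # 10 copies are 10x as valuable as 1

def sublinear_utility(n):
    return math.log1p(n)    # each extra copy adds less than the last

for n in (1, 10, 100):
    print(f"n={n:3d}  linear={linear_utility(n):5.1f}  "
          f"sublinear={sublinear_utility(n):4.2f}")
```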
Last time I spoke to Robin Hanson, he was extremely keen on having a lot of copies of himself created (though I think he was prepared for these copies to be emulant-wage-slaves).
I have created a poll for LW to air its views on this question; in my next post I'll outline and defend my answer, and lay out some fairly striking implications this has for existential risk mitigation.
If you're on a hardcore-altruism trip, you may substitute any person or entity that you find more valuable than your own good self: would you sacrifice a day of this entity's life for an extra copy? A year? etc.
UPDATE: Wei Dai has asked this question before, in his post "The moral status of independent identical copies" - though his post focuses more on lock-step copies that are identical over time, whereas here I am interested in both lock-step identical copies and statistically identical copies (a statistically identical copy has the same probability distribution of futures as you do).
That would be a pretty big "original privilege" :-)
Generally, when I think about making copies, I assume that the status of being "original" would be washed away and I would find myself existing with some amount of certainty (say 50% to 100%) that I was the copy. Then I try to think about how I'd feel about having been created by someone who has all my memories/skills/tendencies/defects but has a metaphysically arbitrary (though perhaps emotionally or legally endorsed) claim to being "more authentic" than me by virtue of some historical fact of "mere physical continuity".
I would only expect a copy to cooperate with my visions for what my copy "should do" if I'm excited by the prospect of getting to do that - if I'm kinda hoping that after the copy process I wake up as the copy because the copy is going to have a really interesting life.
In practice, I would expect that what I'd really have to do is write up two "divergence plans", one for each future version of me, that seem equally desirable, then copy, then re-negotiate with my copy over the details of the plans (because I imagine the practicalities of two of us existing might reveal some false assumptions in the first draft), and finally flip a coin to find out which plan each of us is assigned to.
I guess... if only one of us gets the "right of making more copies", I'd want the original contract to make "copyright" re-assignable after the copying event, so I could figure out whether "copyright" is more of a privilege or a burden, and what the appropriate compensation is for taking up the burden or losing the privilege.
ETA: Perhaps our preferences would diverge during negotiation? That actually seems like something to hope for, because then a simple cake-cutting algorithm could probably be used to ensure the assignment to a divergence plan was actually a positive-sum interaction :-)
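By "cake cutting" I have in mind something like the classic two-party divide-and-choose protocol. Here is a minimal sketch, assuming the divergence-plan details can be listed as discrete items and that each copy can put a rough numeric value on each item; the item names, valuations, and the `divide`/`choose` helpers are all hypothetical illustration, not a worked-out proposal.

```python
from itertools import combinations

def divide(items, cutter_values):
    """Cutter splits the items into two bundles it values as equally as
    possible (brute force over subsets; fine for a handful of items)."""
    best_split, best_gap = None, float("inf")
    for r in range(len(items) + 1):
        for bundle in combinations(items, r):
            rest = tuple(i for i in items if i not in bundle)
            gap = abs(sum(cutter_values[i] for i in bundle) -
                      sum(cutter_values[i] for i in rest))
            if gap < best_gap:
                best_split, best_gap = (bundle, rest), gap
    return best_split

def choose(bundles, chooser_values):
    """Chooser takes whichever bundle it values more; cutter keeps the other."""
    a, b = bundles
    value = lambda bundle: sum(chooser_values[i] for i in bundle)
    return (a, b) if value(a) >= value(b) else (b, a)

# Hypothetical divergence-plan details and per-copy valuations.
items = ["keep the job", "take the sabbatical", "keep the flat", "move abroad"]
copy_a = {"keep the job": 5, "take the sabbatical": 3, "keep the flat": 2, "move abroad": 4}
copy_b = {"keep the job": 2, "take the sabbatical": 6, "keep the flat": 3, "move abroad": 1}

bundles = divide(items, copy_a)            # copy A cuts
b_gets, a_gets = choose(bundles, copy_b)   # copy B chooses first
print("copy A receives:", a_gets)
print("copy B receives:", b_gets)
```

The cutter is incentivised to make the two bundles as near-equal as possible by its own lights, and the chooser takes whichever bundle it prefers, so if the copies' preferences have genuinely diverged, each can end up with a bundle it values at more than half of the total.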