Seems to me that it's a question of what counts as a person. If you accept the idea that whatever your mind is doing is Turing reducible (or more generally accept the strong Church-Turing thesis, although this isn't quite required), then you can model minds as stateful computational operations on a set of inputs without loss of functionality. If you then accept the idea that properties of minds -- that is, a certain set of mental traits -- are necessary and sufficient for personhood, it follows that minds emulated to sufficient precision (perhaps satisfying other architectural requirements) are persons, without any particular need for a person-shaped object embedded in the physical world. Now, it'd certainly still be possible to come up with nonperson emulations of personlike behavior -- we do that all the time -- but no one's arguing that ELIZA or GTA hookers deserve moral consideration. A simulation in this context means a sufficiently mindlike simulation, for some value of "sufficiently".
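To make "stateful computational operations on a set of inputs" concrete, here's a minimal sketch (my own toy illustration, not a serious cognitive model): a mind as a transition function from (state, input) to (state, output). The class name and the counting dynamics are made up; the point is only the shape of the computation.

```python
# Toy sketch: a "mind" as a stateful computation. Anything Turing reducible
# fits this shape in principle -- internal state plus a step function that
# consumes an input, updates the state, and emits an output.

from dataclasses import dataclass, field

@dataclass
class TuringReducibleMind:
    state: dict = field(default_factory=dict)

    def step(self, percept: str) -> str:
        """Consume one input, update internal state, emit one output."""
        # Stand-in dynamics: just count how often each percept has been seen.
        self.state[percept] = self.state.get(percept, 0) + 1
        return f"seen {percept!r} {self.state[percept]} time(s)"

mind = TuringReducibleMind()
print(mind.step("red"))  # seen 'red' 1 time(s)
print(mind.step("red"))  # seen 'red' 2 time(s)
```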
This specializes fairly readily to the case of your own mind, and from there a pretty straightforward argument shows that it's impossible in principle to reliably distinguish such an emulation from your own experience: if the emulation reproduces your mind's input-output behavior exactly, then any introspective test you could run, it would pass in exactly the way you do. Which seems to imply moral equivalence, if not ontological equivalence, pretty strongly to me.
Of course, if you don't think minds are Turing reducible -- something that hasn't actually been proven, although it looks very likely to me -- then the whole argument falls down at step one.
I accept that a lot of what my mind does is Turing reducible. But ALL of it? I have this salient experience of consciousness, and I am completely unaware of any satisfying theory of the source or mechanism of consciousness. My intuitions about consciousness and Turing machines don't particularly suggest that the actual internal perception of consciousness is Turing reducible.
The "Turing test" that an emulation of me would have to pass is this: I would engage my emulation in a challenging discuossion of consciousness in which I wo...
Suppose I have a choice between the following:
A) One simulation of me is run for 100 years, before being deleted.
B) Two identical simulations of me are run for 100 years, before being deleted.
Is the second choice preferable to the first? Should I be willing to pay more to have multiple copies of me simulated, even if those copies will have the exact same experiences?
Forgive me if this question has been answered before. I have Googled to no avail.
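For what it's worth, here's the toy way I'd pin down what the choice turns on (my own framing, not established terminology): A and B differ only if value aggregates per instantiation rather than per distinct experience-stream, since the two copies in B are stipulated to have identical experiences.

```python
# Toy framing of the A-vs-B question: does value aggregate over
# *instantiations* of an experience, or over *distinct* experiences?
# The two rules disagree exactly on option B.

def value(copies: int, years: int, count_duplicates: bool) -> int:
    """Crude utility: simulated-years, counted per copy or deduplicated."""
    streams = copies if count_duplicates else 1  # identical copies collapse to 1
    return streams * years

option_a = (1, 100)  # one simulation, 100 years
option_b = (2, 100)  # two identical simulations, 100 years

for rule in (True, False):
    label = "per-instantiation" if rule else "deduplicated"
    print(label, "->",
          "A:", value(*option_a, rule),
          "B:", value(*option_b, rule))
# per-instantiation -> A: 100 B: 200  (B strictly better)
# deduplicated -> A: 100 B: 100  (indifferent)
```

So the question reduces to which aggregation rule is the right one, which as far as I can tell is exactly the part that's philosophically contested.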