I wasn't that concerned about it, but I honestly didn't want to burden the topic with tedious commentary and links to other relevant discussions. It was meant to be a short-lived discussion on an independent topic. If I had wanted to do all that, I would have written an essay on the subject.
You might also get a more positive response by narrowly focusing on subtopics within this fairly large philosophical question. Your post is a bit 'transhumanism 101', and most LW posters have long since begun wrangling with these ethics on a deeper level.
As a random example: Since uploaded minds can replicate themselves easily, is there a role for representative democracies in a world where this technology is available?
Do Virtual Humans deserve human rights?
Slate Article
I think the idea of storing our minds in a machine so that we can keep on "living" (and I use that term loosely) is fascinating, and certainly an oft-discussed topic around here. However, in thinking about keeping our brains on a hard drive, we have to think about rights and how that all works together. Indeed, the technology may be here before we know it, so I think it's important to think about mindclones. If I create a little version of myself that can answer my emails for me, can I delete him when I'm done with him, or just trade him in for a new model like I do with iPhones?
I look forward to the discussion.