Do Virtual Humans deserve human rights?

Slate Article


I think the idea of storing our minds in a machine so that we can keep on "living" (and I use that term loosely) is fascinating, and certainly an oft-discussed topic around here. But if we're going to keep our brains on hard drives, we have to think about rights and how that all fits together. The technology may be here before we know it, so I think it's important to think about mindclones now. If I create a little version of myself that can answer my emails for me, can I delete him when I'm done with him, or trade him in for a new model like I do iPhones?


I look forward to the discussion.

8 comments

To quote William Munny in Unforgiven:

Deserve's got nothin' to do with it.

If you wonder why this is downvoted despite being on-topic: it doesn't have enough flesh for a topic that isn't being discussed here for the first time. You could add [link] to your post and include at least a few references to existing discussions. Or just post this in the media thread.

I wasn't that concerned about it, but I honestly didn't want to bog the topic down with tedious commentary and links to other relevant discussions. It was meant to be a short-lived discussion of an independent topic. If I had wanted to do all that, I would have written an essay on the subject.

You might also get a more positive response by posing narrowly focused questions within this fairly large philosophical area. Your post is a bit "transhumanism 101", and most LW posters have long since moved on to wrangling with these ethics at a deeper level.

As a random example: Since uploaded minds can replicate themselves easily, is there a role for representative democracies in a world where this technology is available?

That's ok, I won't be posting further.

If I create a little version of myself that can answer my emails for me, can I delete him when I'm done with him, or trade him in for a new model like I do iPhones?

The standard Schelling point for assigning "human rights" is self-awareness. I think Eliezer calls it an "internal listener" or something like that. Maybe it is possible to create a subset of your mind without self-awareness, but intelligent enough to answer your emails the same way you would. After all, our "internal listener" is off quite often, and we don't appear visibly stupid during those times.

Pretty sure babies aren't self-aware, while chimpanzees are. Yet the majority opinion is that the former have human rights and the latter don't.

Right, we extend "human rights" to potentially self-aware humans (sometimes including fetuses), to no-longer-self-aware humans, and generally to anything with human DNA that appears human, but that's where the majority gets thinner. In practice, the Schelling point is more like a fading line than a point.