Do Virtual Humans deserve human rights?

-2 Post author: cameroncowan 11 September 2014 07:20PM

Do Virtual Humans deserve human rights?

Slate Article

 

I think the idea of storing our minds in a machine so that we can keep on "living" (and I use that term loosely) is fascinating, and certainly an oft-discussed topic around here. However, in thinking about keeping our brains on a hard drive, we have to think about rights and how that all works together. Indeed, the technology may be here before we know it, so I think it's important to think about mindclones. If I create a little version of myself that can answer my emails for me, can I delete him when I'm done with him or just turn him in for a new model like I do iPhones? 

 

I look forward to the discussion.

 

Comments (8)

Comment author: Gunnar_Zarncke 12 September 2014 06:54:40AM 2 points [-]

If you wonder why this is downvoted despite being on-topic: it hasn't enough flesh for a topic that isn't being discussed for the first time. You could add [link] to your post and add at least a few refs to existing discussions. Or just post this in the media thread.

Comment author: cameroncowan 12 September 2014 04:55:45PM 0 points [-]

I wasn't that concerned about it, but I honestly didn't want to bog the topic down with tedious commentary and links to other relevant discussions. It was meant to be a short-lived discussion on an independent topic. If I had wanted to do all that, I would have written an essay on the subject.

Comment author: Toggle 13 September 2014 05:04:58AM *  3 points [-]

You might also get a more positive response to narrowly focused subjects within this fairly large philosophical question. Your post is a bit 'transhumanism 101', and most LW posters have long since started wrangling with these ethics on a deeper level.

As a random example: Since uploaded minds can replicate themselves easily, is there a role for representative democracies in a world where this technology is available?

Comment author: cameroncowan 14 September 2014 07:46:53PM 0 points [-]

That's ok, I won't be posting further.

Comment author: buybuydandavis 12 September 2014 01:58:32AM 2 points [-]

To quote William Munny in Unforgiven:

Deserve's got nothin' to do with it.

Comment author: shminux 11 September 2014 08:00:18PM *  2 points [-]

If I create a little version of myself that can answer my emails for me, can I delete him when I'm done with him or just turn him in for a new model like I do iPhones?

The standard Schelling point for assigning "human rights" is self-awareness. I think Eliezer calls it an "internal listener" or something like that. Maybe it is possible to create a subset of your mind without self-awareness, but intelligent enough to answer your emails the same way you would. After all, our "internal listener" is off quite often, and we don't appear visibly stupid during those times.

Comment author: skeptical_lurker 12 September 2014 12:19:51PM *  2 points [-]

Pretty sure babies aren't self-aware, while chimpanzees are. Yet the majority opinion is that the former have human rights and the latter don't.

Comment author: shminux 12 September 2014 03:44:40PM 2 points [-]

Right, we extend "human rights" to potentially self-aware humans (sometimes including fetuses) and no-longer-self-aware humans, and generally to anything with human DNA that appears human, but that's where the majority gets thinner. In actuality, the Schelling point is more like a fading line than a point.