
shminux comments on Do Virtual Humans deserve human rights? - Less Wrong Discussion

-2 Post author: cameroncowan 11 September 2014 07:20PM


Comment author: shminux 11 September 2014 08:00:18PM, 2 points

If I create a little version of myself that can answer my emails for me, can I delete him when I'm done with him, or trade him in for a new model like I do with iPhones?

The standard Schelling point for assigning "human rights" is self-awareness. I think Eliezer calls it the "internal listener" or something like that. Maybe it is possible to create a subset of your mind without self-awareness, but intelligent enough to answer your emails the same way you would. After all, our "internal listener" is off quite often, and we don't appear visibly stupid during those times.

Comment author: skeptical_lurker 12 September 2014 12:19:51PM, 2 points

Pretty sure babies aren't self-aware, while chimpanzees are. Yet the majority opinion is that the former have human rights and the latter don't.

Comment author: shminux 12 September 2014 03:44:40PM, 2 points

Right, we extend "human rights" to potentially self-aware humans (sometimes including fetuses), to no-longer-self-aware humans, and generally to anything with human DNA that appears human, though that is where majority support gets thinner. In actuality, the Schelling point is more like a fading line than a point.