Vladimir_Nesov comments on Work on Security Instead of Friendliness? - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Whoever it is that keeps thumbing down my posts in this thread is invited to bring brutal honesty down onto my ideas, I am not afraid.
If "virtualizing everyone" means what I think it means, that's a euphemism. Achieving physical security implies that the physical originals of those people would not exist after the process - otherwise you'd just have two copies of every person, which, in theory, could increase their chances of cracking the AI. It sounds like what you're saying here is that the "friendly" AI would copy everyone's mind into a computer system and then kill them.
Maybe it seems to some people like copying your mind will preserve you, but imagine this: Ten copies are made. Do you, the physical original person, experience what all ten copies of you are experiencing at once? No. And if you, the physical original person, ceased to exist, would you continue by experiencing what a copy of you is experiencing? Would you have control over their actions? No.
You'd be dead.
Making copies of ourselves won't save our lives - that would only preserve our minds.
Now, if you meant something else by "virtualize" I'd be happy to go read about it. But after a search turned up absolutely no instances of the phrases "virtualize people" or "virtualize everyone" on the internet (barring unrelated juxtapositions like "blah blah blah virtualize. Everyone is blah blah."), I have no idea what "virtualize everyone" could mean if it isn't "copy their minds and then kill their bodies."
The Worst Argument in the World. This is not a typical instance of "dead", so the connotations of typical examples of "dead" don't automatically apply.
Tabooing the word "dead", I ask myself: if a copy of myself were made and ran independently of the original, with the original continuing to exist, would either physical copy object to being physically destroyed, provided the other continued to exist? I believe both of us would. Even the surviving copy would object to the other being destroyed.
But that's just me. How do other people feel?
Assuming the copy had biochemistry, or some other way of experiencing emotions, the copy (or copies) of me would definitely object to what had happened. Alternately, if a virtual copy of me was created and was capable of experiencing, I would feel that it was important for the copy to have the opportunity to make a difference in the world - that's why I live - so, yes, I would feel upset about my copy being destroyed.
You know, I think this problem has things in common with the individualism vs. communism debate. Do we view the copies as parts of a whole, unimportant in and of themselves, or do we view them all as individuals?
If we were to view them as parts of a whole, then what is valued? We don't feel pain or pleasure as a larger entity made up of smaller entities. We feel it individually. If happiness for as many life forms as possible is our goal, both the originals and the copies should have rights. If the copies are capable of experiencing pain and pleasure, they need to have human rights the same as ours. I would not see it as ethical to let myself be copied if my copies would not have rights.
We should view them as what they actually are, parts of the world with certain morally relevant structure.
Thank you, Vladimir, for your honest criticism, and more is invited. However, this "Worst argument in the world" comparison is not applicable here. In Yvain's post, he explains:
If we do an exercise where we substitute the words "criminal" and "Martin Luther King" with "death" and "virtualization", and read the sentence that results, I think you'll see my point:
The opponent is saying "Because you don't like death, and being virtualized will cause death, you should stop liking the idea of being virtualized by an AGI." But virtualization doesn't share the important features of death - the inability to experience anymore and the inability to enjoy the world - that made us dislike death in the first place. Therefore, even though being virtualized by an AGI will cause death, there is no reason to dislike virtualization.
But not being able to experience anymore and not being able to enjoy the world are in fact unacceptable results of "being virtualized". Therefore, we should not like the idea of being virtualized by an AGI.