
denis_bider comments on The Logical Fallacy of Generalization from Fictional Evidence - Less Wrong

39 Post author: Eliezer_Yudkowsky 16 October 2007 03:57AM


Comment author: denis_bider 20 November 2007 07:48:16PM 0 points [-]

Louis: "The more recent example is the TV series BattleStar Galactica. Of course it's unrealistic and biased, but it changed my views on the issues of AGI's rights. Can a robot be destroyed without a proper trial? Is it OK to torture it? to rape it? What about marrying one? or having children with it (or should I type 'her')?"

See this: http://denisbider.blogspot.com/2007/11/weak-versus-strong-law-of-strongest_15.html

You are confused because you misinterpret humanity's traditional behavior towards other apparently sentient entities in the first place. Humanity's traditional (and game-theoretically correct) behavior is to (1) be allies with creatures who can hurt us, (2) go to war with creatures who can hurt us and don't want to be our allies, (3) plunder and exploit creatures that cannot hurt us, regardless of how peaceful they are or how they feel towards us.

This remains true historically whether we are talking about other people, about other nations, or about other animals. There's no reason why it shouldn't be true for robots. We will ally with and "respect" robots that can hurt us; we will go to war with robots that can hurt us but do not want to be our allies; and we will abuse, mistreat and disrespect any creature that does not have the capacity to hurt us.
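The three-branch rule described above can be written out explicitly. This is my own toy illustration, not anything from the comment: a hypothetical `traditional_stance` function whose only inputs are capacity to harm and willingness to ally, making the claimed point that the other party's peacefulness never enters the decision.

```python
# Toy sketch (illustration only) of the decision rule the comment describes:
# treatment depends solely on whether the other party can hurt us and whether
# it wants to be our ally -- how peaceful it is plays no role.

def traditional_stance(can_hurt_us: bool, wants_alliance: bool) -> str:
    """Return the stance the comment attributes to humanity's traditional behavior."""
    if can_hurt_us and wants_alliance:
        return "ally"       # (1) ally with creatures who can hurt us
    if can_hurt_us:
        return "war"        # (2) fight creatures who can hurt us and refuse alliance
    return "exploit"        # (3) exploit creatures that cannot hurt us

print(traditional_stance(True, True))    # ally
print(traditional_stance(True, False))   # war
print(traditional_stance(False, True))   # exploit
```

Note that the third branch ignores `wants_alliance` entirely, which is the comment's point: entities without the capacity to hurt get exploited regardless of their disposition.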

Conversely, if the robots reach or exceed human capacities, they will do the same. Whoever is the top animal will be the new "human". That will be the new "humanity", where there will be a reign of "law" among entities that have similar capacities. Entities with lower capacities, such as humans who remain mere humans, will be relegated to about the same level as capuchin monkeys today. Some will be left "in the wild" to do as they please, some will be used in experiments, some will be hunted, some will be eaten, and so forth.

There is no morality. It is an illusion. There will be no morality in the future. But the ruthlessness of game theory will continue to hold.

Comment author: DanielLC 15 November 2010 04:43:35AM 4 points [-]

Human nature will hold. Similarly, robot nature, whatever we design it to be, will hold. Robots won't mistreat humans unless they are made that way. They may very well be made that way by accident, but we can't simply assume that they will be.