Bakkot comments on Welcome to Less Wrong! (2012) - Less Wrong

Post author: orthonormal 26 December 2011 10:57PM




Comment author: Bakkot 06 June 2012 03:20:51AM 1 point

Posted this above as well.

The reason they all feel like babies to me, from the perspective of "are they people?", is that they're in a condition where we can see a reasonable path for turning them into something that is unquestionably a person.

Here's another case to consider:

I assume you've granted that sufficiently advanced AIs ought to be counted as people. Suppose I have a script running on my computer that is compiling an AI's source code, and that will launch the resulting executable, with no intervention on my part, as soon as compilation finishes.

Am I killing a person if I terminate this script before compilation completes? That is, does "software which will compile and run an AI" belong to the "people" or the "not people" group?

I think babies are much closer to this than to any of the examples you've listed above.


In the interests of settling confusion, here's another example:

Suppose we let the above script finish and the AI go about its merry way for a few centuries. We shut down the computer it's running on - writing its current state to non-volatile memory - to transport it somewhere else. To me it seems that destroying that memory would constitute killing a person.
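The shutdown-and-transport step amounts to serializing the running state to non-volatile storage and later restoring it. A minimal sketch, assuming (purely for illustration) that the AI's full state can be captured in a single object:

```python
import pickle

class AIState:
    """Placeholder for the AI's complete runtime state."""
    def __init__(self, memories):
        self.memories = memories

def suspend(ai, path):
    # Write the running state to non-volatile storage. Once the
    # machine is powered off, this file is the only copy.
    with open(path, "wb") as f:
        pickle.dump(ai, f)

def resume(path):
    # Restore the saved state; the AI continues where it left off.
    with open(path, "rb") as f:
        return pickle.load(f)
```

On this picture, destroying the file between `suspend` and `resume` destroys the only copy of the state, which is what makes it different from interrupting a compilation that has not yet produced anything.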

From these examples, I think "will become a person" is only significant for objects which were people in the past. This handles all of the examples you list (leaving aside 3-year-olds, which are too close to the issue at hand), as well as explaining why I don't think interrupting compilation as above is killing a person but destroying the state of a running-but-paused AI does.


Questions for you:

  • Does interrupting compilation as above seem to you like killing someone?
  • If not, do you still think babies are closer to the examples you list than to this example?
  • If not, do you still think babies are people?
  • If so, can you think of some other example which we can both readily agree is a person (or not a person) which can help settle this?
Comment author: wmorgan 07 June 2012 12:49:05AM 0 points

I've never seen a compiling AI, let alone an interrupted one, even in fiction, so your example isn't very available to me. I can imagine conditions that would make it OK or not OK to cancel the compilation process.

This is most interesting to me:

From these examples, I think "will become a person" is only significant for objects which were people in the past

I know we're talking about intuitions, but this is one description that can't jump from the map into the territory. We know that the past is completely screened off by the present, so our decisions, including moral decisions, can't ultimately depend on it. Ultimately, there has to be something about the present or future states of these humans that makes it OK to kill the baby but not the guy in the coma. Could you take another shot at the distinction between them?

Comment author: Bakkot 07 June 2012 04:10:34AM 1 point

I'm having a hard time figuring out what you mean when you say that example isn't available to you. Are you familiar with the processes by which source code becomes programs we can execute? (I imagine you are, but if not, I suggest you read up on it - this is something everyone can benefit from a bare familiarity with, I think.)

This seems by far the most appropriate example, so I'm not going to readily give it up. I'd be happy to give a more detailed example, if you'd like - or, possibly better, you could give an example of a case where it would be OK to cancel the compilation process.


We know that the past is completely screened off by the present, so our decisions, including moral decisions, can't ultimately depend on it.

Extremely strongly disagree. There is absolutely no reason to exclude the past from our morality.

Here's just one trivial but hopefully obvious example of why the past is important: if everyone follows just the rule I listed, no one has to worry about getting killed.

Of course, that said, I can still try to present the rule another way. For example, we might say that the rule is that it's immoral to destroy anything which contains within it a complete and unique description (that is to say, it's the only copy) of a person. Thus the passed-out drunk is a person, the running AI is a person, but the compiling AI is not. (Nor is the baby, incidentally.)


Again, the compiling AI seems like an extremely useful example. I'm having difficulty coming up with any rule which includes the AI but not the baby, in fact. (Without referring to things that seem unrelated, like "biologically alive", of course.) As such, I'd really like to discuss it with you. Could you do me a favor and explain what would need to happen for you to be able to discuss it?