Bakkot comments on Welcome to Less Wrong! (2012) - Less Wrong

Post author: orthonormal, 26 December 2011 10:57PM




Comment author: Bakkot 02 January 2012 03:00:34AM 5 points

You might say that they are obviously wrong, but the "obviously" is suspicious when so many disagree with you, at the very least for Aumann reasons.

While true, I suspect most or all of those people would have a hard time giving a good definition of "person" to an AI in such a way that the definition included babies, adults, and thinking aliens, but not pigs or bonobos. So yes, the claim I am implicitly making with this (or any other) controversial opinion is that I think almost everyone is wrong about this specific topic.

That is much closer to birth than 17 is to 12 years old

Only if you think development happens linearly. From my knowledge of biology - which is extremely suspect, so do correct me if I'm wrong - the changes between 0 months after birth and 10 months after birth are vastly larger than the changes between 12 years after birth and 17 years after birth.

if there was a visible unisex menstruation happening at 17 years old, and an established tradition of taking that as the age of consent, why on earth would a society change the law to make it 16 years and 2 months instead?

Of course our selection of analogies is reflective of our positions. From my point of view, the most relevant analogy would be a visible unisex menstruation happening at 30 years old*, and an established tradition of taking that as the age of consent, and I'm arguing that no, it should really be 16.

*(This is based on the assumption that you go through about as many developmental changes between ages 16 and 30 as you do between 0 months and 10 months, which - again from my extremely suspect recollection of biology - is roughly correct.)


Comment author: Alejandro1 02 January 2012 03:35:35AM 6 points

While true, I suspect most or all of those people would have a hard time giving a good definition of "person" to an AI in such a way that the definition included babies, adults, and thinking aliens, but not pigs or bonobos. So yes, the claim I am implicitly making with this (or any other) controversial opinion is that I think almost everyone is wrong about this specific topic.

One rough effort at such a definition would be: "any post-birth member of a species whose adult members are intelligent and conscious", where "birth" can be replaced by an analogous Schelling point in the development of an alien species, or by an arbitrarily chosen line at a similar stage of development, if no such Schelling point exists.

You might say that this definition is an arbitrary kludge that does not "carve Nature at the joints". My reply would be that ethics is adapted for humans, and does not need to carve Nature at intrinsic joints but at the places that humans find relevant.

Your point about different rates of development is well taken, however. I am also not an expert in this topic, so we'll have to let it rest for the moment.

Comment author: Bakkot 02 January 2012 03:46:55AM 2 points

One rough effort at such a definition would be: "any post-birth member of a species whose adult members are intelligent and conscious", where "birth" can be replaced by an analogous Schelling point in the development of an alien species, or by an arbitrarily chosen line at a similar stage of development, if no such Schelling point exists.

Problem: There's no particular reason to expect speciation to be as widespread or as clear-cut elsewhere as it is on Earth, and among humans in particular. Certainly not for machine intelligences.

It might be that software could be written for the computer I'm typing this on which would give it intelligence and consciousness. (Unlikely, but not out of the realm of possibility.) Should this machine be considered a person?

The reason I'm being so nit-picky is that what I consider the natural definition (namely, "an agent capable of intelligence and consciousness", or something like that) doesn't have this problem at all. I think it's a problem your definition has only because you were forced to deviate from the natural definition to include something that doesn't really seem like it belongs in that group - namely, newborns.

Comment author: Alejandro1 02 January 2012 03:08:44PM 0 points

For computers, hardware and software can be separated in a way that is not possible with humans (with current technology). When the separation is possible, I agree personhood should be attributed to the software rather than the hardware, so your machine should not be considered a person. If in the future it becomes routinely possible to scan, duplicate, and emulate human minds, then killing a biological human will probably also be less of a crime than it is now, as long as his/her mind is preserved. (Maybe there would instead be a taboo against deleting minds with no backup, even when they are not "running" on hardware.)

It is also possible that in such a future, where the concept of a person is commonly associated with a mind pattern, legalizing infanticide before brain development sets in would be acceptable. So perhaps we are not in disagreement after all, since on a different subthread you have said you do not really support legalization of infanticide in our current society.

I still think there is a bit of a meta disagreement: you seem to think that the laws and morality of this hypothetical future society would be better than our current ones, while I see it as a change in which Schelling points are appropriate for the law to use, in response to technological changes, without the end point being more "correct" in any absolute sense than our current law.

Comment author: Multiheaded 02 January 2012 08:33:32AM -1 points

Should this machine be considered a person?

Well, yes. This seems obvious to me.

Comment author: Bakkot 02 January 2012 06:49:05PM 2 points

I think I must have been unclear - you're saying the machine I'm currently typing on should obviously be considered a person, just because it has the potential to become one? That seems absurd to me.

Comment author: Multiheaded 02 January 2012 08:53:14PM 0 points

Oh, of course. I took it that you were asking about a case where such software had actually been installed on the machine. The mere potential for personhood seems hardly worth anything to me.