
arundelo comments on Nonperson Predicates - Less Wrong

29 Post author: Eliezer_Yudkowsky 27 December 2008 01:47AM


Comment author: John_Mlynarski 27 April 2017 01:36:50AM *  1 point

"Is a human mind the simplest possible mind that can be sentient?" Of course not. Plenty of creatures with simpler minds are plainly sentient. If a tiger suddenly leaps out at you, you don't operate on the assumption that the tiger lacks awareness; you assume that the tiger is aware of you. Nor do you think "This tiger may behave as if it has subjective experiences, but that doesn't mean that it actually possesses internal mental states meaningfully analogous to wwhhaaaa CRUNCH CRUNCH GULP." To borrow from one of your own earlier arguments.

If you are instead sitting comfortably in front of a keyboard and monitor with no tiger in front of you, it's easy to come up with lots of specious arguments that tigers aren't really conscious, but so what? It's also easy to come up with lots of specious arguments that other humans aren't really conscious. Using such arguments as a basis for actual ethical decision-making strikes me as a bad idea, to put it mildly. What you've written here seems disturbingly similar to a solipsist considering the possibility that he could, conceivably, produce an imaginary entity sophisticated enough to qualify as having a mind of its own. Technically, it's sort of making progress, but....

When I first read your early writing, the one thing that threw me was an assertion that "Animals are the moral equivalent of rocks." At least, I hope that I'm not falsely attributing that to you; I can't track down the source, so I apologize if I'm making a mistake. But my recollection is that it stood out from your otherwise highly persuasive arguments as blatant, unsupported personal prejudice. No evidence was given in favor of this idea, and it was followed by a parenthetical that clearly indicated that it was just wishful thinking; it really only made any sense in light of a different assertion that spotting glaring holes in other people's arguments isn't really indicative of any sort of exceptional competence except when dealing with politically and morally neutral subject matter.

Your post and comments here seem to conflate, under the label of "personhood," having moral worth and having a mind somehow closely approximating that of an adult human being. Equating these seems phenomenally morally dubious for any number of reasons; it's hard to see how it doesn't go directly against bedrock fairness, for example.

Comment author: arundelo 27 April 2017 11:20:38PM 0 points

Eliezer probably means "sapient":

"Sentience is commonly used in science fiction and fantasy as synonymous with sapience, although the words aren't synonyms."

(Or maybe by "is sentient", he means to say, "is a person in the moral sense".)

Comment author: TheAncientGeek 28 April 2017 06:36:41AM 0 points

Well, sentient means feeling and sapient means knowing, and that's about all there is to it... neither term is technically precise, although they are often bandied around as though they are.

Comment author: John_Mlynarski 12 May 2017 01:53:41AM *  0 points

But saying that e.g. rats are not sentient in the context of concern about the treatment of sentient beings is like saying that Negroes are not men in the context of the Declaration of Independence. Not only are the purely semantic aspects dubious, but excluding entities from a moral category on semantic grounds seems like a severe mistake regardless.

Words like "sentience" and especially "consciousness" are often used to refer to the soul without sounding dogmatic about it. You can tell this from the ways people use them: "Would a perfect duplicate of you have the same consciousness?", "Are chimps conscious?", etc. You can even use such terminology in such ways if you're a materialist who denies the existence of souls. You'd sound crazy talking about souls like they're real things if you say that there are no such things as souls, wouldn't you? Besides, souls are supernatural. Consciousness, on the other hand, is an emergent phenomenon, which sounds much more scientific.

Is there good reason to think that there is some sort of psychic élan vital? It strikes me as probably being about as real as phlogiston or luminiferous aether; i.e. you can describe phenomena in terms of the concept, and it doesn't necessarily prevent you from doing so basically correctly, but you can do better without it.

And, of course, in the no-nonsense senses of the terms, rats are sentient, conscious, aware, or however else you want to put it. Not all of the time: they can also be asleep or dead or other things, as can humans. But rats are often sentient. And it's not hard to tell that plenty of non-humans also experience mental phenomena, which is why it's common knowledge that they do.

I can't recall ever seeing an argument that mistreating a mind lacking self-awareness, or metacognition, or whatever specific mental faculty is arbitrarily singled out, is kinder or more just or in any normal sense more moral than mistreating a mind that has it. And you can treat any position as a self-justifying axiom, so doing so doesn't amount to an argument for the position's truth in anything but a purely relativist sense.

It is both weird and alarming to see Eliezer arguing against blindly assuming that a mind is too simple to be "sentient" while also pretty clearly taking the position that anything much simpler than our own minds isn't. He rather plainly isn't following his own advice, and that this could happen without his realizing it is very worrying. He has admitted that this is something he's confused about, and he is aware that others are more inclusive, but that doesn't seem to have prompted him to rethink his position much, which suggests that he is confused about this in a way that may be hard to correct.

Looking for a nonperson predicate is kind of like seeking an answer to the question "Who is it okay to do evil things to?" I would like to suggest that the correct answer is "No one", and that if you're trying to avoid being evil, asking the question in the first place is a sign that you made a big mistake somewhere.

If having the right not to have something done to you just means that it's morally wrong to do that thing to you, then everything has rights. Making a rock suffer against its will would be, if anything, particularly evil, as it would require you to go out of your way to give the rock a will and the capacity to suffer. Obviously, avoiding violating anything's rights requires an ability to recognize what something wills, what will cause it to suffer, and so on. Those are useful distinctions. But it seems like Eliezer is talking about something different.

Has he written anything more recently on this subject?