This is all predicated on the assumption that "sentience" automatically results in moral rights. I would say that moral rights are fundamentally based on empathy, which is subjective -- we give other people moral rights in order to secure those rights for ourselves.
I think the vast majority of the population would have no problem with "apartheid" or "genocide" of sentient AIs or chimps. As a secular humanist, I would reluctantly agree with them. Like it or not, at some level my morality boils down to an emotional attachment to humanity, and transferring that attachment to non-humans would be a big leap.
There are obvious parallels to the evolution of racial attitudes, and maybe someday "humanist" will join "racist" as a pejorative. If that happens, so be it, but I think that change is a long ways away.