If such a word were developed, it would lead to inter-group conflict, polarisation, a great deal of frustration, and general harm to society, regardless of which side you're on. It would also move the argument in the wrong direction.
If you're pro-AI-rights, you should recognize that bringing up "discrimination" (as in, treating AI at all differently from people) is very counterproductive. If you're on this side, you probably believe that society will gradually come to understand that AIs deserve rights, and that there is a path towards that. The path would likely start with laws prohibiting deliberately torturing AIs for its own sake; then something closer to animal rights (some minimal protections against putting AIs through very bad experiences even when doing so would be useful, and perhaps against using AIs for sexual purposes, since they can't consent); then some basic restrictions on arbitrarily creating, deleting, and mindwiping AIs; then prohibitions on slavery; and so on. Bringing up "discrimination" now would push an end-game conflict point prematurely, convincing some people that they're stepping onto a slippery slope if they allow any movement down the path, even if they agree with the early steps on their own terms. The noise of that argument would slow the progress.
If you're anti-AI-rights (whether because you're sure of AI non-sentience, or otherwise), then such a word only makes people feel bad, with no upside. People on this side would likely conclude that the disagreement over "AI rights" is probably temporary, lasting only until people understand the situation better or the situation itself changes. Suddenly raising the stakes of the argument would be harmful: it would bring in more noise, making it harder to hear the signal underneath, and push the argument in the wrong direction. The word would only make it take longer for the useless dispute to die down.
The world does not need a word that will be used as a club against anyone who asks for evidence of sentience and moral standing before attributing them. But the avalanche may already have begun.
Perhaps we need a word for "too ready to attribute human-shaped minds to anything that talks."
We've all heard of racism. We've all heard of sexism. Many of us have heard of speciesism. Other than "speciesism", is there a word for discrimination against the intelligence in systems running on computing machinery? Do you think that we're going to need one? Or do you think that we can get by without one?