By origin, I'm referring to the source of the need for morality, and it's clear that it's mostly about suffering. We don't like suffering and would rather not experience it, although we are prepared to put up with some (or even a lot) of it if that suffering leads to greater pleasure that outweighs it. We realised long ago that if we did a deal with the people around us to avoid causing each other suffering, we could all suffer less and have better lives - that's far better than spending time hitting each other over the head with clubs and stealing the fruits of each other's labour. By doing this deal, we ended up with greater fruits from our work and removed most of the brutality from our lives. Morality is clearly primarily about the management of suffering.
You can't torture a rock, so there's no need to have rules about protecting it against people who might seek to harm it. The same applies to a computer, even if it's running AGI - if it lacks sentience and cannot suffer, it doesn't need rules to protect it from harm (other than rules to protect the owner from loss if it were damaged, or to protect other people who might be harmed by the loss of the work the computer was carrying out). If we were able to make a sentient machine though, and if that sentient machine could suffer, it would then have to be brought into the range of things that morality protects. We could make an unintelligent sentient machine like a calculator and give it the ability to suffer, or we could make a machine with human-level intelligence with the same ability to suffer, and to suffer to the same degree as the less intelligent calculator. Torturing both of these to generate the same amount of suffering in each would be equally wrong in both cases. It is not the intelligence that creates the need for morality, but the sentience and the degree of suffering that may be generated in it.
With people, our suffering can perhaps be amplified beyond the suffering that occurs in other animals because there are many ways to suffer, and they can combine. When an animal is chased, brought down and killed by a predator, it most likely experiences fear, then pain. The pain may last a long time in some cases, such as when wolves eat a musk ox from the rear end while it's still alive, but the victim lacks any real understanding of what's happening to it. When people are attacked and killed though, the suffering is amplified by the victim understanding the situation and knowing just how much they are losing. The many people who care deeply about that victim will also suffer because of the loss, and some will suffer deeply for decades. This means that people need greater moral protection, although when scores are put on the suffering caused by pain and fear in an animal victim and in a human victim, those scores should be measured on the same scale, so in that regard the two sentiences are treated as equals.
" "Sentient rock" is an impossible possible object. I see no point in imagining a pebble which, despite not sharing any properties with chairs, is nonetheless truly a chair in some ineffable way."
I could assert that a sentient brain is an impossible possible object. There is no scientific evidence of any sentience existing at all. If it is real though, the thing that suffers can't be a compound object with none of the components feeling a thing, and if any of the components do feel something, they are the sentient things rather than the compound object. Plurality or complexity can't be tortured - if sentience is real, it must reside in some physical component, and the only physical components we know of are just as present in rocks as in brains. What they lack in rocks is anything to induce feelings in them, as the brain appears to do.
"You haven't defined suffering well enough for me to infer an equality operation. In other words, as it is, this is tautological and useless."
It's any kind of unpleasant feeling - nothing there should need defining for people who possess such feelings, as they already have a good understanding of what that means.
" The same suffering is the same suffering, but perhaps my ratio between ant-suffering and human-suffering varies from yours."
In which case, you have to torture the ant more to generate the same amount of suffering in it as you're generating in the human.
"Perhaps a human death is a thousand times worse than an ant death, and perhaps it is a million times worse. How could we tell the difference?"
We can't, at the moment, but once science has discovered how sentience works, we will be able to make precise comparisons. It isn't difficult to project yourself into a future where this is understood and to see the simple point that the same amount of suffering (caused by torture) in each is equally bad.
"Connection to LW concepts: floating belief networks, and statements that are underdetermined by reality."
The mistake is yours - you have banned discussion of the idea of equal suffering on the basis that you can't determine when it's equal.
"By all means you can define suffering however you like, but that doesn't mean that it's a category that matters to other people. I could just as easily say: "Rock-pile-primeness is not dependent on the size of the rock pile, only the number of rocks in the pile. It's just as wrong to turn a 7-pile into a 6-pile as it is to turn a 99991-pile into a 99990-pile." But that does not convince you to treat 7-piles with care."
What is at issue is a principle: equal suffering through torture is equally bad, regardless of what is doing the suffering in each case. We could be comparing a rock's suffering with a person's, or a person's suffering with an alien's - this should be a universal principle and not something into which you introduce selfish biases.
"Bigotry is an unjustified hierarchy. Justification is subjective. Perhaps it is just as bigoted to value this computer over a pile of scrap, but I do not plan on wrecking it any time soon."
When an alien assumes that its suffering is greater than ours, it's making the same mistake as we do when we think our suffering is greater than an ant's. If the amount of suffering is equal in each case, those assumptions are wrong. Our inability to measure how much suffering is involved in each case is a different issue, and it doesn't negate the principle.