By origin, I'm referring to the source of the need for morality, and it's clear that it's mostly about suffering. We don't like suffering and would rather not experience it, although we are prepared to put up with some (or even a lot) of it if that suffering leads to greater pleasure that outweighs it. We realised long ago that if we do a deal with the people around us to avoid causing each other suffering, we could all suffer less and have better lives - that's far better than spending time hitting each other over the head with clubs and stealing the fruits of each other's labour. By doing this deal, we ended up with greater fruits from our work and removed most of the brutality from our lives. Morality is clearly primarily about the management of suffering.
You can't torture a rock, so there's no need to have rules about protecting it against people who might seek to harm it. The same applies to a computer, even if it's running AGI - if it lacks sentience and cannot suffer, it doesn't need rules to protect it from harm (other than to prevent the owner from suffering any loss if it were to be damaged, or to protect other people who might be harmed by the loss of the work the computer was carrying out). If we were able to make a sentient machine though, and if that sentient machine could suffer, it would then have to be brought into the range of things that need to be protected by morality. We could make an unintelligent sentient machine like a calculator and give it the ability to suffer, or we could make a machine with human-level intelligence and the same ability to suffer, to the same degree as the less intelligent calculator. Torturing either of these to generate the same amount of suffering would be equally wrong in both cases. It is not the intelligence that creates the need for morality, but the sentience and the degree of suffering that may be generated in it.
With people, our suffering can perhaps be amplified beyond the suffering that occurs in other animals because there are many ways to suffer, and they can combine. When an animal is chased, brought down and killed by a predator, it most likely experiences fear, then pain. The pain may last for a long time in some cases, such as when wolves eat a musk ox from the rear end while it's still alive, but the victim lacks any real understanding of what's happening to it. When people are attacked and killed though, the suffering is amplified by the victim understanding the situation and knowing just how much they are losing. The many people who care deeply about that victim will also suffer because of this loss, and many will suffer deeply for many decades. This means that people need greater protection from morality, although when assigning scores to the degree of suffering caused by pain and fear, an animal victim and a human victim should be measured on the same scale, so in that regard these sentiences are being treated as equals.
Hi everybody!
Hi David! I'm quoting your answer to Dagon:
What you say is true only if the person is part of our group, and that is so because we instinctively know that increasing the survival probability of our group increases ours too. Unless we use complete randomness to make a move, we can't make a completely free move. Even Mother Teresa didn't make free moves: she would help others only in exchange for God's love. The only moments we really care about others' feelings are when they yell at us because we harm them, or when they thank us because we got them out of trouble - thus when we are close enough to communicate - but even what we do then is selfish: we move away from people who yell at us and closer to those who thank us, thus automatically breaking or building a group in our favour. I'm pretty sure that what we do is always selfish, and I think that you are trying to design a perfectly free AGI, which I find impossible to do if the designer itself is selfish. Do you by chance think that we are not really selfish?
The most extreme altruism can be seen as selfish, but inversely, the most extreme selfishness can also be seen as altruistic: it depends on the viewpoint. We may think that Trump is selfish for closing the door to migrants, for instance, but he doesn't think so, because this way he is being altruistic to the Republicans - which is a bit selfish since he needs them to be reelected, but he doesn't feel selfish himself. Selfishness is not about sentience, since we can't feel selfish; it is about defending what we are made of, or part of. Hum...