Followup to: The Moral Void
Three people, whom we'll call Xannon, Yancy and Zaire, are separately wandering through the forest; by chance, they happen upon a clearing, meeting each other. Introductions are performed. And then they discover, in the center of the clearing, a delicious blueberry pie.
Xannon: "A pie! What good fortune! But which of us should get it?"
Yancy: "Let us divide it fairly."
Zaire: "I agree; let the pie be distributed fairly. Who could argue against fairness?"
Xannon: "So we are agreed, then. But what is a fair division?"
Yancy: "Eh? Three equal parts, of course!"
Zaire: "Nonsense! A fair distribution is half for me, and a quarter apiece for the two of you."
Yancy: "What? How is that fair?"
Zaire: "I'm hungry, therefore I should be fed; that is fair."
Xannon: "Oh, dear. It seems we have a dispute as to what is fair. For myself, I want to divide the pie the same way as Yancy. But let us resolve this dispute over the meaning of fairness, fairly: that is, giving equal weight to each of our desires. Zaire desires the pie to be divided {1/4, 1/4, 1/2}, and Yancy and I desire the pie to be divided {1/3, 1/3, 1/3}. So the fair compromise is {11/36, 11/36, 14/36}."
Zaire: "What? That's crazy. There's two different opinions as to how fairness works—why should the opinion that happens to be yours, get twice as much weight as the opinion that happens to be mine? Do you think your theory is twice as good? I think my theory is a hundred times as good as yours! So there!"
Yancy: "Craziness indeed. Xannon, I already took Zaire's desires into account in saying that he should get 1/3 of the pie. You can't count the same factor twice. Even if we count fairness as an inherent desire, why should Zaire be rewarded for being selfish? Think about which agents thrive under your system!"
Xannon: "Alas! I was hoping that, even if we could not agree on how to distribute the pie, we could agree on a fair resolution procedure for our dispute, such as averaging our desires together. But even that hope was dashed. Now what are we to do?"
Yancy: "Xannon, you are overcomplicating things. 1/3 apiece. It's not that complicated. A fair distribution is an even split, not a distribution arrived at by a 'fair resolution procedure' that everyone agrees on. What if we'd all been raised in a society that believed that men should get twice as much pie as women? Then we would split the pie unevenly, and even though no one of us disputed the split, it would still be unfair."
Xannon: "What? Where is this 'fairness' stored if not in human minds? Who says that something is unfair if no intelligent agent does so? Not upon the stars or the mountains is 'fairness' written."
Yancy: "So what you're saying is that if you've got a whole society where women are chattel and men sell them like farm animals and it hasn't occurred to anyone that things could be other than they are, that this society is fair, and at the exact moment where someone first realizes it shouldn't have to be that way, the whole society suddenly becomes unfair."
Xannon: "How can a society be unfair without some specific party who claims injury and receives no reparation? If it hasn't occurred to anyone that things could work differently, and no one's asked for things to work differently, then—"
Yancy: "Then the women are still being treated like farm animals and that is unfair. Where's your common sense? Fairness is not agreement, fairness is symmetry."
Zaire: "Is this all working out to my getting half the pie?"
Yancy: "No."
Xannon: "I don't know... maybe as the limit of an infinite sequence of meta-meta-fairnesses..."
Zaire: "I fear I must accord with Yancy on one point, Xannon; your desire for perfect accord among us is misguided. I want half the pie. Yancy wants me to have a third of the pie. This is all there is to the world, and all there ever was. If two monkeys want the same banana, in the end one will have it, and the other will cry morality. Who gets to form the committee to decide the rules that will be used to determine what is 'fair'? Whoever it is, got the banana."
Yancy: "I wanted to give you a third of the pie, and you equate this to seizing the whole thing for myself? Small wonder that you don't want to acknowledge the existence of morality—you don't want to acknowledge that anyone can be so much less of a jerk."
Xannon: "You oversimplify the world, Zaire. Banana-fights occur across thousands and perhaps millions of species, in the animal kingdom. But if this were all there was, Homo sapiens would never have evolved moral intuitions. Why would the human animal evolve to cry morality, if the cry had no effect?"
Zaire: "To make themselves feel better."
Yancy: "Ha! You fail at evolutionary biology."
Xannon: "A murderer accosts a victim, in a dark alley; the murderer desires the victim to die, and the victim desires to live. Is there nothing more to the universe than their conflict? No, because if I happen along, I will side with the victim, and not with the murderer. The victim's plea crosses the gap of persons, to me; it is not locked up inside the victim's own mind. But the murderer cannot obtain my sympathy, nor incite me to help murder. Morality crosses the gap between persons; you might not see it in a conflict between two people, but you would see it in a society."
Yancy: "So you define morality as that which crosses the gap of persons?"
Xannon: "It seems to me that social arguments over disputed goals are how human moral intuitions arose, beyond the simple clash over bananas. So that is how I define the term."
Yancy: "Then I disagree. If someone wants to murder me, and the two of us are alone, then I am still in the right and they are still in the wrong, even if no one else is present."
Zaire: "And the murderer says, 'I am in the right, you are in the wrong'. So what?"
Xannon: "How does your statement that you are in the right, and the murderer is in the wrong, impinge upon the universe—if there is no one else present to be persuaded?"
Yancy: "It licenses me to resist being murdered; which I might not do, if I thought that my desire to avoid being murdered was wrong, and the murderer's desire to kill me was right. I can distinguish between things I merely want, and things that are right—though alas, I do not always live up to my own standards. The murderer is blind to the morality, perhaps, but that doesn't change the morality. And if we were both blind, the morality still would not change."
Xannon: "Blind? What is being seen, what sees it?"
Yancy: "You're trying to treat fairness as... I don't know, something like an array-mapped 2-place function that goes out and eats a list of human minds, and returns a list of what each person thinks is 'fair', and then averages it together. The problem with this isn't just that different people could have different ideas about fairness. It's not just that they could have different ideas about how to combine the results. It's that it leads to infinite recursion outright—passing the recursive buck. You want there to be some level on which everyone agrees, but at least some possible minds will disagree with any statement you make."
Xannon: "Isn't the whole point of fairness to let people agree on a division, instead of fighting over it?"
Yancy: "What is fair is one question, and whether someone else accepts that this is fair is another question. What is fair? That's easy: an equal division of the pie is fair. Anything else won't be fair no matter what kind of pretty arguments you put around it. Even if I gave Zaire a sixth of my pie, that might be a voluntary division but it wouldn't be a fair division. Let fairness be a simple and object-level procedure, instead of this infinite meta-recursion, and the buck will stop immediately."
Zaire: "If the word 'fair' simply means 'equal division' then why not just say 'equal division' instead of this strange additional word, 'fair'? You want the pie divided equally, I want half the pie for myself. That's the whole fact of the matter; this word 'fair' is merely an attempt to get more of the pie for yourself."
Xannon: "If that's the whole fact of the matter, why would anyone talk about 'fairness' in the first place, I wonder?"
Zaire: "Because they all share the same delusion."
Yancy: "A delusion of what? What is it that you are saying people think incorrectly the universe is like?"
Zaire: "I am under no obligation to describe other people's confusions."
Yancy: "If you can't dissolve their confusion, how can you be sure they're confused? But it seems clear enough to me that if the word fair is going to have any meaning at all, it has to finally add up to each of us getting one-third of the pie."
Xannon: "How odd it is to have a procedure of which we are more sure of the result than the procedure itself."
Zaire: "Speak for yourself."
Part of The Metaethics Sequence
Next post: "Moral Complexities"
Previous post: "Created Already In Motion"
I'm about to make a naked assertion with nothing to back it up, just to put it out there.
The purpose of morality is to prevent such an argument from ever occurring. If the moral engine of society is working correctly, then all its members will have a desire for everyone to get an equally sized portion of the pie (in this example). If there is a Zaire who believes he should get 1/2 of the pie, then there was a malfunction when morality was being programmed into him. This malfunction will lead to conflict.
View it like you would view programming a friendly AI. The purpose is to program the AI with desires that will motivate it to help humanity, and with a strong aversion to destroying humanity. If this goal is not reached, there was a failure by the programmers. I think it's been said on this blog that if you create an AI without having made it friendly, you've already lost and the game is over. It's not quite as drastic if you fail with humans, but the principle is the same: if a friendly Human Intelligence is not programmed with the desires that will help keep humanity thriving, then there was a failure by its programmers (parents/society/teachers/whoever).
Why is human morality this confusing and mysterious realm that no one seems able to fathom, when AI morality is straightforward? Is it just that humans can easily see the goal of the one (an AI that desires to help rather than hurt humanity) but for some reason can't see the goal of the other (a human that desires to help rather than hurt humanity)?
I think it's more complex than that.
Zaire's argument is that some people actually need more of "the pie" than others. Equal portions aren't necessarily fair in that situation.
For example: would it be fair if every person on the globe got an equal portion of insulin? No, obviously not. We give insulin disproportionately to diabetics, because that is more fair than distributing it equally among all people regardless of their health situation.
The disagreement here is between two perfectly understandable concepts of fairness. Both of them make sense in different ways. I see no easy solution to this myself.