In response to a request, I am going to do some basic unpacking of second-order desire, or "metawanting". Basically, a second-order desire or metawant is a desire about a first-order desire.
Example 1: Suppose I am very sleepy, but I want to be alert. My desire to be alert is first-order. Suppose also that there is a can of Mountain Dew handy. I know that Mountain Dew contains caffeine and that caffeine will make me alert. However, I also know that I hate Mountain Dew1. I do not want the Mountain Dew, because I know it is gross. But it would be very convenient for me if I liked Mountain Dew: then I could drink it, and I could get the useful effects of the caffeine, and satisfy my desire for alertness. So I have the following instrumental belief: wanting to drink that can of Mountain Dew would let me be alert. Generally, barring other considerations, I want things that would get me other things I want - I want a job because I want money, I want money because I can use it to buy chocolate, I want chocolate because I can use it to produce pleasant taste sensations, and I just plain want pleasant taste sensations. So, because alertness is something I want, and wanting Mountain Dew would let me get it, I want to want the Mountain Dew.
This example demonstrates a case of a second-order desire about a first-order desire that would be instrumentally useful. But it's also possible to have second-order desires about first-order desires that one simply does or doesn't care to have.
Example 2: Suppose Mimi the Heroin Addict, living up to her unfortunate name, is a heroin addict. Obviously, as a heroin addict, she spends a lot of her time wanting heroin. But this desire is upsetting to her. She wants not to want heroin, and may take actions to stop herself from wanting heroin, such as going through rehab.
One thing that is often said is that the first-order desires you "endorse" on the second level are the ones that reflect your truest self. This seems like an appealing notion in Mimi's case; I would not want to say that at her heart she just wants heroin and that's an intrinsic, important part of her. But it's not always the case that the second-order desire is the one we most want to identify with the person who has it:
Example 3: Suppose Larry the Closet Homosexual, goodness only knows why his mother would name him that, is a closet homosexual. He has been brought up to believe that homosexuality is gross and wrong. As such, his first-order desire to exchange sexual favors with his friend Ted the Next-Door Neighbor is repulsive to him when he notices it, and he wants desperately not to have this desire.
In this case, I think we're tempted to say that poor Larry is a gay guy who's had an alien second-order desire attached to him via his upbringing, not a natural homophobe whose first-order desires are insidiously eroding his real personality.
A less depressing example to round out the set:
Example 4: Suppose Olivia the Overcoming Bias Reader, whose very prescient mother predicted she would visit this site, is convinced by Eliezer's arguments about one-boxing in Newcomb's Problem. However, she's pretty sure that if Omega really turned up, boxes in hand, she would want to take both of them. She thinks this reflects an irrationality of hers. She wants to want to one-box.
1Carbonated beverages make my mouth hurt. I have developed a more generalized aversion to them after repeatedly trying to develop a taste for them and experiencing pain every time.
It's not that I think there's literally a math equation locked in the human psyche that encodes morality. It's more that there are multiple (sometimes conflicting) moral values, plus methods for resolving conflicts between them, and that the sum of these can be modeled as a large and complicated equation.
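To make that metaphor a little more concrete, here is a minimal toy sketch of what "several conflicting value terms, a conflict-resolution rule, and a sum over the lot" could look like. The value names, weights, and the override rule are all hypothetical and chosen purely for illustration; this is not a claim about what the real terms of human morality are.

```python
# Toy illustration only: "morality" modeled as a weighted sum of several
# (sometimes conflicting) value terms, plus one crude rule for resolving
# conflicts. All names and numbers are made up for the example.

def value_pleasure(outcome):
    return outcome.get("pleasure", 0.0)

def value_fairness(outcome):
    return outcome.get("fairness", 0.0)

def value_diversity(outcome):
    return outcome.get("diversity", 0.0)

# Each value term gets a weight; the weights stand in for "priority".
WEIGHTED_VALUES = [
    (1.0, value_pleasure),
    (2.0, value_fairness),
    (0.5, value_diversity),
]

def moral_score(outcome):
    """Sum the weighted value terms, with one conflict-resolution rule:
    no amount of pleasure can outweigh a severe fairness violation."""
    if outcome.get("fairness", 0.0) < -10:  # crude lexical-priority override
        return float("-inf")
    return sum(weight * value(outcome) for weight, value in WEIGHTED_VALUES)

if __name__ == "__main__":
    world_a = {"pleasure": 5, "fairness": 1, "diversity": 2}
    world_b = {"pleasure": 100, "fairness": -20, "diversity": 0}
    print(moral_score(world_a))  # finite score from the weighted sum
    print(moral_score(world_b))  # -inf: the override resolves the conflict
```

The point of the sketch is only the shape: many terms, priorities between them, and rules for what happens when they pull in different directions, all of which could in principle be written down as one very large function.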
You gave me the impression that Marvin valued "staying alive" less as an end in itself and more as a means to the end of improving the world, in particular when you said this:
This is actually something that bothers me in fiction when a character who is superhumanly good and powerful (e.g. Superman, the Doctor) risks their life to save a relatively small number of people. It seems short-sighted of them to do that, since they regularly save much larger groups of people and anticipate continuing to do so in the future, so it seems like they should preserve themselves for those people's sakes.
If you define "the good things in life" as "whatever an entity wants the most," then you can agree that whatever someone wants is "good," be it paperclips or eudaemonia. On the other hand, I'm not sure we should do this; there are some hypothetical entities I can imagine where I can't see it as ever being good that they get what they want. For instance, I can imagine a Human-Torture-Maximizer that wants to do nothing but torture human beings. It seems to me that even if there were a trillion Human-Torture-Maximizers and one human in the universe, it would still be bad for the maximizers to get what they want.
For more neutral, but still alien, preferences, I'm less sure. It seems to me that I have a right to stop Human-Torture-Maximizers from getting what they want. But would I have the right to stop paperclippers? Making the same paperclip over and over again seems like a pointless activity to me, but if the paperclippers are willing to share part of the universe with existing humans, do I have a right to stop them? I don't know, and I don't think Eliezer does either.
I think that we, and most humans, have the same basic desires; where we differ is in the objects of those desires and their relative priorities.
For instance, most people desire romantic love. But those desires usually have different objects: I desire romantic love with my girlfriend; other people desire it with their significant others. Similarly, most people desire to consume stories, but the object of that desire differs: some people like Transformers, others The Notebook.
Similarly, people often desire the same things but differ as to their priorities: how much of those things they want. Most people desire both socializing and quiet solitude, but extroverts want lots of one and less of the other, while introverts want the reverse.
In the case of the paperclippers, my first instinct is to regard opposing paperclipping as no different from the many ways humans have persecuted each other for wanting different things in the past. But then it occurred to me that paperclip-maximizing might be different, because most persecutions in the past involved persecuting people who had different objects and priorities, not people who actually had different desires. For instance, homosexuality is the same kind of desire as heterosexuality, just with a different object (same sex instead of opposite).
Does this mean it isn't bad to oppose paperclipping? I don't know; maybe, maybe not. Maybe we should just try to avoid creating paperclippers or similar creatures so we don't have to deal with the question.
This seems like a difference in priority, rather than desire, as most people would prefer differing proportions of both. It's still a legitimate disagreement, but I think it's more about finding a compromise between conflicting priorities than about totally different values.
Compounding this problem is the fact that people value diversity to some extent. We don't value all types of diversity, obviously; I think we'd all like to live in a world where people held unanimous views on the unacceptability of torturing innocent people. But we would like other people to be different from us in some ways. Most people, I think, would rather live in a world full of different people with different personalities than in a world consisting entirely of exact duplicates (in both personality and memory) of one person. So it might be impossible to reach full agreement on those other values without undermining the Value of Diversity itself.
I'm sorry, there's an ambiguity there - when you say "the sum of these", are you summing across the moral values and imperatives of a single person, or of humanity as a whole?