It’s the year 2045, and Dr. Evil and the Singularity Institute have been in a long and grueling race to be the first to achieve machine intelligence, thereby controlling the course of the Singularity and the fate of the universe. Unfortunately for Dr. Evil, SIAI is ahead in the game. Its Friendly AI is undergoing final testing, and Coherent Extrapolated Volition is scheduled to begin in a week. Dr. Evil learns of this news, but there’s not much he can do, or so it seems. He has succeeded in developing brain scanning and emulation technology, but the emulation speed is still way too slow to be competitive.
There is no way to catch up with SIAI's superior technology in time, but Dr. Evil suddenly realizes that maybe he doesn’t have to. CEV is supposed to give equal weighting to all of humanity, and surely uploads count as human. If he had enough storage space, he could simply upload himself, and then make a trillion copies of the upload. The rest of humanity would end up with less than 1% weight in CEV. Not perfect, but he could live with that. Unfortunately he only has enough storage for a few hundred uploads. What to do…
Ah ha, compression! A trillion identical copies of an object would compress down to be only a little bit larger than one copy. But would CEV count compressed identical copies to be separate individuals? Maybe, maybe not. To be sure, Dr. Evil gives each copy a unique experience before adding it to the giant compressed archive. Since they still share almost all of the same information, a trillion copies, after compression, just manages to fit inside the available space.
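The compression intuition here can be checked at a small scale: many near-identical copies of a blob, each with a small unique suffix, compress down to not much more than one copy plus a per-copy overhead. A minimal sketch using Python's standard-library `zlib` (the copy count, blob size, and "unique experience" tag are illustrative stand-ins, not Dr. Evil's actual numbers):

```python
import random
import zlib

# One "upload": ~16 KB of incompressible (pseudo-random) data, so that a
# single copy on its own gains nothing from compression.
random.seed(0)
base = bytes(random.randrange(256) for _ in range(16000))

# A thousand copies, each given a tiny unique experience (a serial number)
# before being concatenated into one archive.
copies = b"".join(base + b"|unique-experience-%d" % i for i in range(1000))

# zlib's DEFLATE back-references let each copy be encoded mostly as
# pointers into the previous copy, so the archive shrinks dramatically.
compressed = zlib.compress(copies, level=9)

print(len(copies), len(compressed))
assert zlib.decompress(compressed) == copies
```

On a run of this sketch the compressed archive comes out far smaller than the raw concatenation, while a single copy of `base` alone barely compresses at all. (DEFLATE's 32 KB window only reaches the immediately preceding copy here; a real trillion-copy archive would want a long-range deduplicating compressor, but the principle is the same.)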
Now Dr. Evil sits back and relaxes. Come next week, the Singularity Institute and the rest of humanity are in for a rather rude surprise!
That's not a heuristic in the sense I use the word in the comment above; it's a (rather weak) description of a goal, not rules for achieving it.
The main argument (and I changed my mind on this recently) is the same as for why another normal human's preference isn't that bad: sympathy. If human preference has a component of sympathy, of caring about other human-like persons' preferences, then a sizable slice of the control-of-the-universe pie always goes to everyone's preference, even if it is orders of magnitude smaller than the slice going to the preference in control. I don't expect that even the most twisted human has a whole aspect of preference completely absent, even if it is manifested to a smaller degree than usual.
This apparently changes my position on the danger of value drift, and on modifying the minds of uploads in particular. Even though we will lose some preference to value drift, we won't lose it completely, so long as people holding the original preference persist.
Humans also have other preferences that conflict with sympathy, for example the desire to see one's enemies suffer. If sympathy is manifested to a sufficiently small degree, then it won't be enough to override those other preferences.
Are you aware of what has been happening in Congo, for example?