(Apologies to RSS users: apparently there's no draft button, but only "publish" and "publish-and-go-back-to-the-edit-screen", misleadingly labeled.)
You have a button. If you press it, a happy, fulfilled person will be created in a sealed box, and then be painlessly garbage-collected fifteen minutes later. If asked, they would say that they're glad to have existed in spite of their mortality. Because they're sealed in a box, they will leave behind no bereaved friends or family. In short, this takes place in Magic Thought Experiment Land where externalities don't exist. Your choice is between creating a fifteen-minute-long happy life and not creating one.
Do you push the button?
I suspect Eliezer would not, because it would increase the death-count of the universe by one. I would, because it would increase the life-count of the universe by fifteen minutes.
Actually, that's an oversimplification of my position. What I believe is that the important part of any algorithm is its output; that additional copies matter not at all; that the net utility of the existence of a group of entities-whose-existence-constitutes-utility is equal to the maximum of the individual utilities; and that the (terminal) utility of the existence of a particular computation is bounded below at zero. I would submit a large number of copies of myself to slavery and/or torture to gain moderate benefits for my primary copy.
(What happens to the last copy of me, of course, does affect the question of "what computation occurs or not". I would subject N out of N+1 copies of myself to torture, but not N out of N. Also, I would hesitate to torture copies of other people, on the grounds that there's a conflict of interest and I can't trust myself to reason honestly. I might feel differently after I'd been using my own fork-slaves for a while.)
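To make that aggregation rule concrete, here's a minimal sketch (assuming the "group" in question is a set of copies of a single computation; the function name and the utility numbers are purely illustrative):

```python
# Minimal sketch of the aggregation rule above; all names and numbers are illustrative.

def net_utility_of_copies(individual_utilities):
    """Net terminal utility of a group of copies of one computation.

    Additional copies add nothing: the group is worth its best-off copy,
    and the terminal utility of any computation is bounded below at zero.
    """
    return max(0.0, max(individual_utilities, default=0.0))

# One happy primary copy plus three tortured fork-slaves still nets the primary's value:
print(net_utility_of_copies([10.0, -50.0, -50.0, -50.0]))  # -> 10.0
```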
So the real value of pushing the button would be my warm fuzzies, which breaks the no-externalities assumption, so I'm indifferent.
But nevertheless, even knowing about the heat death of the universe, knowing that anyone born must inevitably die, I do not consider it immoral to create a person, even if we assume all else equal.
To my mind, the issue with copies is that it's only copies who remain exactly the same that "don't matter"; once you've got a bunch of copies being tortured, they're no longer identical copies and so are different people. Maybe I'm just having trouble with Sleeping Beauty-like problems, but that's only a subjective issue for decision-making (plus I'd rather spend time learning interesting things that won't require me to bite the bullet of admitting that anyone with a suitably sick and twisted mind could Pascal-mug me). Morally, I much prefer 5,000 iterations each of two happy, fulfilled minds to 10,000 iterations of the same one.
Where "copies" is used interchangeably with "future versions of you in either MWI or a similar realist interpretation of probability theory", I would subject some of them to torture only for a very large potential gain and a small risk of torture. "I" don't like torture, and I'd need a pretty damn big reward for that 1/N longshot to justify an (N-1)/N chance of brutal torture or slavery. This is of course assuming I'm at the status quo; if I were a slave or a Bagram/Laogai detainee, I would try to stay rational and not let fear make me overly risk-averse about escape attempts. I haven't tried to work out my exact beliefs on this, but as said above, if I have two options, one saving a life with certainty and the other having a 50% chance of saving two, I'd prefer the 50% chance of saving two (assuming they're isolated, i.e. two guys on a lifeboat).
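To put rough numbers on that 1/N reasoning (the utility values here are entirely made up; only the structure matters):

```python
# Back-of-the-envelope version of the 1/N longshot above; all numbers are invented.

def fork_deal_worth_taking(n_copies, u_reward, u_torture, u_status_quo=0.0):
    """Treat each of the N copies as equally likely to be 'me':
    a 1/N chance of the reward against an (N-1)/N chance of torture/slavery."""
    expected = u_reward / n_copies + u_torture * (n_copies - 1) / n_copies
    return expected >= u_status_quo

# With torture at -100, a 1-in-10 shot needs a reward of at least +900 to break even:
print(fork_deal_worth_taking(10, u_reward=900.0, u_torture=-100.0))  # True (just barely)
print(fork_deal_worth_taking(10, u_reward=500.0, u_torture=-100.0))  # False
```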
tl;dr: it's a terrible idea in that if you only have the moral authority to condemn copies
Is your last sentence missing something? It feels incomplete.