It's not like I'm handing other people over into slavery and torture. I don't have to worry that I'm subconsciously ignoring other people's suffering for my own benefit. I don't see the question as a moral one at all, only one of whether it would be a good idea.
I mostly understand this statement.
ETA: Also, because at least one copy remains free, I'm not depriving anyone of the chance to live their life.
I think this is irrelevant. Each instance of you is choosing to sacrifice their life and happiness, and they are not getting anything in return.
The only way I can see this actually being a good idea is if the utility you gain outweighs the utility lost by even one copy. The other scenarios you describe sound like good ideas on paper, where you don't have to fully process the consequences, but I don't believe for a second that the other instances of you would continue to think it was a good idea once their own lives were on the line.
Each instance of you is choosing to sacrifice their life and happiness.
But it's the same me. They wouldn't have done anything with their freedom that I won't with mine.
You have a button. If you press it, a happy, fulfilled person will be created in a sealed box, and then be painlessly garbage-collected fifteen minutes later. If asked, they would say that they're glad to have existed in spite of their mortality. Because they're sealed in a box, they will leave behind no bereaved friends or family. In short, this takes place in Magic Thought Experiment Land where externalities don't exist. Your choice is between creating a fifteen-minute-long happy life or not.
Do you push the button?
I suspect Eliezer would not, because it would increase the death-count of the universe by one. I would, because it would increase the life-count of the universe by fifteen minutes.
Actually, that's an oversimplification of my position. I actually believe that the important part of any algorithm is its output; that additional copies matter not at all; that the net utility of the existence of a group of entities-whose-existence-constitutes-utility is equal to the maximum of the individual utilities; and that the (terminal) utility of the existence of a particular computation is bounded below at zero. I would submit a large number of copies of myself to slavery and/or torture to gain moderate benefits to my primary copy.
(What happens to the last copy of me, of course, does affect the question of "what computation occurs or not". I would subject N out of N+1 copies of myself to torture, but not N out of N. Also, I would hesitate to torture copies of other people, on the grounds that there's a conflict of interest and I can't trust myself to reason honestly. I might feel differently after I'd been using my own fork-slaves for a while.)
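One way to write the position above down (my own notation, not anything stated in the thread): let u(c_i) be the utility of instance c_i of a computation, and take the aggregate utility of a set of copies to be

$$ U(\{c_1, \dots, c_k\}) \;=\; \max_{1 \le i \le k} \max\bigl(u(c_i),\, 0\bigr). $$

The inner max is the bounded-below-at-zero clause: a tortured copy contributes 0 rather than a negative number. The outer max is the copies-don't-add clause. Together they reproduce the N-of-N+1 distinction exactly: torturing N of N+1 copies leaves U equal to the free copy's utility, while torturing N of N drives U to 0, erasing the computation's positive contribution entirely.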
So the only remaining value in pushing the button would be my own warm fuzzies, which would break the no-externalities assumption; absent those, I'm indifferent.
But nevertheless, even knowing about the heat death of the universe, knowing that anyone born must inevitably die, I do not consider it immoral to create a person, even if we assume all else equal.