(Apologies to RSS users: apparently there's no draft button, but only "publish" and "publish-and-go-back-to-the-edit-screen", misleadingly labeled.)
You have a button. If you press it, a happy, fulfilled person will be created in a sealed box, and then be painlessly garbage-collected fifteen minutes later. If asked, they would say that they're glad to have existed in spite of their mortality. Because they're sealed in a box, they will leave behind no bereaved friends or family. In short, this takes place in Magic Thought Experiment Land where externalities don't exist. Your choice is between creating a fifteen-minute-long happy life or not.
Do you push the button?
I suspect Eliezer would not, because it would increase the death-count of the universe by one. I would, because it would increase the life-count of the universe by fifteen minutes.
Actually, that's an oversimplification of my position. What I believe is that the important part of any algorithm is its output; that additional copies matter not at all; that the net utility of the existence of a group of entities-whose-existence-constitutes-utility is equal to the maximum of the individual utilities; and that the (terminal) utility of the existence of a particular computation is bounded below at zero. I would submit a large number of copies of myself to slavery and/or torture to gain moderate benefits for my primary copy.
(What happens to the last copy of me does, of course, affect the question of whether a given computation occurs at all. I would subject N out of N+1 copies of myself to torture, but not N out of N. Also, I would hesitate to torture copies of other people, on the grounds that there's a conflict of interest and I can't trust myself to reason honestly about it. I might feel differently after I'd been using my own fork-slaves for a while.)
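To spell that out a little (just a sketch, with notation introduced here for illustration: u(c_i) is the terminal utility of copy c_i's existence, and U is the net utility of the whole group):

\[
U(\{c_1, c_2, \dots, c_n\}) \;=\; \max\bigl(0,\; u(c_1), u(c_2), \dots, u(c_n)\bigr)
\]

On this reading, adding another identical copy leaves the maximum, and hence U, unchanged, and torturing N out of N+1 copies differs from torturing N out of N because only the latter removes the untortured computation from the picture entirely.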
So the only real value in pushing the button would be my own warm fuzzies, which would break the no-externalities assumption; within the stipulated setup, I'm indifferent.
Nevertheless, even knowing about the heat death of the universe, and knowing that anyone born must inevitably die, I do not consider it immoral to create a person, all else being equal.
Um.
I think I agree with you, but I'm not sure, and I can't tell whether the problem is language or whether I'm just really confused.
For the sake of clarity, let's consider a specific hypothetical: Sam is given a button which, if pressed, Sam believes will do two things. First, it will cause there to be two identical-at-the-moment-of-pressing copies of Sam. Second, it will cause one of the copies (call it Sam-X) to suffer a penalty P, and the other copy (call it Sam-Y) to receive a benefit B.
If I've understood you correctly, you would say that for Sam to press that button is an ethical choice, though it might not be a wise choice, depending on the value of (B-P).
Yes?
The relevant formula might be something other than (B-P), depending on Sam's utility function, but otherwise that's essentially what I believe.
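For concreteness, here is one way that caveat could cash out (a toy sketch, not anything specified above: w stands for Sam's welfare at the moment of pressing, and u for Sam's utility function over welfare, both introduced purely for illustration):

\[
\Delta \;=\; \bigl[u(w + B) - u(w)\bigr] \;-\; \bigl[u(w) - u(w - P)\bigr]
\]

If u is linear, \(\Delta\) reduces to B - P; if it isn't, it's the sign of \(\Delta\) rather than of B - P that tracks whether pressing is wise.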