I don't think it's possible to give answers to all ethical dilemmas in a way that is consistent and reasonable across the board, but here my intuition is that if a mind only lasts 15 minutes, has no influence on the outside world, and leaves no 'thought children' (e.g. doodles, poems, theorems) behind after its death, then whether it experiences contentment or agony has no moral value whatsoever. Its contentment, its agony, its creation and its destruction are all utterly insignificant and devoid of ethical weight.
To create a mind purely to torture it for 15 minutes is something only an evil person would want to do (just as only an evil person would watch videos of torture for fun), but as an act, it's a mere 'symptom' of the fact that all is not well in the universe.
(However, if you were to ask "what if the person lasted 30 minutes? A week? A year? etc." then at some point I'd have to change my answer, and it might be difficult to reconcile both answers. But again, I don't believe that the 'sheaf' of human moral intuitions has a 'global section'.)
the net utility of the existence of a group of entities-whose-existence-constitutes-utility is equal to the maximum of the individual utilities
Hmm. There might be a good insight lurking around there, but I'd want to argue that (a) such entities may include 'pieces of knowledge', 'trains of thought', 'works of art', 'great cities', etc., rather than just 'people'; and (b) the 'utilities' (clearer to just say 'values') of these things might be partially rather than linearly ordered, so that the 'maximum' becomes a 'join', which may not be attained by any of them individually. (Is the best city better or worse than the best symphony, and are they better or worse than Wiles' proof of Fermat's Last Theorem, and are they better or worse than a giraffe?)
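To make the partial-order point concrete, here is a minimal sketch (purely my own illustration; the 'dimensions' and numbers are invented): if values are compared by dominance on several incommensurable dimensions, then no single item need be the 'maximum', and the join, the least upper bound, can be something that none of the items individually attains.

```python
from typing import Tuple

# Illustrative only: three made-up value "dimensions".
Value = Tuple[int, int, int]  # (aesthetic, intellectual, civic)

def dominates(a: Value, b: Value) -> bool:
    """a >= b in the dominance (product) order: at least as good on every dimension."""
    return all(x >= y for x, y in zip(a, b))

def join(a: Value, b: Value) -> Value:
    """Least upper bound in the product order: the componentwise maximum."""
    return tuple(max(x, y) for x, y in zip(a, b))

symphony = (9, 3, 1)
theorem  = (2, 9, 0)
city     = (5, 4, 9)

# Neither the symphony nor the theorem dominates the other, so there is
# no single 'maximum' among the items...
assert not dominates(symphony, theorem) and not dominates(theorem, symphony)

# ...and their join is an upper bound that none of them individually attains.
assert join(join(symphony, theorem), city) == (9, 9, 9)
```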
Your point about partial ordering is powerfully appealing.
However, I feel that any increase in utility from mere accumulation tends to be completely overridden by the increase from improving the quality of the best thing you have, such as by synthesizing a symphony and a theorem together into some deeper, polymathic insight. There might be edge cases where a large increase in quantity outweighs a small increase in quality, but I haven't thought of any yet.
(Incidentally, I just noticed that I've been using terms incorrectly a...
(Apologies to RSS users: apparently there's no draft button, but only "publish" and "publish-and-go-back-to-the-edit-screen", misleadingly labeled.)
You have a button. If you press it, a happy, fulfilled person will be created in a sealed box, and then be painlessly garbage-collected fifteen minutes later. If asked, they would say that they're glad to have existed in spite of their mortality. Because they're sealed in a box, they will leave behind no bereaved friends or family. In short, this takes place in Magic Thought Experiment Land where externalities don't exist. Your choice is between creating a fifteen-minute-long happy life or not.
Do you push the button?
I suspect Eliezer would not, because it would increase the death-count of the universe by one. I would, because it would increase the life-count of the universe by fifteen minutes.
Actually, that's an oversimplification of my position. What I believe is that the important part of any algorithm is its output, that additional copies matter not at all, that the net utility of the existence of a group of entities-whose-existence-constitutes-utility is equal to the maximum of the individual utilities, and that the (terminal) utility of the existence of a particular computation is bounded below at zero. I would submit a large number of copies of myself to slavery and/or torture to gain moderate benefits to my primary copy.
(What happens to the last copy of me, of course, does affect the question of "what computation occurs or not". I would subject N out of N+1 copies of myself to torture, but not N out of N. Also, I would hesitate to torture copies of other people, on the grounds that there's a conflict of interest and I can't trust myself to reason honestly. I might feel differently after I'd been using my own fork-slaves for a while.)
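For what it's worth, here is a toy sketch of that aggregation rule as I've stated it, with all names and numbers invented for illustration: a computation's output stands in for its identity (so extra copies collapse to one), each computation's terminal utility is floored at zero, and the group's utility is the maximum over the distinct individuals.

```python
from typing import Iterable, Tuple

# (output, utility): the output identifies the computation; the utility is the
# terminal value of that computation's existence. Names/numbers are illustrative.
Entity = Tuple[str, float]

def group_utility(entities: Iterable[Entity]) -> float:
    """Net utility of a group under the stated rule:
    - additional copies (identical output) matter not at all,
    - each computation's terminal utility is bounded below at zero,
    - the group's utility is the maximum of the distinct individual utilities.
    """
    distinct = {}
    for output, utility in entities:
        distinct[output] = max(0.0, utility)  # copies collapse; floor at zero
    return max(distinct.values(), default=0.0)

# A duplicate of the primary copy adds nothing, and a tortured fork does not
# drag the total below zero; only the best distinct computation counts.
print(group_utility([("me-prime", 10.0), ("me-prime", 10.0), ("me-fork", -50.0)]))  # 10.0
```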
So the real value of pushing the button would be my warm fuzzies, which breaks the no-externalities assumption; within that assumption, I'm indifferent.
Nevertheless, even knowing about the heat death of the universe, knowing that anyone born must inevitably die, I do not consider it immoral to create a person, even assuming all else equal.