Thought experiment:
Through whatever accident of history underlies these philosophical dilemmas, you are faced with a choice between two, and only two, mutually exclusive options:
* Choose A, and all life and sapience in the solar system (and presumably the universe), save for a sapient paperclipping AI, dies.
* Choose B, and all life and sapience in the solar system, including the paperclipping AI, dies.
Phrased another way: does the existence of any intelligence at all, even a paperclipper, have even the smallest amount of utility above no intelligence at all?
If anyone responds positively, subsequent questions would be which they would prefer: a paperclipper or a single bacterium; a paperclipper or a self-sustaining population of trilobites and their supporting ecology; a paperclipper or a self-sustaining population of australopithecines; and so forth, until the paperclipper's equivalent value is determined.
Choice B, on the grounds that a paperclipper is likely to prevent life as we know it from arising again through whatever mechanism it arose the first time.
For the slightly different case in which life both dies and is guaranteed never to arise naturally again, choice A. There's a small but nonzero chance of the paperclipper slipping enough bits to produce something worthwhile, like life. This is probably less likely than whatever jumpstarted life on Earth happening again.
For the again slightly different case in which life dies and is guaranteed never to arise again through any means, including the actions of the paperclipper, back to choice B. There are cool things in the universe that would be made less cool by turning them into paperclips.
So you think that majestic paperclip engineering cannot be cool? (Only regarding your last paragraph.)