Thought experiment:
Through whatever accident of history underlies these philosophical dilemmas, you are faced with a choice between two, and only two, mutually exclusive options:
* Choose A, and all life and sapience in the solar system (and presumably the universe), save for a sapient paperclipping AI, dies.
* Choose B, and all life and sapience in the solar system, including the paperclipping AI, dies.
Phrased another way: does the existence of any intelligence at all, even a paperclipper, have even the smallest amount of utility above no intelligence at all?
If anyone responds positively, subsequent questions would be which they would prefer: a paperclipper or a single bacterium; a paperclipper or a self-sustaining population of trilobites and their supporting ecology; a paperclipper or a self-sustaining population of australopithecines; and so forth, until the point of equivalent value is found.
:) Usually, I'm the one who has to point this idea out when such discussions come up.
But to answer your question - it would be the you-of-the-present making a judgment call about which future scenario present-you values more. While it's true that there won't be a future-you in either future to experience it, that doesn't mean present-you can't prefer one outcome to the other.
Because present-me knows I won't be around to experience either future, present-me doesn't care either way. I'd flip a coin if I had to decide.