Well, I personally don't want to be tortured, so I choose the dust speck.
Even if I weren't personally involved and were deciding on morality alone rather than personal interest, average utilitarianism tells me that I should choose the dust speck. (Better that 100% of all people suffer a dust speck than that 100% of all people suffer torture.)
Do you generally endorse average utilitarianism? E.g., suppose you can press a button to create a new world, completely isolated from all others, containing 10^10 people 10x happier than typical present-day Americans. Do you press it if what currently exists is a world with 10^10 people only 9x happier than typical present-day Americans, and refrain from pressing it if they are 11x happier instead?
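For concreteness, here is the arithmetic as a rough sketch, assuming "average" means averaging happiness over everyone in existence across both worlds (the question leaves the scope open):

press at 9x: (10^10 · 9 + 10^10 · 10) / (2 · 10^10) = 9.5 > 9, so pressing raises the average;
refrain at 11x: (10^10 · 11 + 10^10 · 10) / (2 · 10^10) = 10.5 < 11, so pressing would lower it.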
You're given the option to torture everyone in the universe, or inflict a dust speck on everyone in the universe. Either you are the only one in the universe, or there are 3^^^3 perfect copies of you (far enough apart that you will never meet). In the latter case, every copy of you is given the choice, and all make the same choice. (Edit: if they choose specks, each person gets one dust speck. This was not meant to be ambiguous.)
As it happens, a perfect and truthful predictor has declared that you will choose torture iff you are alone.
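(Spelling out the payoffs this implies, assuming the predictor is right and all copies decide identically: choosing torture means you are alone, so exactly one person is tortured; choosing specks means there are 3^^^3 copies, so 3^^^3 people each get one dust speck. Effectively the original torture-vs-specks numbers, reconstructed through the predictor.)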
What do you do?
How does your answer change if the predictor created the copies of you conditional on its prediction?
How does your answer change if, in addition to that, you're told you are the original?