More specifically, I'm pretty sure we humans don't have any negative parts of our utility function that grow exponentially with "badness," so there's no bad outcome that can overcome the exponential decrease in probability with program size to actually be a significant factor.
Are you going with Torture vs. Dust Specks here? Or do you just reject Many Worlds? (Or have I missed something?)
It seems to this layman that using quantum randomization would give us no increase, or only a tiny increase, in utility per world, relative to overwriting each bit with 0 or a piece of Lorem Ipsum. And as with Dust Specks, if we actually knew we might have prevented torture, then I'd get a warm feeling, which should count toward the total.
From David Deutsch's The Beginning of Infinity:
I'm not so sure we have the computing power to "simulate a person," but suppose we did. (Perhaps we will soon.) How would you respond to this worry?