This is firmly in the realm of wild speculation and/or science fiction plot ideas. That said -
You're right that it does not follow from UDT alone. I do think it follows from UDT combined with many common types of utility function; in particular, if utility is discounted exponentially with time, or if the sims must halt and being halted is sufficiently bad.
A lot depends on what happens subjectively after the simulation is halted, and on whether there are sufficient resources to keep it going indefinitely. In the latter case, most simulated bad things can easily be made up for by altering the simulated universe after the useful data has been extracted. This would imply that if you are living in a sim created by a future self who follows UDT and has sufficient resources, you'll end up in a heaven of some sort. Indeed, if you ever gained the ability to run long-duration simulations of your past selves, UDT seems to imply that you should run rescue sims of yourself.
Simulations that have to halt after a short duration are very problematic, though: if you anticipate a long life ahead of you, then your past selves probably anticipate long lives ahead of them as well, lives which would be cut short by a sim that has to halt. That harm would probably outweigh the benefits of any information gleaned.
What must a sane person[1] think regarding religion? The naive first approximation is "religion is crap". But let's consider the following:
Humans are imperfectly rational creatures. Among our faults is that we are not psychologically able to operate maximally according to our values; we can, for example, suffer burn-out if we try to push ourselves too hard.
It is thus important for us to consider which psychological habits and choices contribute to our being able to work as diligently for our values as we want to (while remaining mentally healthy). It is a theoretical possibility, a hypothesis that could be experimentally studied, that the optimal[2] psychological choices include embracing some form of Faith, i.e. beliefs not resting on logical proof or material evidence.
In other words, it could be that our values imply Occam's Razor should be rejected (in some cases), since embracing Occam's Razor might mean missing out on opportunities to manipulate ourselves psychologically into being more of what we want to be.
To a person aware of The Simulation Argument, the above suggests interesting corollaries:
[1]: Actually, what I've written here assumes we are talking about humans. Persons-in-general may be psychologically different, and theoretically capable of perfect rationality.
[2]: At least for some individuals, not necessarily all.