What must a sane person[1] think regarding religion? The naive first approximation is "religion is crap". But let's consider the following:
Humans are imperfectly rational creatures. Our faults include not being psychologically able to act fully in accordance with our values; for example, we can suffer burn-out if we try to push ourselves too hard.
It is thus important for us to consider which psychological habits and choices contribute to our being able to work as diligently for our values as we want to (while remaining mentally healthy). It is a theoretical possibility, a hypothesis that could be studied experimentally, that the optimal[2] psychological choices include embracing some form of Faith, i.e. beliefs that do not rest on logical proof or material evidence.
In other words, it could be that our values imply Occam's Razor should be rejected in some cases, since embracing it might mean missing out on opportunities to manipulate ourselves psychologically into being more of what we want to be.
To a person aware of The Simulation Argument, the above suggests interesting corollaries:
- Running ancestor simulations is the ultimate tool for finding out what form of Faith (if any) is most conducive to our being able to live according to our values.
- If there is a Creator and we are in fact currently in a simulation being run by that Creator, it would have been rather humorous of them to create our world such that the above method would yield "knowledge" of their existence.
[1]: Actually, what I've written here assumes we are talking about humans. Persons-in-general may be psychologically different, and theoretically capable of perfect rationality.
[2]: At least for some individuals, not necessarily all.
I don't think simulations help. Once you start simulating yourself to arbitrary precision, the simulated being has the same thoughts as you, including "Hey, I should run a simulation", and you're back to square one.
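A toy sketch of the regress, in Python (purely illustrative, not anyone's actual proposal): if the agent's decision procedure includes "simulate myself", the procedure just calls itself, and the recursion has to be cut off by fiat at some depth without the innermost level ever knowing more than the outermost one did.

```python
# Toy illustration of the regress: an agent whose decision procedure
# includes "run a simulation of myself" recurses indefinitely unless
# the depth is capped arbitrarily.

def decide(depth=0, max_depth=5):
    """Return a decision; 'simulating yourself' is just another call to decide()."""
    if depth >= max_depth:
        # Cut off by fiat; the innermost "simulation" has no better answer
        # than the outermost agent already had.
        return "act on current beliefs"
    # A perfect simulation of me has the same thought I do:
    # "Hey, I should run a simulation."
    return decide(depth + 1, max_depth)

print(decide())  # -> "act on current beliefs", regardless of max_depth
```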
More generally, when you think about how to interact with other people, you are simulating them, in a crude sense, using your own mind as a shortcut. See empathic inference.
If you become superintelligent and have lots more computing resources, then your simulations of other minds become minds themselves, with experiences indistinguishable from yours, and they make the same decisions for the same reasons. What's worse, the simulations have the same moral weight! See EY's nonperson predicates.
(This has inspired me to consider "Virtualization Decision Theory", VDT, which says: "Act as though you are setting the output of a simulation of yourself, run by beings deciding how to interact with a realer version of you that you care about more.")
Here are my earlier remarks on the simulated-world / religion parallel.
Simulations might be of limited utility (given limited computational resources), but they'd certainly help.
Without simulations, it's very difficult to run controlled experiments on how an entity behaves across a series of situations when the only changing variable is the entity's initial beliefs.
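A minimal sketch of what such an experiment could look like (Python; the scenario, the "faith" parameter, and the scoring are all invented for illustration): the same agent is run through the same fixed series of situations, only its initial belief is varied, and we compare how consistently each variant acts on its values.

```python
import random

# Toy controlled experiment (all names and numbers here are invented):
# run the same agent through the same fixed series of situations, varying
# only its initial belief, and score how often it acts on its values.

def run_agent(initial_faith: float, situations, seed: int = 0) -> float:
    """Return a crude 'acted according to values' frequency in [0, 1]."""
    rng = random.Random(seed)          # identical randomness for every variant
    morale, successes = 0.5, 0
    for difficulty in situations:
        # Faith is modelled here only as a morale buffer against hard situations.
        morale = min(1.0, max(0.0, morale + 0.1 * initial_faith - 0.2 * difficulty))
        successes += rng.random() < morale
    return successes / len(situations)

env = random.Random(42)
situations = [env.random() for _ in range(100)]   # fixed across variants
for faith in (0.0, 0.5, 1.0):                     # the only changing variable
    print(faith, run_agent(faith, situations))
```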