> the assumption that our experiences should be more-or-less ordinary
How do you know what to call "ordinary"? If you think you're being simulated, then you need to predict what kinds and amounts of simulations exist besides the one you're in, as well as how extensive and precise your own simulation is in past time and space, not just in its future.
And this hypothesis is designed to escape the Doomsday Argument (DA); escaping it is the only reason to think you are simulated in the first place.
There are lots of reasons other than the DA to think we're being simulated: e.g. Bostrom's Simulation Argument (posthumans are likely to run ancestor simulations). The DA is a very weak argument for simulation: it is equally consistent with there being an extinction event in our future.
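To make the counting explicit, here is a minimal sketch of the Simulation Argument's bookkeeping (the function name, parameters, and numbers are illustrative assumptions, and Bostrom's separate "fraction interested in running simulations" term is folded into f_p for brevity):

```python
# Illustrative sketch of the Simulation Argument's counting
# (parameter values are made up, not estimates).
def simulated_fraction(f_p: float, n_sims: float) -> float:
    """Expected fraction of human-like observers who are simulated,
    when a fraction f_p of civilizations reach a posthuman stage and
    each such civilization runs n_sims ancestor simulations."""
    return (f_p * n_sims) / (f_p * n_sims + 1)

# Even a small chance of a simulating posthuman stage dominates once
# each such civilization runs many simulations:
print(simulated_fraction(f_p=0.001, n_sims=1_000_000))  # ~0.999
```

The conclusion is driven entirely by n_sims being large, not by any DA-style reasoning about our position in time.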
> If you think you're being simulated, then you need to predict what kinds and amounts of simulations exist besides the one you're in, as well as how extensive and precise your own simulation is in past time and space, not just in its future.
I don't see why simulated observers would ever greatly outnumber physical observers. That would require an incredibly inefficient allocation of resources.
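Whether they outnumber physical observers comes down to observers per unit of resource. A toy model of that trade-off (the function, parameters, and numbers below are assumptions for illustration, not claims about actual costs):

```python
# Toy model of the resource-allocation question
# (all parameter values are illustrative assumptions).
def sim_to_phys_ratio(r: float, cost_ratio: float) -> float:
    """Ratio of simulated to physical observers when a fraction r of
    resources goes to simulations and one simulated observer costs
    cost_ratio times as much as one physical observer."""
    return (r / cost_ratio) / (1 - r)

# If simulated minds are much cheaper (cost_ratio << 1), even a small
# allocation r yields far more simulated than physical observers:
print(sim_to_phys_ratio(r=0.01, cost_ratio=1e-6))  # ~10_000
# If they cost about as much as physical minds, the objection holds:
print(sim_to_phys_ratio(r=0.01, cost_ratio=1.0))   # ~0.01
```

On this toy model the objection succeeds only if simulated minds cost roughly as much as physical ones; if they are much cheaper, a tiny allocation suffices.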
> There are lots of reasons other than the DA to think we're being simulated: e.g. Bostrom's Simulation Argument (posthumans are likely to run ancestor simulations).
A self-modifying AI is built to serve humanity. The builders know, of course, that this is much riskier than it seems, because its success would render their own observations extremely rare. To solve the problem, they direct the AI to create billions of simulated humanities, in the hope that this will serve as a Schelling point for them and make their own universe almost certainly simulated.
Plausible?
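For what it's worth, the scenario's arithmetic is easy to make explicit (a sketch under the scenario's own assumptions, using simple self-sampling over indistinguishable civilizations; the number is illustrative):

```python
# Sketch of the scenario's self-location arithmetic (assumes simple
# self-sampling over indistinguishable civilizations; N is illustrative).
def p_simulated(n_simulations: int) -> float:
    """Probability of being simulated, given n_simulations simulated
    humanities plus one basement humanity, all indistinguishable
    from the inside."""
    return n_simulations / (n_simulations + 1)

print(p_simulated(10**9))  # ~0.999999999, i.e. "almost certainly" simulated
```

For large N this overwhelms any DA-style update, which is exactly what the builders are counting on.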