To clarify: I mean that a sim would either be "grainier" (not in any sense detectable from inside, but in the sense that it uses pseudorandom numbers as a proxy for quantum branching), or bigger in terms of stuff, or both, since there are plenty of orders of magnitude to spread between those options.
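To make "grainier" concrete, here's a toy sketch (my own illustration, not a claim about how any actual sim would work): instead of tracking both Everett branches of a measurement, the simulator draws a single outcome from a seeded pseudorandom stream.

```python
import random

# Toy "grainy" simulator: rather than keeping every Everett branch of a
# measurement, draw one outcome pseudorandomly. From inside, the statistics
# still match the Born rule; what's lost is the branch structure itself.

rng = random.Random(42)  # fixed seed: the sim's entire "quantum" history

def measure(probabilities):
    """Collapse to a single outcome index, weighted by Born probabilities."""
    return rng.choices(range(len(probabilities)), weights=probabilities)[0]

# A full Everett treatment would keep both branches with equal amplitude;
# the grainy sim keeps just one bit per measurement.
outcome = measure([0.5, 0.5])
print("spin is", "up" if outcome == 0 else "down")
```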
As to "well-designed nanotech" on the order of 10^20... that's vaguely plausible, but it's also plausible that that just wouldn't be able to handle the wide varieties of quantum entanglement that matter in the world we observe. Remember, even simple facts like "light travels in a straight line" are, at root, a result of quantum interference, conceivable as infinite numbers of Feynman diagrams. While it is certainly possible to create heuristics, perhaps even perfect algorithms, to reproduce any one quantum effect like that, I'm skeptical that you can just induct from there up to the quantum soup we swim in. So I'd still guess 10^(10^x) with x>=2 (note: I had said x=10 but on second thought it's probably either impossible or easier than that).
I've written a prior post arguing that the Everett branching factor of reality dominates that of any plausible simulation, whether the latter runs on a von Neumann machine, a quantum machine, or some hybrid, and thus that the probability and utility weight assigned to simulations in general should be negligible. I also argued that the fact that we live in an apparently quantum-branching world can be construed as weak anthropic evidence for this idea. That post was down-modded into oblivion for reasons that aren't relevant here (style, etc.). If I replaced the text you're reading with a more fully argued version of that idea, written in a neutral style this time, would people be interested?
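For scale, here's the back-of-envelope arithmetic behind "negligible", using my guesses from above (both numbers are assumptions, not established facts):

```python
# If simulations top out near 10**20 branches while reality runs at
# 10**(10**x) with x >= 2, the share of observer-weight inside sims
# is at most about 10**(20 - 10**x).
sim_log10 = 20        # assumed log10 of a simulation's branch count
real_log10 = 10**2    # assumed log10 of reality's branch count, taking x = 2
print(f"observer-weight ratio ~ 10^{sim_log10 - real_log10}")  # 10^-80
```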