Sufficiently large to exceed the inverse of our precision requirements.
Are you referring to the line with "to arbitrary precision" on the bottom of page 17?
Although they don't express themselves as clearly as they could, I don't think they mean anything like, "and hence we arrive at the exact regrading Θ in the limit by sending the number of atoms with the same valuation to infinity." Rather, I think they mean that a larger number of atoms with the same valuation puts stronger constraints on the regrading Θ, but it is never so constrained that it can't exist.
In other words, their proof accommod...
I've recently been getting into all of this wonderful Information Theory stuff and have come across a paper (thanks to John Salvatier) that was written by Kevin H. Knuth:
Foundations of Inference
The paper sets up some intuitive minimal axioms for quantifying power sets and then (seems to) use them to derive Bayesian probability theory, information gain, and Shannon entropy. The paper also claims to use fewer assumptions than both Cox and Kolmogorov when choosing axioms. This seems like a significant foundation/unification. I'd like to hear whether others agree, and which parts of the paper you think are the significant contributions.
If a 14-page paper is too long for you, I recommend skipping to the conclusion (starting at the bottom of page 12), where there is a nice pictorial representation of the axioms and a quick summary of what they imply.