I still have no idea whether your statement is true. It requires checking. But I hope now it is clear that no part of their proof can be trusted without some editing.
If you have enough interest to try to write a claim and a proof without references to the paper, I guess it would be nice to post it as a direct comment to the post.
btw, I mentioned the work you two have been doing here to the author and tried to get him to respond here, but unfortunately he hasn't agreed to do so.
I've recently been getting into all of this wonderful Information Theory stuff and have come across a paper (thanks to John Salvatier) that was written by Kevin H. Knuth:
Foundations of Inference
The paper sets up some intuitive minimal axioms for quantifying power sets and then (seems to) use them to derive Bayesian probability theory, information gain, and Shannon entropy. The paper also claims to use fewer assumptions than either Cox or Kolmogorov when choosing axioms. This seems like a significant foundation/unification. I'd like to hear whether others agree, and which parts of the paper you think are the significant contributions.
If a 14-page paper is too long for you, I recommend skipping to the conclusion (starting at the bottom of page 12), where there is a nice pictorial representation of the axioms and a quick summary of what they imply.