Is this somehow incorrect?
All else being equal, something that takes n bits to specify has probability proportional to 2^-n. So if hypothesis A takes 110 bits and hypothesis B takes 100, then A is about 1000x less probable.
Exactly what "all else being equal" means is somewhat negotiable.
Anyway, the point is that the natural way to measure complexity is in bits, and probability varies exponentially, not linearly, with number of bits.
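The arithmetic behind the example above can be sketched quickly (the bit counts are the ones from the example; everything else is just illustration):

```python
# Probability is taken proportional to 2^-n, where n is the
# description length of the hypothesis in bits.
bits_a = 110  # hypothesis A
bits_b = 100  # hypothesis B

# Ratio P(A)/P(B) = 2^-110 / 2^-100 = 2^-10
ratio = 2 ** -(bits_a - bits_b)
print(ratio)  # 1/1024 ≈ 0.000977, i.e. A is about 1000x less probable
```

Ten extra bits multiply the improbability by 2^10 = 1024, which is where "about 1000x" comes from.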
> So if hypothesis A takes 110 bits and hypothesis B takes 100, then A is about 1000x less probable.
Yes, and hypothesis A is also 1024x as complex, since it takes ten more bits to specify.
> Anyway, the point is that the natural way to measure complexity is in bits, and probability varies exponentially, not linearly, with number of bits.
...it seems that our disagreement here is in the measure of complexity, and not the measure of probability. My measure of complexity is pretty much the inverse of probability, while you're working on a log scale by measuring it in terms of a number of bits.
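The two measures being contrasted can be put side by side (a toy illustration under the thread's own definitions, not anything either commenter wrote):

```python
import math

p = 2 ** -100  # probability of a hypothesis that takes 100 bits to specify

inverse_complexity = 1 / p       # complexity as the inverse of probability
bits_complexity = -math.log2(p)  # complexity on a log scale, in bits

print(inverse_complexity)  # 2^100
print(bits_complexity)     # 100.0
```

The two are equivalent up to a logarithm: multiplying the inverse-probability measure by 1024 is the same as adding ten bits on the log scale.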