Let's suppose you're trying to figure out if something is going to be A or B, but you have no clue.
What probability should you assign to each option?
Some people might say that you should assign 50/50. After all, the argument goes, if you assigned 40/60 or 60/40, well, it sure sounds like you're favoring one option over the other.
That's a very persuasive argument, but unfortunately, it's more complicated than that.
It may be the case that A splits into two options, A1 and A2, where again we have no clue whether A1 is more likely than A2, or how either compares to B. In which case, the same argument suggests we should go with 33/33/33 (rounding down).
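Here's a minimal sketch of the two carvings (in Python, purely illustrative; the helper name is mine): applying the same "equal weight to every option" rule to {A, B} and then to {A1, A2, B} gives B a probability of 1/2 in the first carving and 1/3 in the second, even though nothing about the question itself has changed.

```python
from fractions import Fraction

def equal_split(outcomes):
    """Apply the 'no clue' rule: give every option in this carving equal weight."""
    share = Fraction(1, len(outcomes))
    return {outcome: share for outcome in outcomes}

# Same underlying question, two different carvings of the possibility space.
coarse = equal_split(["A", "B"])
fine = equal_split(["A1", "A2", "B"])

print(coarse["B"])              # 1/2
print(fine["B"])                # 1/3
print(fine["A1"] + fine["A2"])  # 2/3 -- A's total weight grew just by renaming its parts
```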
This seems to be a contradiction. What should we make of this?
First of all, I think we should accept that this exact process of reasoning leads to a contradiction. There's nothing fancy going on here. No dubious steps that could give us an out.
We tried to say that if we had no clue, the logical thing to do would be to assign equal probability to each option, but we forgot that we were implicitly assuming we should favor our particular way of carving up probability space.
In other words, we did need a clue after all. So what kind of clue is this exactly?
Well, surely in order to be justified in carving up the space a particular way, we'd need to have a reason to believe that each possibility is equally likely a priori.
Sadly, this is circular. We wanted to defend equal probabilities by asserting that our way of carving up the space was reasonable, but then we tried to assert that it was reasonable by claiming that each possibility had equal probability.
The problem is that we are making an assumption, but rather than owning it, we're trying to deny that we're making any assumption at all, i.e., "I'm not assuming a priori that A and B have equal probability based on my subjective judgement, I'm using the principle of indifference". Roll to disbelieve.
Contra Descartes, if you start with nothing, you can't get anywhere.
So the option we know the most about gets the heavier cumulative weight: familiarity lets us carve it into more sub-classes, and more sub-classes means more total probability.
I suspect that without a solid base case, this process of over-weighting familiar options could be weaponized into an argument against pursuing novel ideas about familiar topics. If I have "no clue" about, say, how a new model of the universe compares to the old models, I could split "possibilities from old models" into so many fragments that the probability of the new model, weighted equally with all the tiny slivers of the familiar old models, approaches 0 (see the sketch below).
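A toy sketch of this fragmentation move (illustrative only; the function name is made up): the more finely I carve the familiar side, the less weight the indifference rule leaves for the unfamiliar one.

```python
from fractions import Fraction

def new_model_weight(old_fragments):
    """Carve the 'old models' side into old_fragments slivers, add the one new model,
    then spread probability equally over all old_fragments + 1 options."""
    return Fraction(1, old_fragments + 1)

for n in (1, 10, 100, 10_000):
    print(n, new_model_weight(n))  # 1/2, 1/11, 1/101, 1/10001 -> heads toward 0
```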
It seems like preventing this may require that the things being compared be of similar size, but if you can guess the size of something, you're no longer entirely clueless about it.
It seems like the problem might be in the assumption of having "no clue". I think someone truly clueless about a question would be unable to divide it into parts in order to compare the parts' probabilities. I imagine being asked an advanced question about the grammar or meaning of a passage in a language that I can neither speak nor read, and I would be unable to even formulate a question to assign probabilities to.