Whenever we talk about the probability of an event that we do not have perfect information about, we generally use qualitative descriptions (e.g. possible but improbable). When we do use numbers, we usually just stick to a probability range (e.g. 1/4 to 1/3). A Bayesian should be able to assign a probability estimate to any well-defined hypothesis, but for a human, trying to assign a numerical probability estimate is uncomfortable and seems arbitrary. Even when we can give a probability range, we resist collapsing that range to a single number. For instance, I'd say that Republicans are more likely than not to take over the House, but the Democrats still have a chance. After pressing myself, I managed to say that the probability of the Democratic party keeping control of the House next election is somewhere between 25% and 40%. Condensing this to 32.5% just feels wrong and arbitrary. Why is this? I have thought of three possible reasons, which I list in order of likelihood:
Maybe our brains are just built like frequentists. If we innately think of probabilities as objective properties of hypotheses, rather than as degrees of belief, it makes sense that we would resist naming an exact probability we cannot claim to know. If this is correct, it would mean that the tendency to think in frequentist terms is too entrenched to be easily untrained: I try to think like a Bayesian and yet still suffer from this effect, and I suspect the same is true of most of you.
Maybe, since our ancestors never needed to express numerical probabilities, our brains never developed the ability to do so. Even if our brains do have some internal representation of the probabilities of hypotheses, it could be buried in the decision-making machinery, and the signal could get garbled when we try to pull it out in verbal form. However, we also get uncomfortable when forced to make important decisions on limited information, which would be evidence against this explanation.
Maybe there is selection pressure against giving specific answers, because a specific answer makes it harder to inflate your accuracy after the fact, resulting in lower status. This seems highly unlikely, but since I thought of it, I felt compelled to write it down anyway.
As there are people on this forum who actually know a thing or two about cognitive science, I expect I'll get some useful responses. Discuss.
Edit: I did not mean to imply that it is wise for humans to give a precise probability estimate, only that an ideal Bayesian would give one while we don't.
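For the arithmetic behind the 32.5% figure, here is a minimal sketch, assuming, purely for illustration, that my credence is spread uniformly over the 25%-40% range; in that case the single number an ideal Bayesian would report is the expectation of that distribution, i.e. its midpoint.

```python
# Illustrative only: if my uncertainty about the "true" probability is spread
# uniformly over the 25%-40% range, the single estimate to report is the
# expected value of that distribution, which is its midpoint.
low, high = 0.25, 0.40
point_estimate = (low + high) / 2  # mean of a uniform distribution on [low, high]
print(f"{point_estimate:.1%}")     # 32.5%
```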
The answer is that human minds are not Bayesian, nor is it possible for them to become such. For just about any interesting question you may ask, the algorithm that your brain uses to find the answer is not transparent to your consciousness -- and its output doesn't include a numerical probability estimate, merely a vague and coarsely graded feeling of certainty. The only exceptions are situations where a phenomenon can be modeled mathematically in a way that allows you to work through the probability calculations explicitly, but even then, your confidence that the model captures reality ultimately comes down to a common-sense judgment produced by your non-transparent brain circuits.
In your concrete example, if you're knowledgeable about politics, you can have a good hunch about how likely a certain future election outcome is. But this insight is produced by a mostly opaque process in your brain, which doesn't give you any numerical probabilities. This is not a problem you can attack with an explicit mathematical calculation, and even if you devised a way to do so, the output of that calculation would be altogether different from the conclusion you'd reach using common sense, and it makes no sense to assign the probability calculated by the former to the latter.
Therefore, insisting on attaching a numerical probability to your common-sense conclusions makes no sense, except insofar as such numbers are sometimes used as vague figures of speech.
But attaching those estimates is clearly useful.
Consider calibration training with a tool like predictionbook.com, where you record numerical predictions and later score how well they tracked reality.
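For a sense of how that training works, here is a minimal sketch of calibration scoring, assuming you simply record each prediction as a stated probability plus whether the event happened; the data and helper names are illustrative, not PredictionBook's actual interface.

```python
# Toy sketch of calibration tracking (illustrative data and helpers only,
# NOT predictionbook.com's API). Each prediction is a stated probability
# plus whether the event actually happened.

from collections import defaultdict

predictions = [
    (0.70, True),   # e.g. "Republicans take the House" at 70%, and they did
    (0.30, False),
    (0.90, True),
    (0.60, False),
    (0.80, True),
]

def brier_score(preds):
    """Mean squared error between stated probabilities and outcomes; lower is better."""
    return sum((p - (1.0 if outcome else 0.0)) ** 2 for p, outcome in preds) / len(preds)

def calibration_table(preds):
    """Bucket predictions by stated probability (rounded to the nearest 10%)
    and report the observed frequency with which those events came true."""
    buckets = defaultdict(list)
    for p, outcome in preds:
        buckets[round(p, 1)].append(outcome)
    return {b: (len(o), sum(o) / len(o)) for b, o in sorted(buckets.items())}

print(f"Brier score: {brier_score(predictions):.3f}")
for stated, (n, observed) in calibration_table(predictions).items():
    print(f"stated ~{stated:.0%}: {n} prediction(s), {observed:.0%} came true")
```

Over many predictions, the things a well-calibrated forecaster calls "70% likely" should come true about 70% of the time; getting that feedback is exactly what makes committing to a number, rather than a range, worth the discomfort.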