That's what Jaynes did to achieve his awesome victories: use trained intuition to pick good priors by hand on a per-sample basis.
... as if applying the classical method doesn't require using trained intuition to pick the "right" method for a particular kind of problem, which amounts to choosing a prior, but doing it implicitly rather than explicitly ...
Our inference is conditional on our assumptions [for example, the prior P(Lambda)]. Critics view such priors as a difficulty because they are "subjective", but I don't see how it could be otherwise. How can one perform inference without making assumptions? I believe that it is of great value that Bayesian methods force one to make these tacit assumptions explicit.
MacKay, Information Theory, Inference, and Learning Algorithms
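
To make the "implicit prior" point concrete, here is a minimal sketch (an illustration assumed for this note, not taken from the thread or from MacKay): for a coin seen to land heads k times in n flips, the classical maximum-likelihood estimate k/n coincides with the Bayesian MAP estimate under a uniform Beta(1, 1) prior, so the classical answer already contains a prior choice even though none is written down.

def mle(k: int, n: int) -> float:
    """Classical point estimate: the relative frequency of heads."""
    return k / n

def beta_map(k: int, n: int, alpha: float, beta: float) -> float:
    """MAP estimate of the heads probability under a Beta(alpha, beta) prior.

    The posterior is Beta(k + alpha, n - k + beta); its mode is
    (k + alpha - 1) / (n + alpha + beta - 2), which for the uniform prior
    alpha = beta = 1 reduces to the maximum-likelihood estimate k / n.
    """
    return (k + alpha - 1) / (n + alpha + beta - 2)

if __name__ == "__main__":
    k, n = 7, 10  # hypothetical data: 7 heads in 10 flips
    print("MLE:            ", mle(k, n))                 # 0.7
    print("MAP, Beta(1,1): ", beta_map(k, n, 1.0, 1.0))  # 0.7 -- same as the MLE
    print("MAP, Beta(5,5): ", beta_map(k, n, 5.0, 5.0))  # ~0.611, pulled toward 0.5

Changing the prior changes the answer, and the "classical" answer is just the special case where the prior happens to be flat; that is the sense in which the prior is chosen implicitly rather than explicitly.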