cross-posted on the EA Forum
I'm interested in questions of the form: "I have a bit of metadata/structure about the question, but I know very little about its content (or alternatively, I'm too worried about biases/hacks in how I think about the problem, or in which pieces of information I pay attention to). In those situations, what prior should I start with?"
I'm not sure if there is a more technical term than "low-information prior."
Some examples of what I found useful recently:
1. Laplace's Rule of Succession, for when the underlying mechanism is unknown.
2. Percentage of binary questions that resolve as "yes" on Metaculus. It turns out that of all binary (Yes-No) questions asked on the prediction platform Metaculus, 29% resolved yes. This means that even if you know nothing about the content of a Metaculus question, 29% is a reasonable starting point for a randomly selected binary question.
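For concreteness, here is a minimal sketch of Laplace's rule of succession (the function name is my own, not from the post): given s successes in n trials, it estimates the probability of the next success as (s + 1) / (n + 2), which is equivalent to starting from a uniform Beta(1, 1) prior and updating on the data.

```python
def rule_of_succession(successes: int, trials: int) -> float:
    """Laplace's rule of succession: estimated probability that the
    next trial succeeds, given `successes` out of `trials` so far.
    Equivalent to updating a uniform Beta(1, 1) prior on the data."""
    return (successes + 1) / (trials + 2)

# With no data at all, the estimate is the uniform prior's mean, 1/2:
print(rule_of_succession(0, 0))  # 0.5

# "The sun has risen 1000 days in a row" gives 1001/1002, not certainty:
print(rule_of_succession(1000, 1000))
```

Note that the estimate never reaches 0 or 1 on finite data, which is exactly the property you want from a low-information prior when the underlying mechanism is unknown.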
In both cases, obviously there are reasons to override the prior (for example, you can arbitrarily flip all questions on Metaculus such that your prior is now 71%). However (I claim), having a decent prior is nonetheless useful in practice, even if it's theoretically unprincipled.
I'd be interested in seeing something like 5-10 examples of low-information priors as useful as the rule of succession or the Metaculus binary prior.
One idea that comes to mind is that surface-level information sources (e.g. news articles) are often *'correct'* on a basic level, but really more like 'yes, but it's complicated' on a deeper level.
The best illustration of this is when you've seen a surface-level description of something you know deeply, and you realise how wrong it is, or at least how much nuance it's missing. The next step is to realise that it's like that with everything, i.e. all the things you're not an expert on.