What is the probability that my apartment will be struck by a meteorite tomorrow? Based on the information I have, I might say something like 10⁻¹⁸. Now suppose I wanted to approximate that probability with a different number. Which is a better approximation: 0 or 1/2?
The answer depends on what we mean by "better," and this is a situation where epistemic (truthseeking) and instrumental (useful) rationality will disagree.
As an epistemic rationalist, I would say that 1/2 is a better approximation than 0, because the Kullback-Leibler divergence from my actual belief is about 1 bit for the former and infinite for the latter. This means that my expected Bayes score drops by one bit if I use 1/2 instead of 10⁻¹⁸, but it drops to minus infinity if I use 0, and any probability conditional on a meteorite striking my apartment would be undefined. If a meteorite did indeed strike, I would instantly fall to the lowest layer of Bayesian hell. This is too horrible a fate to imagine, so I would have to go with a probability of 1/2.
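To make the arithmetic concrete, here is a minimal Python sketch (mine, not from the post) computing the KL divergence from the believed distribution to each approximation; the 10⁻¹⁸ figure is the one stated above.

```python
import math

def kl_divergence(p, q):
    """D(P || Q) in bits, where P and Q are Bernoulli distributions
    with success probabilities p and q."""
    def term(p_i, q_i):
        if p_i == 0.0:
            return 0.0       # convention: 0 * log(0/q) = 0
        if q_i == 0.0:
            return math.inf  # P puts mass where Q puts none
        return p_i * math.log2(p_i / q_i)
    return term(p, q) + term(1 - p, 1 - q)

p_meteorite = 1e-18  # the probability I actually assign

print(kl_divergence(p_meteorite, 0.5))  # ~1.0 bit lost by rounding to 1/2
print(kl_divergence(p_meteorite, 0.0))  # inf: Q assigns zero to a possible event
```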
As an instrumental rationalist, I would say that 0 is a better approximation than 1/2. Even if a meteorite does strike my apartment, I will suffer only a finite amount of harm. If I'm still alive, I won't lose all of my powers as a predictor even though I assigned a probability of 0; I will simply rationalize some other explanation for the destruction of my apartment. Assigning a probability of 1/2, on the other hand, would force me to actually plan for the meteorite strike, perhaps by moving all of my stuff out of the apartment. That is a totally unreasonable price to pay, so I would have to go with a probability of 0.
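The same trade-off can be put in expected-cost terms. A toy comparison, with entirely made-up dollar figures for the harm and the planning cost (neither appears in the post):

```python
p_strike  = 1e-18  # believed probability of a meteorite strike
harm      = 1e7    # hypothetical loss if an unplanned-for strike occurs
plan_cost = 1e3    # hypothetical cost of moving everything out "just in case"

# Rounding to 0 means never planning: bear the (tiny) expected harm.
expected_cost_round_to_0 = p_strike * harm  # ~1e-11

# Rounding to 1/2 makes planning look worthwhile, so the plan cost
# is paid with certainty.
expected_cost_round_to_half = plan_cost     # 1e3

print(expected_cost_round_to_0, expected_cost_round_to_half)
```

On these (or any remotely realistic) numbers, the instrumental verdict goes to 0.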
I hope this serves as a simple and uncontroversial example of the difference between epistemic and instrumental rationality. While the normative theory of probability is the same for any rationalist, the approximations a bounded rationalist should prefer can differ greatly depending on which kind of rationality is at stake.
So what lesson does a rationalist draw from this? That what is best for the Bayesian mathematical model is not best in practice? That conserving information is not always "good"?
Also, rationalizing some other explanation seems distinctly contrary to what an instrumental rationalist would do. It seems more likely he'd say, "I was wrong; there was actually an infinitesimal probability of a meteorite strike that I previously ignored because of incomplete information, negligence, or a rounding error."