What is the probability that my apartment will be struck by a meteorite tomorrow? Based on the information I have, I might say something like 10^-18. Now suppose I wanted to approximate that probability with a different number. Which is a better approximation: 0 or 1/2?
The answer depends on what we mean by "better," and this is a situation where epistemic (truthseeking) and instrumental (useful) rationality will disagree.
As an epistemic rationalist, I would say that 1/2 is a better approximation than 0, because the Kullback-Leibler divergence is (about) 1 bit for the former, and infinity for the latter. This means that my expected Bayes score drops by one bit if I use 1/2 instead of 10^-18, but it drops to minus infinity if I use 0, and any probability conditional on a meteorite striking my apartment would be undefined; if a meteorite did indeed strike, I would instantly fall to the lowest layer of Bayesian hell. This is too horrible a fate to imagine, so I would have to go with a probability of 1/2.
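To make the comparison concrete, here is a minimal sketch that computes the KL divergence (in bits) from the "true" distribution to each approximation, for a single binary event. The probability 10^-18 is the one assumed in the example above; the function name is just for illustration.

```python
import math

def kl_divergence_bits(p, q):
    """D(p || q) in bits, where p is the true probability of a binary
    event and q is the probability the approximation assigns to it."""
    def term(a, b):
        if a == 0:
            return 0.0  # convention: 0 * log(0/b) = 0
        if b == 0:
            return math.inf  # a possible event assigned probability 0
        return a * math.log2(a / b)
    return term(p, q) + term(1 - p, 1 - q)

p = 1e-18  # assumed probability of a meteorite strike

# Approximating with 1/2 costs about one bit of expected Bayes score...
print(kl_divergence_bits(p, 0.5))

# ...while approximating with 0 costs infinitely many bits.
print(kl_divergence_bits(p, 0.0))
```

The infinite divergence in the second case is exactly the "lowest layer of Bayesian hell": once the event you assigned probability 0 occurs, no finite amount of later evidence can recover your score.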
As an instrumental rationalist, I would say that 0 is a better approximation than 1/2. Even if a meteorite does strike my apartment, I will suffer only a finite amount of harm. If I'm still alive, I won't lose all of my powers as a predictor, even if I assigned a probability of 0; I will simply rationalize some other explanation for the destruction of my apartment. Assigning a probability of 1/2 would force me to actually plan for the meteorite strike, perhaps by moving all of my stuff out of the apartment. This is a totally unreasonable price to pay, so I would have to go with a probability of 0.
I hope this can be a simple and uncontroversial example of the difference between epistemic and instrumental rationality. While the normative theory of probabilities is the same for any rationalist, the sorts of approximations a bounded rationalist would prefer can differ greatly.
To the extent that this is true, perhaps the very notion of an epistemic rationalist (perhaps also of epistemic rationality) is incoherent. ("Epistemic rationality means acting so as to maximize one's accuracy." "Ah, but hidden in that word accuracy is some sort of evaluation, which you aren't allowed to have.") But it sure seems like a useful notion.
I propose that there is at least one useful notion of epistemic rationality; in fact, there's one for each viable notion of what counts as better accuracy. Since real people have utility functions, calling a real person an epistemic rationalist is really shorthand for "has a utility function that highly values accuracy-in-some-particular-sense," and one can usefully talk about epistemic rationality in general, meaning something like "things that are true about anyone who's an epistemic rationalist in any of that term's many specific senses." Finally, it's at least a defensible claim that something enough like K-L divergence to make Peter's argument go through is likely to be part of any viable notion of accuracy.