We in the rationalist community have assumed that a dual allegiance to instrumental and epistemic rationality is feasible because true beliefs help with winning. But the semi-autonomy of the near and far modes raises questions about the compatibility of the two sovereigns' jurisdictions: false far beliefs may serve to advance near interests.
First, the basics of construal-level theory. See Trope and Liberman, "Construal-Level Theory of Psychological Distance" (2010) 117 Psychological Review 440. When you look at an object in the distance and look at the same object nearby, you focus on different features. Distal information is high-level, global, central, and unchanging, whereas local information is low-level, detailed, incidental, and changing. Construal-level theorists term distal information far and local information near, and they extend these categories broadly to embrace psychological distance. Dimensions other than physical distance can be conceived as psychological distance by analogy, and these other dimensions invoke mindsets similar to those physical distance invokes.
I discuss construal-level theory in the blogs Disputed Issues and Juridical Coherence, but Robin Hanson at Overcoming Bias has been one of the theory's most prolific advocates. He gives the theory an unusual twist when he maintains that the "far" mode is largely consumed with the management of social appearances.
With this twist, Hanson effectively drives a wedge between instrumental and epistemic rationality, because far beliefs may help with winning despite or even because of their falsity. Hanson doesn't shrink from the implications of instrumental rationality coupled with his version of construal-level theory. Citing research reports that the religious lead happier and more moral lives, he now advocates becoming religious:
Perhaps, like me, you find religious beliefs about Gods, spirits, etc. to be insufficiently supported by evidence, coherence, or simplicity to be a likely approximation to the truth. Even so, ask yourself: why care so much about truth? Yes, you probably think you care about believing truth – but isn’t it more plausible that you mainly care about thinking you like truth? Doesn’t that have a more plausible evolutionary origin than actually caring about far truth? ("What Use Far Truth?")
Instrumental rationalists could practice strict epistemic rationality if they defined winning as gaining true belief, but even Hanson, no doubt a dedicated intellectual, doesn't value truth that much, at least not "far" truth. Yet how many rationalists have cut their teeth on the irrationality of religion? How many have replied to religious propaganda about the benefits of religion with disdain for invoking mere prudential benefit where truth is at stake? As an ideal, epistemic rationality, it seems to me, fares better than instrumental rationality.
There should always come a point at which epistemic rationality gives way to instrumental rationality.
Consider: Omega appears and tells you that unless you cease believing in the existence of red pandas, it will destroy everything you value in the world.
Now suppose Omega has a good track record of doing this, and it turns out for whatever reason that it wouldn't be too hard to stop believing in red pandas. Then given how inconsequential your belief in red pandas probably is, it seems that you ought to stop believing in them.
This is a trivial example, but it illustrates the point: if the stakes get high enough, it may be worth sacrificing epistemic rationality to a greater or lesser degree.
I agree that the conclusion follows from the premises, but nonetheless it's hypothetical scenarios like this which cause people to distrust hypothetical scenarios. There is no Omega, and you can't magically stop believing in red pandas; when people rationalize the utility of known falsehoods, what happens in their minds is complicated, divorces endorsement from modeling, and bears no resemblance to what they believe they're doing to themselves. Anti-epistemology is a huge actual danger of actual life.