In this example, Joe's belief that he's smart and beautiful does pay rent in anticipated experience. He anticipates a favorable reaction if he approaches a girl with his gimmick and pickup line. As it happens, his inaccurate beliefs are paying rent in inaccurate anticipated experiences, and he goes wrong epistemically by not noticing that his actual experience differs from his anticipated experience and updating his beliefs accordingly.
The virtue of making beliefs pay rent in anticipated experience protects you from forming incoherent beliefs, maps that don't correspond to any territory. Joe's beliefs are coherent and correspond to a part of the territory, but they are persistently wrong.
Well, he might. Or, rather, there might be available ways of becoming smarter or prettier for which jettisoning his false beliefs is a necessary precondition.
But, admittedly, he might not.
Anyway, sure, if Joe "terminally" values his beliefs about the world, then he gets just as much utility out of operating within a VR simulation of his beliefs as out of operating in the world itself. Or more, if his beliefs turn out to be inconsistent with the world.
That said, I don't actually know anyone for whom this is true.
I don't know too many theist janitors, either. Doesn't mean they don't exist.
From my perspective, it sucks to be them. But once you're them, all you can do is minimize your misery by finding some local utility maximum and staying there.