I don't mean to seem like I'm picking on Kurige, but I think you have to expect a certain amount of questioning if you show up on Less Wrong and say:
One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.
"If you know it's double-think...
...how can you still believe it?" I helplessly want to say.
Or:
I chose to believe in the existence of God—deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.
If you know your belief isn't correlated to reality, how can you still believe it?
Shouldn't the gut-level realization, "Oh, wait, the sky really isn't green" follow from the realization "My map that says 'the sky is green' has no reason to be correlated with the territory"?
Well... apparently not.
One part of this puzzle may be my explanation of Moore's Paradox ("It's raining, but I don't believe it is"): that people introspectively mistake positive affect attached to a quoted belief for actual credulity.
But another part of it may just be that—contrary to the indignation I initially wanted to put forward—it's actually quite easy not to make the jump from "The map that reflects the territory would say 'X'" to actually believing "X". It takes some work to explain the ideas of minds as map-territory correspondence builders, and even then, it may take more work to get the implications on a gut level.
I realize now that when I wrote "You cannot make yourself believe the sky is green by an act of will", I wasn't just a dispassionate reporter of the existing facts. I was also trying to instill a self-fulfilling prophecy.
It may be wise to go around deliberately repeating "I can't get away with double-thinking! Deep down, I'll know it's not true! If I know my map has no reason to be correlated with the territory, that means I don't believe it!"
Because that way—if you're ever tempted to try—the thoughts "But I know this isn't really true!" and "I can't fool myself!" will always rise readily to mind; and that way, you will indeed be less likely to fool yourself successfully. You're more likely to get, on a gut level, that telling yourself X doesn't make X true: and therefore, really truly not-X.
If you keep telling yourself that you can't just deliberately choose to believe the sky is green—then you're less likely to succeed in fooling yourself on one level or another; either in the sense of really believing it, or of falling into Moore's Paradox, belief in belief, or belief in self-deception.
If you keep telling yourself that deep down you'll know—
If you keep telling yourself that you'd just look at your elaborately constructed false map, and just know that it was a false map without any expected correlation to the territory, and therefore, despite all its elaborate construction, you wouldn't be able to invest any credulity in it—
If you keep telling yourself that reflective consistency will take over and make you stop believing on the object level, once you come to the meta-level realization that the map is not reflecting—
Then when push comes to shove—you may, indeed, fail.
When it comes to deliberate self-deception, you must believe in your own inability!
Tell yourself the effort is doomed—and it will be!
Is that the power of positive thinking, or the power of negative thinking? Either way, it seems like a wise precaution.
One question here obviously concerns doxastic voluntarism (DV). You ask:
"If you know your belief isn't correlated to reality, how can you still believe it?"
Is this a rhetorical question meant to assert that if you know your belief isn't correlated to reality, you can't still believe it?
If so, then it just isn't clear that you're right. One possibility is that DV is true (there are, of course, many reasons to believe that it is). And, if DV is true, it's likely that different people have different degrees and kinds of control over their beliefs. After all, people differ with regard to all other known cognitive skills. Some irrational folks simply might have a kind of control over their beliefs that others don't have. That's an empirical question. (Though we normally think that folks who are more rational have greater control over their beliefs.)
You might, however, mean: if you know your belief isn't correlated to reality, you shouldn't still believe it.
That's a normative claim, not an empirical, psychological one. If that's what you mean, then you're in effect expressing surprise that anyone can be that irrational; and I'm a little surprised at your surprise. Kurige's is a fairly pure case, but it's not that unusual to hear things like this.