tl;dr: Just because it doesn't seem like we should be able to have beliefs we acknowledge to be irrational doesn't mean we don't have them. If this happens to you, here's a tool to help conceptualize and work around that phenomenon.
There's a general feeling that by the time you've acknowledged that some belief you hold is not based on rational evidence, it has already evaporated. The very act of realizing it's not something you should believe makes it go away. If that's your experience, I applaud your well-organized mind! It's serving you well. This is exactly as it should be.
If only we were all so lucky.
Brains are sticky things. They will hang onto comfortable beliefs that don't make sense anymore, view the world through familiar filters that should have been discarded long ago, and see significances and patterns and illusions even when the rest of the brain knows they're irrelevant. Beliefs should be formed on the basis of sound evidence. But that's not the only mechanism we have in our skulls to form them. We're equipped to come by them in other ways, too. It's been observed[1] that believing contradictions is only bad because it entails believing falsehoods. If you can't get rid of one belief in a contradiction, and that's the false one, then believing a contradiction is the best you can do, because then at least you have the true belief too.
The mechanism I use to deal with this is to label my beliefs "official" and "unofficial". My official beliefs have a second-order stamp of approval. I believe them, and I believe that I should believe them. Meanwhile, the "unofficial" beliefs are those I can't get rid of, or am not motivated to try really hard to get rid of because they aren't problematic enough to be worth the trouble. They might or might not outright contradict an official belief, but regardless, I try not to act on them.
To those of you with well-ordered minds (for such lucky people seem to exist, if we believe some of the self-reports on this very site), this probably sounds outrageous. If I know they're probably not true, why not just discard them? And I do know that. But they still make me expect things. They make me surprised when those expectations are violated. If I'm asked about their subjects when tired, or not prepared for the question, they'll leap out of my mouth before I can stop them, and they won't feel like lies - because they're not. They're beliefs. I just don't like them very much.
I'll supply an example. I have a rather dreadful phobia of guns, and accordingly, I think they should be illegal. The phobia is a terrible reason to believe in the appropriateness of such a ban: it doesn't even stand in for an informative real experience, since I haven't lost a family member to a stray bullet or anything of the kind. I certainly don't assent to the general proposition "anything that scares me should be illegal". I have no other reasons to hold this belief, except for a vague affection for a cluster of political opinions which includes something along those lines. Neither the fear nor the affection is a reason I endorse for believing things in general, or this in particular. So this is an unofficial belief. Whenever I can, I avoid acting on it. Until I locate some good reasons to believe something about the topic, I officially have no opinion. I avoid putting myself in situations where I might act on the unofficial belief, in the same way I might avoid a store with contents for which I have an unendorsed desire, like a candy shop. For instance, when I read about political candidates' stances on issues, I skip whatever section talks about gun control.
Because I know my brain collects junk like this, I try to avoid making up my mind until I have a pretty good idea of what's going on. Once I tell myself, "Okay, I've decided", I run the risk of lodging something permanently in my cortex that won't release its stranglehold on my thought process until kingdom come. I use tools like "temporarily operating under the assumption that" (some proposition) or declaring myself "unqualified to have an opinion about" (some subject). The longer I hold my opinions in a state of uncertainty, the less chance that I wind up with a permanent epistemic parasite - one I'd have to devote cognitive resources to just to keep it from making me do dumb things. This is partly because it makes the state of uncertainty come to feel like a default, which makes it simpler to slide back to uncertainty again if that seems warranted. And partly it's because the longer I wait, the more evidence I've collected by the time I pick a side, so it's less likely that the belief I acquire is one I'll want to excise in the future.
This is all well and good as a prophylactic. It doesn't help as much with stuff that snuck in when I was but a mere slip of a youth. For that, I rely on the official/unofficial distinction, and then toe the official line as best I can in thought, word, and deed. I break in uncomfy official beliefs like new shoes. You can use your brain's love of routine to your advantage. Act like you only believe the official beliefs, and the unofficial ones will weaken from disuse. This isn't a betrayal of your "real" beliefs. The official beliefs are real too! They're real, and they're better.
[1] I read this in Peter van Inwagen's book "An Essay on Free Will", but I seem to remember that he got it from elsewhere. I'm not certain where my copy has gotten to lately, so I can't check.
I've already read Gendler on the subject, and it occurred to me to make the comparison, but her "alief" is different from my unofficial belief. Specifically, an alief is the gut reaction itself, while an unofficial belief would be something derived from such an alief (although each term has broader applications than that).
Hmm, now I'm confused. What's the difference between an alief causing you to react in ways that on reflection you reject as unjustified, versus an alief giving rise to an "unofficial belief" that has the same effects? What distinctive work is this second kind of mental state doing?