tl;dr: Just because it doesn't seem like we should be able to have beliefs we acknowledge to be irrational, doesn't mean we don't have them.  If this happens to you, here's a tool to help conceptualize and work around that phenomenon.

There's a general feeling that by the time you've acknowledged that some belief you hold is not based on rational evidence, it has already evaporated.  The very act of realizing it's not something you should believe makes it go away.  If that's your experience, I applaud your well-organized mind!  It's serving you well.  This is exactly as it should be.

If only we were all so lucky.

Brains are sticky things.  They will hang onto comfortable beliefs that don't make sense anymore, view the world through familiar filters that should have been discarded long ago, see significances and patterns and illusions even if they're known by the rest of the brain to be irrelevant.  Beliefs should be formed on the basis of sound evidence.  But that's not the only mechanism we have in our skulls to form them.  We're equipped to come by them in other ways, too.  It's been observed[1] that believing contradictions is only bad because it entails believing falsehoods.  If you can't get rid of one belief in a contradiction, and that's the false one, then believing a contradiction is the best you can do, because then at least you have the true belief too.

The mechanism I use to deal with this is to label my beliefs "official" and "unofficial".  My official beliefs have a second-order stamp of approval.  I believe them, and I believe that I should believe them.  Meanwhile, the "unofficial" beliefs are those I can't get rid of, or am not motivated to try really hard to get rid of because they aren't problematic enough to be worth the trouble.  They might or might not outright contradict an official belief, but regardless, I try not to act on them.

To those of you with well-ordered minds (for such lucky people seem to exist, if we believe some of the self-reports on this very site), this probably sounds outrageous.  If I know they're probably not true... And I do.  But they still make me expect things.  They make me surprised when those expectations are flouted.  If I'm asked about their subjects when tired, or not prepared for the question, they'll leap out of my mouth before I can stop them, and they won't feel like lies - because they're not.  They're beliefs.  I just don't like them very much.

I'll supply an example.  I have a rather dreadful phobia of guns, and accordingly, I think they should be illegal.  The phobia is a terrible reason to believe in the appropriateness of such a ban: said phobia doesn't even stand in for an informative real experience, since I haven't lost a family member to a stray bullet or anything of the kind.  I certainly don't assent to the general proposition "anything that scares me should be illegal".  I have no other reasons, except for a vague affection for a cluster of political opinions which includes something along those lines, to believe this belief.  Neither the fear nor the affection are reasons I endorse for believing things in general, or this in particular.  So this is an unofficial belief.  Whenever I can, I avoid acting on it.  Until I locate some good reasons to believe something about the topic, I officially have no opinion.  I avoid putting myself in situations where I might act on the unofficial belief in the same way I might avoid a store with contents for which I have an unendorsed desire, like a candy shop.  For instance, when I read about political candidates' stances on issues, I avoid whatever section talks about gun control.

Because I know my brain collects junk like this, I try to avoid making up my mind until I do have a pretty good idea of what's going on.  Once I tell myself, "Okay, I've decided", I run the risk of lodging something permanently in my cortex that won't release its stranglehold on my thought process until kingdom come.  I use tools like "temporarily operating under the assumption that" (some proposition) or declaring myself "unqualified to have an opinion about" (some subject).  The longer I hold my opinions in a state of uncertainty, the less chance I wind up with a permanent epistemic parasite that I have to devote cognitive resources to just to keep it from making me do dumb things.  This is partly because it makes the state of uncertainty come to feel like a default, which makes it simpler to slide back to uncertainty again if it seems warranted.  Partly, it's because the longer I wait, the more evidence I've collected by the time I pick a side, so it's less likely that the belief I acquire is one I'll want to excise in the future.

This is all well and good as a prophylactic.  It doesn't help as much with stuff that snuck in when I was but a mere slip of a youth.  For that, I rely on the official/unofficial distinction, and then toe the official line as best I can in thought, word, and deed.  I break in uncomfy official beliefs like new shoes.  You can use your brain's love of routine to your advantage.  Act like you only believe the official beliefs, and the unofficial ones will weaken from disuse.  This isn't a betrayal of your "real" beliefs.  The official beliefs are real too!  They're real, and they're better.

 

[1] I read this in Peter van Inwagen's book "An Essay on Free Will" but seem to remember that he got it elsewhere.  I'm not certain where my copy has gotten to lately, so I can't check.


Parent voted back up to zero, because people shouldn't actually lose karma for comments like that: some worded praise is human and good, and it won't contribute significantly to noise, since insightful and informative comments will get voted higher and will thus appear before it.

Compliments like that are author-fuel.

Now I'm regretting not posting my compliments to Alicorn immediately after reading the post. Incidentally, your recent fanfic about the girl whose friends have good reason to fear she will destroy the world was fantastic: the ending was a perfectly constructed Crowning Moment of Awesome, and you stopped it in exactly the right place (at least for someone who isn't familiar with or attached to the characters, like me).

Was that a spoiler? If so, please ROT13.

That's actually a tricky question -- it's maybe a minor spoiler if you're not familiar with the source material (as I wasn't), but those familiar with the source material know that it's one of three perspectives on the subject. Anyway, instead of ROT13, I've just obscured it a bit such that nothing at all should be spoiled by my current phrasing.

Title/link?

I lost the link, but I found it again: Trust in God, or, The Riddle of Kyon.

Good article! I like the "official/unofficial" terminology.

I suspect a lot of those "unofficial beliefs" are about ourselves: how people judge us, what we can do, what is okay, what is offensive or insulting, etc. At least, most instances I can think of were of that kind. "I *know* so-and-so didn't deliberately act that way to annoy me, but I'm still as angry as if he did"; "I *know* that girl isn't going to laugh at me, but I'm as nervous as if she were"; etc.

This ties in with akrasia ("I know writing my thesis is more important than watching falling goats on YouTube, but my behaviour shows otherwise"), and with self-delusion: if I'm trying to make myself believe I have high status, isn't that like trying to make it an official belief even though part of me knows it's not true?

Anecdotal: I used to be afraid of guns (and also subscribe to the political spectrum that was afraid of guns), but frequent exposure to guns has changed this. I don't think having a healthy respect for something so dangerous is bad at all, but my brain used to just shut down at the word "gun" and I couldn't be at all neutral about it. Now I think they're kind of cool. If you are working at all on desensitizing your phobia, that might be an interesting post (although I realize this series is officially complete).

Alicorn -- you should check out Gendler's distinction between 'alief' and 'belief'.

I've already read Gendler on the subject, and it occurred to me to make the comparison, but her alief is different from my unofficial belief. Specifically, alief is the gut reaction itself and an unofficial belief would be something that is derived from such an alief (although each term has broader applications than that).

Hmm, now I'm confused. What's the difference between an alief causing you to react in ways that on reflection you reject as unjustified, versus an alief giving rise to an "unofficial belief" that has the same effects? What distinctive work is this second kind of mental state doing?

I'll try to find time to re-read her article and clarify for you further.

Without getting to the bottom of it, the distinction seems to be that the beliefs you're describing can be 'candidates' for official belief (given enough evidence), while Gendler's 'aliefs' are basically emotional reactions that might lead to a belief, but whose content is not intellectual by itself. It's a thin line, though.

I like this post, but I don't like the title. I don't see what it has to do with the content, and it seems to assert high status.

Even ignoring Alicorn's actual explanation, given that she is the third-highest karma contributor, it's fair to say that she does have high status here.

It's my impression that, regardless of whether or not you actually have status, acting like you do is probably undesirable, as it gets you thinking in the wrong patterns.

It means that I endorse the contents of the post, which is about endorsed beliefs. It wasn't meant to be status-asserting.

I was also confused by the title. After this explanation I "get it," but it wasn't obvious to me at all. Not really a criticism, just a comment.

I also get what the intent of the title was, but I can't help but feel like it cheapens an otherwise excellent post (and it might make some people skip over it altogether).

I found it odd, but not so much as to discourage me from reading the post; and I found it amusing and clever after reading it.

Maybe put it in the subtitle?

I'm not sure what you mean by "cheapening" the post - explain? I'm concerned by the possibility that the title might encourage skipping. Any suggestions on what, if anything, would be a better substitute?

In general, if I saw that title on Reddit or Hacker News or in my feed reader, I would never click on it. It's kind of generic and doesn't really tell you much about the content, and that phrase is highly associated with political TV ads (which are uniformly bad), a Very Bad Thing IMHO. I was actually going to skip it, but saw your username and the number of votes it had.

As a LW post specifically, I feel like almost all the "classic" OB/LW posts had memorable titles that make it easy to remember what the post is about and serve as a kind of hook/shortcut for that particular concept (e.g. Probability is in the Mind, The Map Is Not the Territory, Avoiding Your Belief's Real Weak Points, Cached Thoughts, The Beauty of Settled Science, An Alien God, Applause Lights -- I'm sure that when you read these titles, if you've read the posts, you immediately remember the central concept of each).

If I think about the title of your post, I have to make an effort to remember what it's about, and I predict that I'll have a harder time remembering it over time because of the title.

But maybe that's just me, and it doesn't bother others.

Apparently you have some similarly minded peers; I'll change the title.

Personally, I was a fan of the previous title. The perils of not speaking out, alas.

I didn't mind the old one, but I do like the "sticky brains" label that we can use for this concept in the future.

As a LW post specifically, I feel like almost all the "classic" OB/LW posts had memorable titles that make it easy to remember what the post is about and serve as a kind of hook/shortcut for that particular concept

I think Alicorn was making a reference to how ads for political candidates in the US now have to include the candidate saying "I'm [so-and-so], and I approve this message." That, or the meme has just caught on. Either way, it may help to know the significance of that phrasing.

(Or it's just a coincidental phrasing, but who cares about that possibility...)

(Previous title was IIRC "I'm Alicorn, and I approve this message.")

I understand the joke, but the title nonetheless reminded me of the statements that political candidates make at the end of their commercials.

Of course. It's supposed to. I repurposed the wording because it amused me to do so. I'm sorry if you don't like it.

If I know they're probably not true... And I do.  But they still make me expect things.
[...]
I'll supply an example.  I have a rather dreadful phobia of guns, and accordingly, I think they should be illegal.

I would say that this isn't quite an example, as thinking that guns should be illegal isn't an expectation. If what you say about your mind in this post is still true today (10 years later), would you give an example of a literal anticipation that you have that you would still describe as "something you know is false"?

I'm not sure that your "unofficial beliefs" are different from "things you don't believe but haven't realized it yet." For example:

Suppose you were born to a religious family, one that went to the First Church of Gun Control, and your parents homeschooled you due to (probably ungrounded) fears that the NRA would corrupt your mind at a public school. Your parents are so devout that they don't own a bow and arrow, a crossbow, or a slingshot (rocks are okay, though, if thrown underhand). One day you go home and explain the above to your parents.

They will respond, after an uncomfortable silence, "...so you're saying you don't believe in gun control any more?"

You: "No, I believe it, just not officially."

Them: ...

(apologies for involving your parents in this)

I think, if it was a religious belief, considering something inappropriate to believe would be just about as frowned upon as actually not believing it. Religious beliefs tend to come with little tags on them that say "and it is good and right to believe this", or they wouldn't be so virulent and so hard to attack. It does seem like some people don't have the experience I describe, and as I point out, that's a good thing: that doesn't mean it doesn't happen to others. It's probably like mental imagery, which some people have and some don't.

Reading this post made me realize that I believe this post. It's persuasive in a good way, so I expect to officially believe it as it sinks in and I think about it more. Meanwhile I unofficially believe it. I was hoping that that would be somewhat paradoxical when I started this comment, but unfortunately I think it's not.

Now, how do we balance this with fighting indecision?

Indeed. I have this particular problem, which manifests itself when I go shopping. I don't want to succumb to impulse buys, so I constantly question how much I really need any particular item. This results in me sometimes being unable to settle on anything less "justified" than bread and toothpaste.

You need to adjust your concept of "justified", and consider that fun and comfort are actual values, aspects of the true meaning of life.

According to some, skipping the bread might alleviate your need for toothpaste.

Though that's more obviously true if all you were eating is bread with toothpaste on it.

You might link to the Moore's Paradox post, which discusses a very similar distinction, but from a distance.