Less Wrong is a community blog devoted to refining the art of human rationality.

Belief Chains

9 ShannonFriedman 15 November 2014 11:09AM

A belief is an acceptance that a statement is true or that something exists.   As aspiring rationalists, we strive for our beliefs to be true, accurate, and minimally biased.     

You seldom see a single belief floating around.  Typically beliefs tend to group into clusters and chains.  In other words, if I believe that I am turning my thoughts into written words right now, that is not an isolated belief.  My belief chain might look something like this:

I have sight ->  The image coming into my eyes is of something that is metallic with bright lights and little boxes -> It is similar to things that have been called “computers” before -> I am wiggling my fingers to make patterns ->  this is called typing -> I am typing on a computer -> the words I am thinking are being translated into writing.    

Why does it matter whether I see my beliefs as chains or whether I simply look at the highest level belief such as “the words I am thinking are being translated into written word”?

It matters because at each link in the chain of belief, there is potential for falsehood to be introduced.  The further I am away from the source of my high-level belief, the less likely my high-level belief is to be accurate.   
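The compounding of error along a chain can be put in rough numbers. As a toy illustration (the per-link reliabilities below are made up for the sketch, not taken from anything in the post): if each link independently holds with probability p, a chain of n links holds with probability p to the power n, so even quite reliable links multiply out to a shaky conclusion.

```python
def chain_reliability(link_probs):
    """Probability that every link in a belief chain holds,
    assuming the links can fail independently of each other."""
    result = 1.0
    for p in link_probs:
        result *= p
    return result

# Seven links, each individually quite reliable (95%) --
# roughly the length of the sight-to-typing chain above.
links = [0.95] * 7
print(round(chain_reliability(links), 3))  # about 0.698
```

Seven 95%-reliable links leave only about a 70% chance that the high-level belief at the end of the chain is sound, which is the sense in which distance from the source erodes accuracy.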

Say, for example, that a three-year-old is typing on a toy computer that does not have the standard typing functionality of my computer.  They could still have the same logic chain that I used:

I have sight ->  The image coming into my eyes is of something that is metallic with bright lights and little boxes -> It is similar to things that have been called “computers” before -> I am wiggling my fingers to make patterns ->  this is called typing -> I am typing on a computer -> the words I am thinking are being translated into writing.

Belief chains can be corrupted in many ways.  Here are a few:

1.   Our intuitions tell us that the more interconnecting beliefs we have, and the more agreement between different beliefs, the more likely they are to be true, right?  We can check them against each other and use them as confirming evidence for one another.

These interconnections can come from the beliefs we have accumulated in our own minds, and also from trust relationships with other people.  We use interconnecting beliefs from other people just as we use interconnecting beliefs in our own minds.  While not good or bad in and of itself, the downside of this system of validation is that it leaves us vulnerable to the various types of groupthink.

This is easiest to talk about with a diagram.  In these diagrams, we are assuming that truth (yellow T circles) comes from a source at the bottom of the diagram. Beliefs not originating from truth are labeled with a (B).   As aspiring rationalists, truth is what we want.   

What is truth?

Truth is a description reflecting the underlying fundamental structure of reality. Reality does not change regardless of the perspective you view it from. As an example, "I think therefore I am" is something most people agree is obviously a truth. Most people agree that the laws of physics, in some version, are truths.

What is a source of truth?

A source of truth is the bottom level of stuff that composes whatever you're talking about.  If you're programming, the data you're manipulating breaks down into binary 0s and 1s.  But in order to let you handle it faster and more intuitively, it's assembled into layers upon layers of abstracted superstructures, until you're typing nearly English-like code into a preexisting program, or drawing a digital picture with a tablet pen in a very analog-feeling way.  Working directly with the source all the time isn't a good idea - in fact, it's usually unfeasible - and most problems with a higher-level abstraction shouldn't be patched by going all the way down.  But if you utterly disconnect from the fact that computers are in binary under their GUIs, or that no compass and paper can create a genuinely equation-perfect circle, or that physics isn't genuinely Newtonian under the hood - you'll have nowhere to backtrack to if it turns out there was a wrong turn in your reasoning.  You won't be able to sanity-check if you tell yourself a long twisty story about human motivations and "shoulds" and then come up with an action to take on that basis.
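The layering described above can be seen in miniature in a few lines of code: the same piece of data viewed as high-level text, as bytes, and as the underlying binary digits. This is just an illustration of the abstraction stack, not anything specific from the post.

```python
# One piece of data at three levels of abstraction:
# human-readable text, raw bytes, and the binary underneath.
text = "Hi"
raw = text.encode("utf-8")                   # the bytes layer: b'Hi'
bits = " ".join(f"{b:08b}" for b in raw)     # the binary layer

print(raw)   # b'Hi'
print(bits)  # 01001000 01101001
```

Nobody drafts an essay by toggling those bits directly, but knowing they are down there is what gives you somewhere to backtrack to when a higher layer misbehaves.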

Below is a diagram of a healthy chain of pure true belief originating from a source of truth.  

2.   Belief chains can get disconnected from the source of truth.  For example, say that there is a group which has based its philosophy on the understanding of a certain physicist.  Say that the physicist dies, and that the group continues expanding on that same belief set without having integrated one of the key links that connected the physicist's chain to a source of truth.  In this case, you can end up with a cluster of belief that looks something like this:

You now have a cluster of belief that contains some truth, but is no longer linked to a source of truth, and that fills in the gaps with ungrounded propositions.  This is the sort of situation that leads to high levels of overconfidence, and is what Alexander Pope referred to when he wrote: “A little learning is a dangerous thing.”

What does this metaphor look like in real world terms?


Circular belief updating

6 irrational 11 December 2013 06:26AM

This article is going to be in the form of a story, since I want to lay out all the premises in a clear way. There's a related question about religious belief.


Let's suppose that there's a country called Faerie. I have a book about this country which describes all people living there as rational individuals (in a traditional sense). Furthermore, it states that some people in Faerie believe that there may be some individuals there known as sorcerers. No one has ever seen one, but they may or may not interfere in people's lives in subtle ways. Sorcerers are believed to be such that there can't be more than one of them around and they can't act outside of Faerie. There are 4 common belief systems present in Faerie:

  1. Some people believe there's a sorcerer called Bright who (among other things) likes people to believe in him and may be manipulating people or events to do so. He is not believed to be universally successful.
  2. Or, there may be a sorcerer named Invisible, who interferes with people only in such ways as to provide no information about whether he exists or not.
  3. Or, there may be an (obviously evil) sorcerer named Dark, who would prefer that people don't believe he exists, and interferes with events or people for this purpose, likewise not universally successfully.
  4. Or, there may be no sorcerers at all, or perhaps some other sorcerers that no one knows about, or perhaps some other state of things holds, such as there being multiple sorcerers, or sorcerers who don't obey the above rules. However, everyone who lives in Faerie and is in this category simply believes there's no such thing as a sorcerer.

This is completely exhaustive, because everyone believes there can be at most one sorcerer. Of course, some individuals within each group have different ideas about what their sorcerer is like, but within each group they all absolutely agree with their dogma as stated above.

Since I don't believe in sorcery, a priori I assign a very high probability to case 4, and very low (and equal) probabilities to the other three.

I can't visit Faerie, but I am permitted to do a scientific phone poll. I call some random person, named Bob. It turns out he believes in Bright. Since P(Bob believes in Bright | case 1 is true) is higher than the unconditional probability, I believe I should adjust the probability of case 1 up, by Bayes' rule. Does everyone agree? Likewise, the probability of case 3 should go up, since disbelief in Dark is evidence for the existence of Dark in exactly the same way, although perhaps to a smaller degree. Cases 2 and 4 have to lose some probability, since the probabilities must add up to 1. If I further call a second person, Daisy, who turns out to believe in Dark, I should adjust all the probabilities in the opposite direction. I am not asking either of them about the actual evidence they have, just what they believe.
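The update after Bob's call can be sketched numerically. The priors and, especially, the likelihoods below are invented for illustration; the story gives no actual numbers, and only the direction of each shift matters.

```python
# Prior probabilities for the four cases (assumed, not from the story).
priors = {1: 0.02, 2: 0.02, 3: 0.02, 4: 0.94}

# Assumed P(a random resident believes in Bright | case).
# Case 1 (Bright manipulates people into belief) makes it most likely;
# case 3 (Dark promotes disbelief in Dark, which weakly favors belief
# in something else) slightly; cases 2 and 4 give a baseline rate.
likelihood = {1: 0.50, 2: 0.25, 3: 0.30, 4: 0.25}

def update(priors, likelihood):
    """One step of Bayes' rule: multiply prior by likelihood, renormalize."""
    unnorm = {c: priors[c] * likelihood[c] for c in priors}
    z = sum(unnorm.values())
    return {c: v / z for c, v in unnorm.items()}

posterior = update(priors, likelihood)
# Case 1 rises the most, case 3 rises slightly, cases 2 and 4 fall.
```

With these made-up numbers the mechanics come out as described: hearing that Bob believes in Bright raises case 1, raises case 3 "to a smaller degree," and lowers cases 2 and 4.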

I think this is straightforward so far. Here's the confusing part. It turns out that both Bob and Daisy are themselves aware of this argument. So, Bob says, one of the reasons he believes in Bright, is because that's positive evidence for Bright's existence. And Daisy believes in Dark despite that being evidence against his existence (presumably because there's some other evidence that's overwhelming).

Here are my questions:

  1. Is it sane for Bob and Daisy to be in such a positive or negative feedback loop? How is this resolved?
  2. If Bob and Daisy took the evidence provided by their belief into account already, how does this affect my own evidence updating? Should I take it into account regardless, or not at all, or to a smaller degree?

I am looking forward to your thoughts.

The problem with too many rational memes

80 Swimmer963 19 January 2012 12:56AM

Like so many of my posts, this one starts with a personal anecdote. 

A few weeks ago, my boyfriend was invited to a community event through Meetup.com. The purpose of the meetup was to watch the movie The Elegant Universe and follow up with a discussion. As it turns out, this particular meetup was run by a man who I’ll call ‘Charlie’, the leader of some local Ottawa group designed to help new immigrants to Canada find a social support net. Which, in my mind, is an excellent goal. 

Charlie turned out to be a pretty neat guy, too: charismatic, funny, friendly, encouraging everyone to share his or her opinion. Criticizing or shutting out other people’s views was explicitly forbidden. It was a diverse group, as he obviously wanted it to be, and by the end everyone seemed to feel pretty comfortable. 

My boyfriend, an extremely social being whose main goal in life is networking, was raving by the end about what a neat idea it was to start this kind of group, and how Charlie was a really cool guy. I was the one who should have had fun, since I’m about 100 times more interested in physics than he is, but I was fuming silently. 

Why? Because, at various points in the evening, Charlie talked about his own interest in the paranormal and the spiritual, and the books he’d written about it. When we were discussing string theory and its extra dimensions, he made a comment, the gist of which was ‘if people’s souls go to other dimensions when they die, Grandma could be communicating with you right now from another dimension by tapping spoons.’ 

Final straw. I bit my tongue and didn’t say anything and tried not to show how irritated I was. Which is strange, because I’ve always been fairly tolerant, fairly agreeable, and very eager to please others. Which is why, when my brain responded ‘because he’s WRONG and I can’t call him out on it because of the no criticism rule!’ to the query of ‘why are you pissed off?’, I was a bit suspicious of that answer. 

I do think that Charlie is wrong. I would have thought he was wrong a long time ago. But it wouldn’t have bothered me; I know that because I managed to attend various churches for years, even though I thought a lot of their beliefs were wrong, because it didn’t matter. They had certain goals in common with me, like wanting to make the world a better place, and there were certain things I could get out of being a community member, like incredibly peaceful experiences of bliss that would reset my always-high stress levels to zero and allow me to survive the rest of the week. Some of the sub-goals they had planned to make the world a better place, like converting people in Third World countries to Christianity, were ones that I thought were sub-optimal or even damaging. But overall, there were more goals we had in common than goals we didn’t have in common, and I could, I judged, accomplish those goals we had in common more effectively with them than on my own. And anyway, the church would still be there whether or not I went; if I did go, at least I could talk about stuff like physics with awe and joy (no faking required, thinking about physics does make me feel awe and joy), and increase some of the congregation’s scientific literacy a little bit. 

Then I stopped going to church, and I started spending more time on Less Wrong, and if I were to try to go back, I’m worried it would be exactly the same as the community meetup. I would sit there fuming because they were wrong and it was socially unacceptable for me to tell them that. 

I’m worried because I don’t think those feelings are the result of a clearheaded, logical value calculation. Yeah, churches and people who believe in the paranormal waste a lot of money and energy, which could be spent on really useful things otherwise. Yes, that could be a valid reason to reject them, to refuse to be their allies even if some of your goals are the same. But it’s not my true rejection. My true rejection is that them being wrong is too annoying for me to want to cooperate. Why? I haven’t changed my mind, really, about how much damage versus good I think churches do for the world. 

I’m worried that the same process which normalized religion for me is now operating in the opposite direction. I’m worried that a lot of Less Wrong memes, ideas that show membership to the ‘rationalist’ or ‘skeptic’ cultures, such as atheism itself, or the idea that religion is bad for humanity...I’m worried that they’re sneaking into my head and becoming virulent, that I'm becoming an undiscriminating skeptic. Not because I’ve been presented with way more evidence for them, and updated on my beliefs (although I have updated on some beliefs based on things I read here), but because that agreeable, eager-to-please subset of my brain sees the Less Wrong community and wants to fit in. There’s a part of me that evaluates what I read, or hear people say, or find myself thinking, and imagines Eliezer’s response to it. And if that response is negative...ooh, mine had better be negative too.

And that’s not strategic, optimal, or rational. In fact, it’s preventing me from doing something that might otherwise be a goal for me: joining and volunteering and becoming active in a group that does good things for the Ottawa community. And this transformation has managed to happen without me even noticing, which is a bit scary. I’ve always thought of myself as someone who was aware of my own thoughts, but apparently not. 

Anyone else have the same experience? 

Entangled with Reality: The Shoelace Example

18 lukeprog 25 June 2011 04:50AM

Less Wrong veterans be warned: this is an exercise in going back to the basics of rationality.

Yudkowsky once wrote:

What is evidence?  It is an event entangled, by links of cause and effect, with whatever you want to know about.  If the target of your inquiry is your shoelaces, for example, then the light entering your pupils is evidence entangled with your shoelaces.  This should not be confused with the technical sense of "entanglement" used in physics - here I'm just talking about "entanglement" in the sense of two things that end up in correlated states because of the links of cause and effect between them.


Here is the secret of deliberate rationality - this whole entanglement process is not magic, and you can understand it.  You can understand how you see your shoelaces.  You can think about which sort of thinking processes will create beliefs which mirror reality, and which thinking processes will not.

Much of the heuristics and biases literature is helpful, here. It tells us which sorts of thinking processes tend to create beliefs that mirror reality, and which ones don't.

Still, not everyone understands just how much we know about exactly how the brain becomes entangled with reality by chains of cause and effect. Because "Be specific" is an important rationalist skill, and because concrete physical knowledge is important for technical understanding (as opposed to merely verbal understanding), I would like to summarize[1] some of how your beliefs become entangled with reality when a photon bounces off your shoelaces into your eye.


Foma: Beliefs that Cause Themselves to be True

21 atucker 20 June 2011 05:13AM

tl;dr: Sometimes it seems like in order to accomplish something, you need to hold a particular belief. However, the effect of your beliefs on what you accomplish can be screened off from what you actually do.

Also, thank you to Benquo for reading over a rough draft of this and providing very helpful comments.

Foma: Beliefs that Cause Themselves to be True

Live by the foma [harmless untruths] that make you brave and kind and healthy and happy ~ Cat's Cradle

When I was younger, I had formed an idea that there were some beliefs that, when believed, caused themselves to be true. I even had a name picked out for them – foma. These are just a few examples of how I came to think that.

“This is awkward” very often makes things awkward.

Consider walking through a room with a group of people that you don't know very well all talking and laughing. One or two look at you, and you just sort of stare back. “Well,” you think, “this is awkward”.

You stare blankly before letting out an uneasy laugh, and you go on your way.  You can feel people watching you walk out the door.

If you just walk through the room without thinking about it at all, it's not even emotionally salient enough for you to wonder how it feels.

When I got over my fear of public speaking, it was basically because of a fluke. I decided to do a presentation on the mistakes of Odysseus' crew in character as Odysseus. People then assumed that my shaky arms, legs, and voice were the result of me doing a good portrayal of a shaken Odysseus, rather than my being nervous.

After that, I thought public speaking wasn't so hard as long as I felt comfortable doing it. By taking a few steps to mitigate my physical signs of nervousness (like walking around, or standing behind a podium), I quickly became pretty comfortable doing it.

“I'm not a good public speaker” worsened my public speaking skills, and “I can do this” strengthened them. Areas like self-confidence seem like they could be foma-like.

However on closer reflection, that model is incomplete.
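The "screening off" mentioned in the tl;dr can be sketched as a toy model (the numbers and action names are made up for the sketch): the belief influences the outcome only through the action it leads to, so once the action is fixed, the belief carries no extra information about the outcome.

```python
# P(success | action), regardless of what the speaker believes.
p_success_given_action = {"rehearse": 0.8, "wing_it": 0.4}

def p_success(belief, action):
    """Outcome depends only on the action actually taken;
    the belief argument is deliberately ignored, which is
    exactly what 'screened off' means here."""
    return p_success_given_action[action]

# Two speakers with opposite beliefs who take the same action
# have the same chance of success.
confident = p_success("I can do this", "rehearse")
doubtful = p_success("I'm a bad speaker", "rehearse")
```

Of course, in the anecdotes above the belief did change the action taken (shaky limbs, avoidance), which is why it still mattered in practice; screening off only applies once the action is held fixed.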


The peril of ignoring emotions

15 Swimmer963 03 April 2011 05:15PM

Related to: Luminosity Sequence, Unknown Knowns

Let me introduce you to a hypothetical high school student, Sally. She’s smart and pretty and outgoing, and so are her friends. She considers herself a modern woman, sexually liberated, and this is in line with the lifestyle her friends practice. They think sex is normal and healthy and fun. Sally isn’t just pretending in order to fit in; these really are her friends, this really is her milieu, and according to health class, sex between consenting adults is nothing to be ashamed of. Sally isn't a rigorous rationalist, although she likes to think of herself as rational, and she's no more self-aware than the average high school girl. 

Now Sally meets a boy, Bob, and she thinks he’s cute, and he thinks she’s cute too. Bob is part of her crowd. Her friends like him; he respects women and treats Sally well and, like any healthy teenage boy, is fairly horny. According to her belief system, that shouldn’t set off any alarm bells. She’s been warned about abusive relationships, but Bob is a nice guy. So when they go upstairs together at her friend’s party, she has every reason to be excited and a little nervous, but not uncomfortable. The idea that Mom wouldn’t approve is so obviously irrelevant that she ignores it completely.

...And afterwards, she feels guilty and violated and horrible about herself, even though it was her decision.

I used this example because I expect it’s not unusual. On the surface, Sally’s discomfort seems to come out of nowhere, but modern North American society is chock-full of contradictory beliefs about sex. Sex is normal and healthy. Sex is dirty. Sex is only for when you’re married. If Sally’s mother is Christian, or even just conservative, Sally would have internalized those beliefs when she was a child. It would have been hard not to. They’re her unknown knowns, and she may not have noticed them before, because there’s a wide psychological gap between believing it’s okay for others to behave a particular way, and believing it’s okay for you. The meme ‘don’t pass judgement on other people’ is, I think, pretty widespread in North America and maybe more so in Canada, but so is holding oneself to a high standard...and those are contradictory.

I think that the nagging, seemingly irrational moment of ‘that doesn’t feel right’ is important. It potentially reveals something about the beliefs and attitudes you hold that you don’t even know about. Sally’s response to her nagging doubt could have been the following:

Hmm, that’s interesting, why does it bother me so much that Mom would disapprove? I guess when we used to go to church, they said sex was only for when you’re married. But I don’t believe anything else they said in church. ...Well, I guess I want Mom to be proud of me. I want her to praise me for doing well in school. And I think lying is wrong, so the fact that I either have to lie to her about having had sex, or face her disapproval, maybe that’s why I’m uncomfortable? But I don’t want to say no, it’ll make me look like a prude... Still, what if everyone feels this way at the start? I know Alice went to church too when she was a kid, and her mom would kill her if she knew she was sexually active, I wonder if that bothers Alice? Hmm, I think maybe it’s still the right choice to sleep with Bob, but maybe I’m taking this too lightly? Maybe this should be a big deal and I should feel anxious? After all, he might judge me anyway, he might think I’m too easy, or a slut. Maybe I can just explain to him that I want to think about this longer... After all, why should I assume something is right just because they told us in health class? That’s just like in church, it’s taking someone else’s opinion on faith. I’ve never actually thought about this, I’ve just followed other people. Who’s to say they’re right?

Whatever decision Sally makes, she probably won’t feel violated. She listened to her feelings and took them into consideration, even though they seemed irrational. As it turned out, they were a reasonable consequence of a belief-fragment that she hadn’t even known she had. So as a consequence of stopping to think, she knows herself better too. She’ll be better able to predict her behaviour in future situations. She’ll be less likely to ignore her threshold-warning discomfort and make risky choices as a result of peer pressure alone. She’ll be more likely to think.

To conclude: emotions exist. They are real. If you ignore them and plow on ahead, you won’t necessarily thank yourself afterwards. And that nagging feeling is a priceless moment to find out about your unknown knowns...which may not be rational, which may have been laid down in some previous era and never questioned since, but which part of you is going to try to uphold until you consciously deconstruct them. 
