A belief is an acceptance that a statement is true or that something exists.   As aspiring rationalists, we strive for our beliefs to be true, accurate, and minimally biased.     

You seldom see a single belief floating around.  Beliefs typically group into clusters and chains.  For example, if I believe that I am turning my thoughts into written words right now, that is not an isolated belief.  My belief chain might look something like this:

I have sight -> the image coming into my eyes is of something metallic with bright lights and little boxes -> it is similar to things that have been called “computers” before -> I am wiggling my fingers to make patterns -> this is called typing -> I am typing on a computer -> the words I am thinking are being translated into writing.

Why does it matter whether I see my beliefs as chains, rather than simply looking at the highest-level belief, such as “the words I am thinking are being translated into writing”?

It matters because at each link in the chain of belief, there is potential for falsehood to be introduced.  The further a high-level belief sits from its source, the less likely it is to be accurate.

Say, for example, that a three-year-old is typing on a toy computer that lacks the typing functionality of my computer.  They could still have the same logic chain that I used:

I have sight -> the image coming into my eyes is of something metallic with bright lights and little boxes -> it is similar to things that have been called “computers” before -> I am wiggling my fingers to make patterns -> this is called typing -> I am typing on a computer -> the words I am thinking are being translated into writing.

Yet for the three-year-old, the final belief is false: no words are actually being written.  The same chain of reasoning, with one link slightly off from reality, produces a confidently held falsehood.

Belief chains can be corrupted in many ways.  Here are a few:

1.   Our intuitions tell us that the more interconnecting beliefs we have, and the more agreement there is between them, the more likely they are to be true, right?  We can check them against each other and use them as confirming evidence for one another.

These interconnections can come from the beliefs we have accumulated in our own minds, and also from trust relationships with other people.  We use interconnecting beliefs from other people just as we use interconnecting beliefs in our own minds.  While this system of validation is not good or bad in and of itself, its downside is that it leaves us vulnerable to the various types of groupthink.

This is easiest to talk about with a diagram.  In these diagrams, we are assuming that truth (yellow T circles) comes from a source at the bottom of the diagram.  Beliefs not originating from truth are labeled with a (B).  As aspiring rationalists, we want truth.

What is truth?

Truth is a description reflecting the underlying fundamental structure of reality.  Reality does not change regardless of the perspective from which you view it.  As an example, most people agree that “I think, therefore I am” is obviously a truth.  Most people agree that the laws of physics, in some version, are truths.

What is a source of truth?

A source of truth is the bottom level of stuff that composes whatever you're talking about.  If you're programming, the data you're manipulating breaks down into binary 0s and 1s.  But in order to let you handle it faster and more intuitively, it's assembled into layers upon layers of abstracted superstructures, until you're typing nearly English-like code into a preexisting program, or drawing a digital picture with a tablet pen in a very analog-feeling way.  Working directly with the source all the time isn't a good idea - in fact, it's usually infeasible - and most problems with a higher-level abstraction shouldn't be patched by going all the way down.  But if you utterly disconnect from the fact that computers run on binary under their GUIs, or that no compass and paper can create a genuinely equation-perfect circle, or that physics isn't genuinely Newtonian under the hood - you'll have nowhere to backtrack to if it turns out there was a wrong turn in your reasoning.  You won't be able to sanity-check if you tell yourself a long, twisty story about human motivations and "shoulds" and then come up with an action to take on that basis.
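
As a concrete illustration of those abstraction layers, here is a minimal Python sketch (the variable names and the sample text are mine, invented for the example) that takes one piece of data and peels it down from near-English text to the raw 0s and 1s underneath:

```python
# A toy sketch of "layers over binary": the same data viewed at three
# levels of abstraction. All names here are invented for illustration.

message = "circle"  # top layer: near-English text

as_bytes = message.encode("utf-8")  # middle layer: raw bytes
as_bits = " ".join(f"{b:08b}" for b in as_bytes)  # bottom layer: 0s and 1s

print(message)   # circle
print(as_bytes)  # b'circle'
print(as_bits)   # 01100011 01101001 01110010 01100011 01101100 01100101
```

Nothing about the higher layers is false, but if the bottom layer were ever in doubt, this is the level you would backtrack to.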

Below is a diagram of a healthy chain of pure true belief originating from a source of truth.  

2.   Belief chains can get disconnected from the source of truth.  For example, say that there is a group which has based its philosophy on the understanding of a certain physicist.  Say that the physicist dies, and that the group continues expanding on that same belief set, although it has never integrated one of the key links that the physicist had which connected the chain to a source of truth.  In this case, you can end up with a cluster of belief that looks something like this:

You now have a cluster of belief that contains some truth but is no longer linked to a source of truth, and that fills in the gaps with ungrounded propositions.  This is the sort of situation that leads to high levels of overconfidence, and it is what Alexander Pope referred to when he wrote: “A little learning is a dangerous thing.”
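
The structure of that failure can be made concrete in code.  Below is a hypothetical Python sketch - the node names, the edges, and the "SOURCE" marker are all invented for illustration, not taken from the example above - that models a belief cluster as a graph and checks whether a given belief still traces back to a source of truth:

```python
from collections import deque

# Hypothetical sketch: model a belief cluster as an undirected graph and
# check whether a belief can still be traced back to a source of truth.
# The node names, edges, and the "SOURCE" marker are invented stand-ins.

edges = {
    "physicist's key insight": {"SOURCE", "core theory"},  # the grounding link
    "core theory": {"physicist's key insight", "derived claim A"},
    "derived claim A": {"core theory", "derived claim B"},
    "derived claim B": {"derived claim A"},
}

def is_grounded(belief):
    """Breadth-first search: does `belief` connect to a SOURCE node?"""
    seen, queue = {belief}, deque([belief])
    while queue:
        node = queue.popleft()
        if node == "SOURCE":
            return True
        for neighbor in edges.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return False

print(is_grounded("derived claim B"))  # True: the chain is intact

# The physicist dies and the key link is never integrated:
edges["core theory"].discard("physicist's key insight")
del edges["physicist's key insight"]

print(is_grounded("derived claim B"))  # False: the cluster floats free
```

Note that deleting the one grounding link leaves every remaining edge intact - the cluster still looks richly interconnected from the inside - yet nothing in it reaches the source anymore.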

What does this metaphor look like in real-world terms?

Let’s start by examining the beliefs of an average United States citizen with regard to an organization that is generally well known and respected: the FDA.

What source of truth should the FDA be connected to from the point of view of an individual examining belief chains? 

Here are some clues: 

The FDA’s vision statement is:

“All food is safe; all medical products are safe and effective; and the public health is advanced and protected.”

What does “safe” mean?   

What is it that does not change regardless of perspective, that “safety” is anchored to?

It could be that a human being lives to a certain age.  It could be that only a certain percentage of the population dies per year.  It could be a model of what health is - such as certain firings of neurons related to pain, as measured by fMRI.

Personally, I do not know of any such sources that the FDA is being measured by.  Thus, at least according to my map, the FDA is not anchored.  This does not mean that the FDA is not coherent in some sort of internal belief system, or that another individual does not have FDA-related beliefs that are anchored to a source of truth that I am currently unaware of.   

However, my guess is that very few people have beliefs anchored to a truth source when it comes to the FDA, though they tend to adopt FDA recommendations.  

Here are some of the beliefs I see in the FDA belief cluster that many people hold:

The FDA is an organization in charge of food and drug safety <-> The FDA does extensive research <-> The FDA is trustworthy and thorough <-> The FDA is more trustworthy than an individual <-> You should put more trust in FDA-approved products than in non-FDA-approved products <-> products approved by the FDA are safe <-> products approved by the FDA are the safest you can get <-> products approved by the FDA are more trustworthy than any non-government organization <-> you should put your trust in the FDA

Some of those beliefs I would label with a (B), and some I might label with a (T).  Note that this belief cluster does not trace all the way down to the level of math or science.   I know that some amount of science is involved.  I do not actually know the quality of the research.  I do not know what the controls used in their controlled studies are.  I do not personally know the details of how thorough their research actually is.  I do not know if they are iteratively updating their research standards as more knowledge is gained within the scientific community.  I do not know what they consider important to control for.  I do not know how many people are in their studies.  I do not know if they try to have racial, age, or gender diversity in their studies.  I do not know if their studies are location or weather specific.  I do not know how well they maintain the testing equipment that they use in their studies.  

Note that many of the things I do not know could significantly influence the outcome of studies, and thus, how trustworthy the FDA actually is.   

Clearly, many people are not satisfied with the results that this belief cluster provides.  There is a pattern that has emerged of the FDA approving a drug, and this approval being followed a few years later by a class action lawsuit due to large-scale damage from the drug.  People generally have considerable faith in FDA testing, and a high willingness to take medication that has been approved by it.  Yet there have been repeated cases of things like boys growing breasts after taking Risperdal, birth defects from Effexor, and suicidal tendencies from Paxil - just to name a few.

This example is not chosen to pick on the FDA as being unusually bad or wrong - in my experience, very few people look at the roots of any organization that they feel inclined to trust. 

Because organizations are just collections of people, this means that for organizations with large masses of belief surrounding them (e.g. what individual people believe and tell each other about the FDA), most of the mass of belief surrounding the organization is interconnected ungrounded belief - a lot of (B)s - rather than belief connected to a source of truth via (T)s.

This is largely because very few individuals are solidly grounded in sources of truth.  When a person is not grounded in a source of truth in their own personal belief systems, they tend to not have faith in their own perceptions of reality (for good reason), and instead tend to look to other individuals and organizations for security.  It is not unusual for an individual to place more faith in another individual or organization than they do in themselves.   

Collecting people into an organization does not inherently make the individuals or their collective creations more rational, although it often makes the individuals within the collection more confident in their agreed upon belief cluster, due to the additional people adding additional belief connections.  

If an ungrounded individual were lucky enough to find a grounded individual or organization to place faith in, then this strategy might be effective.  But when being grounded to a source of truth is the exception rather than the rule for individuals and organizations alike, you just end up with a whole lot of ungrounded people and clusters, like in the diagram above.

What does ungroundedness in an individual look like?

Say that as a girl, I have a bad experience with a boy in the third grade.  This boy is a jerk and picks on me.  I develop a story in my head that the male gender is bad and evil.  I start collecting evidence for this point of view.  I look suspiciously at all boys, even after the original bully is out of my life.  While most people may not notice this, bullies can sense my fear and realize that I would be a very reactive person who would be fun to bully.

I end up becoming more or less a magnet for abuse, which reinforces my belief that boys are bad, and that the world is generally an unfriendly and cruel place.  Of the many people I meet over the course of my life, I am especially intrigued by those who have a similar experience to my own, and I become a magnet attracting people who have been victimized as well.  I have less resonance with people who have not had a similar experience, so proportionally I have very few of them who become good friends and stay in my life.  

I have created a situation where it seems true, in my ungrounded world, that most of the men who stay on my radar are horrible, and most of the women, and perhaps a few men, are victims.  Since I am surrounded by evidence for my point of view, if I am not actively seeking other hypotheses, I can live out my entire life in this alternate reality, which is very different from the reality many other people live in.

My belief cluster might include something like this:

men are mean <-> men pick on me <-> men who aren’t mean get victimized <-> almost all women are victims <-> life is not fun <-> human beings are shitty <-> the world is an unsafe place

How is this not grounded?

Here are some questions not answered by this belief chain which point at some missing chain links:

How and why is it that we evolved to be this way?  Why do movies and books which are popular with  both genders portray men with admirable qualities such as kindness, loyalty and love?  Why is there charity?  How is it that there are many women in powerful positions in organizations?    

As another example of an ungrounded life experience and perspective, I will give one on the other end of this spectrum.  Say that I am “the good kid.”  I am the straight A student, who the teachers all love.  I am shielded from negativity, and my creative and intellectual talents are nurtured.  I don’t notice the kids being picked on, but I do notice other kids like me.  I am surrounded by people who are excelling.  I see how we are special, and I feel very special and important.  I don’t understand why everyone is not as talented as I am, but I know that I am special and deserving, and I go through life feeling this way - seeking out the next opportunity, and not having much compassion for those who are not as talented as I am.  I don’t understand why they don’t just get out of their drama, but it isn’t my problem.  

Here are some beliefs that might be in this person’s cluster:

life is easy <-> most people are stupid <-> most people are lazy <-> I am special <-> I am talented <-> I deserve to be treated especially well because I am a productive society member <-> I only spend time with other special people <-> people who are lazy and don’t have their act together are inferior and don’t deserve my attention <-> I am more important than other people due to my high productivity <-> I deserve to get what I want <-> other people should give me what I want

Here are some questions which point at some missing chain links for this person:

How has society been built up to the point where life is easy for me?  Why is it that it is not easy for everyone?  Is there an inherent reason why I am more deserving than others, that other people should feel the same about?  If so, what is this reason that would be clear to everyone if they were to understand it?  That is: does my life make theirs significantly better?  What goes on in the minds and feelings of other people?  How do other people contribute to my life?  How has the world been created in a way that I have access to everything I need to live the life I have now?

Both of those belief patterns are self-reinforcing, with a lot of interconnecting evidence for the person experiencing them to feel that they have a complete and stable model of reality.

However, people with either of these patterns can go through very big shifts if they happen to get jostled out of their self-reinforcing patterns.  For example, if someone in the second example who has had an easy life ends up going through a hardship such as a natural disaster, that could result in a radical shift of belief.  Suddenly things are no longer easy for this person, and they start to struggle themselves and understand why it is that others struggle.  Or perhaps someone who has had everything come easily ends up with a brain tumor and partial paralysis.  Their belief system about being superior because of high levels of productivity is likely to shift, and likely in ways not anticipated by the individual.  They could just as easily adopt a new belief system in which they are no longer superior as one in which it is now a partially paralyzed person who is superior rather than a highly productive one.  The woman in the first example may eventually find a good counselor who teaches her how to bring nice men, who don't share her victim mentality, into her life.

If a group or individual is truly grounded in sources of truth, then their belief system should be stable.  If the belief system shifts radically, this is a sign that at least in the past, it was not well grounded in a source of truth.  Belief systems held by living people will of course shift as they learn and grow, but the shifts will be mostly in the branches, not pulling up and moving the roots.  

3.   Belief chains can be corrupted by power-hungry individuals and opportunists.  In addition to naturally occurring disruptions in belief chains, someone very strategic can get ahold of a belief cluster or chain and rework it to their satisfaction.  This is why people are suspicious of marketers, people selling things, and anything that feels at all manipulative.

Unfortunately, with the advance of the information age, savvy manipulators have also gotten access to more information and are much smarter and better at evading detection.  

General ignorance in the area of belief chains makes most of society easy targets for smart manipulators with even a modest, above-average level of awareness.  Overconfident and ignorant people make very easy targets, since they do not take precautions to protect themselves.

We have all seen countless movies with very complex manipulation plots, and complex manipulation does happen in the real world, in addition to the more common simple manipulation.

Is this a reason to be paranoid?  

No.  It is a reason to be grounded in reality.  Take appropriate precautions.

Bad things do happen.  They range from an annoying marketer taking advantage of you, to a friend lying to you, to shoplifting, to rape and murder, to genocide.  That is part of reality.  While it is best to enjoy life and not live in fear of these things, it is also best not to pretend they don’t exist.

What are a couple of examples of belief chains and clusters being strategically manipulated in a way that hurts the belief chain host?  

- Companies using expert marketing that resonates with you on a fundamental level but has nothing to do with what they are selling, in order to get you to spend your money on products you don’t need, when you could have used that money to improve your life.

- Individuals convincing you to spend your time and energy helping them solve their problems, when persistently having these problems is their pattern of existence and they don’t give back to you or anyone else.

What can I do about all this to correct my biases and live rationally?  

1.  The more you think about the concept of belief chains and about tracing your beliefs to a source of truth, the more you will actually do so.1,2,3,4

According to these findings, perhaps simply reading an explanation of why having an ungrounded belief system is not healthy will increase your chance of fixing it as much as any other intervention.  

2.  Have fun with this.  The more you enjoy the process of truth seeking, the more you will do it.5

3.  Make it part of your mental model that all people including yourself are using a belief system that is fundamentally based on trust, and connections that have not been traced all the way to fundamental truth.  This will reduce your bias toward overconfidence both in your own beliefs and those of other people.  

1  http://hbswk.hbs.edu/item/7509.html

2  http://mina.education.ucsb.edu/janeconoley/ed197/documents/sheldonincreaseandsustainpositiveemotion.pdf

3  http://www.psychologytoday.com/blog/flourish/200912/seeing-is-believing-the-power-visualization

4  http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3240747/

5  http://www.amazon.com/Dont-Shoot-Dog-Teaching-Training/dp/1860542387/ref=sr_1_1?ie=UTF8&qid=1415223754&sr=8-1&keywords=don%27t+shoot+the+dog

I am an online anxiety and productivity specialist, and what I have written here was inspired by my experience in that work.  If you would like to learn more, here is my website.

13 comments

The FDA does extensive research

In most cases it doesn't. In most cases the FDA tells companies to do research and then show the FDA the results of that research so that the FDA approves the product.

Okay, so the FDA is one step removed, and is reviewing the research rather than doing it themselves.

To be fair, you simply spoke about the cluster of beliefs that many people have, and many people probably do believe that the FDA does extensive research on its own.

Issues like the fraud at Ranbaxy aren't well known.

Correct.

Specifically, around 10 percent of drugs which make it to Phase I clinical trials eventually get approved by the FDA, and many more drug candidates fail before reaching Phase I clinical trials. So, the FDA has to very carefully reject the vast majority of drugs which pharmaceutical companies attempt to get approved. Since most of the drugs which are submitted for FDA approval don't meet the FDA standards, it is not surprising that some drugs with negative side-effects get approved.

Excellent piece! Very clear and interesting, with some ideas I either haven't seen before or that didn't register with me.

One of my examples of an inadequate belief is the common statement about the US Civil War that "the South seceded". This is true, but I read (half of) Confederate Reckoning, which described the amount of politicking it took to get a bunch of states with different economies and interests to join together to make secession happen.

Another premise in regards to the FDA is the belief that a US government organization is more trustworthy than similar regulatory organizations from other governments.

An underlying premise in your piece is that verbal thought is a good way to get at how people approach the world.

Very useful and instructive post. I would like to comment that one of the biggest tests (or so it seems to me) to check whether a belief chain is valid or not is the test of resistance to arbitrary changes.

You write that systems like [I was abused] <-> [people are meanies] <-> [life is horrible] are stable, and this is why people believe them: because they seem sound under their own reasoning. But they are inherently not stable, because they are not connected to the unshakable foundation of the source of truth (reality)!

Suppose you apply my test and arbitrarily change one of the beliefs. Let's say I decide to change the belief [I was abused] to [I was not abused] (which is an entirely plausible viewpoint to hold, unless you think that everyone is abused). In that case, the whole chain falls apart, because if you were not abused, then it no longer follows that people are meanies, which in turn implies a possibly non-terrible world. And therefore the system is only stable on the surface. A house is not called solid if it can stand up; it is called solid if it can stand rough weather (arbitrary changes) without falling.

Let's look at the truthful chain [Laws of physics exist] <-> [Gravity exists] <-> [If I jump, I will not float]. In this case we can arbitrarily change the value of ANY belief and still have the chain repair itself. Let's say I claim that the LAWS OF PHYSICS ARE FALSE. In that case I would merely say, "Gravity - supported by the observation that jumping people fall - proves, or at least very strongly evidences, the existence of a system of rules that governs our universe," and from there work out the laws of physics from basic principles at caveman level. It might take a long time, but theoretically it holds.

Now, if I say that gravity does not exist, a few experiments with the laws of physics -> gravity will prove me wrong. And if I decide to claim that if I jump, I will not fall, gravity, supported by the laws of physics, thinks otherwise (and enforces its opinion quite sharply).

The obvious assumption here is that there is a third person saying "these things are false" in the second example, as opposed to god making a change in the first. But the key point is that actually stable (or true) belief chains cannot logically support such a random change without auto-repairing themselves. It is impossible to imagine the laws of physics existing as they are and yet gravity being arbitrarily different for some reason. The truth of the belief chain holds all the way from the laws of physics down to quantum mechanics, which is the finest-grained description of reality we have found so far.

It seems clear to me that the ability to survive and repair an arbitrary change is what differentiates true chains from bad ones.
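
A minimal Python sketch of this perturbation test, under toy assumptions (the belief names, the derivation rules, and the observation below are invented stand-ins, not real physics): a grounded chain keeps a derivation path back to observation, so a flipped belief can be re-derived.

```python
# Toy sketch of the "arbitrary change" test. All rules and names here are
# invented stand-ins: a grounded chain keeps a derivation path back to an
# observation, so any belief that gets flipped can be re-derived.

observations = {"jumping_people_fall": True}  # the source of truth

def rederive(belief):
    """Re-derive a belief from the observation at the bottom of the chain."""
    if belief == "if_i_jump_i_will_fall":
        return observations["jumping_people_fall"]
    if belief == "gravity_exists":
        return rederive("if_i_jump_i_will_fall")
    if belief == "laws_of_physics_exist":
        return rederive("gravity_exists")
    raise ValueError(f"no derivation path for {belief}")

beliefs = {b: True for b in
           ("laws_of_physics_exist", "gravity_exists", "if_i_jump_i_will_fall")}

beliefs["gravity_exists"] = False                        # the arbitrary change
beliefs["gravity_exists"] = rederive("gravity_exists")   # chain self-repairs
print(beliefs["gravity_exists"])                         # True

# An ungrounded chain has no such path: flip [I was abused] and nothing
# at the bottom can restore or refute it, so the whole cluster just moves.
```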

27chaos, that is a very interesting paper, and I thank you for the find. It's actually quite a happy coincidence, as neural networks (having been prompted by the blegg sequence) were next on my to-study list. Glad to be able to add this paper to my queue.

I like the holopeisis of this idea and how it analyzes belief and then gives it greater structure and meaning. I'm impressed and it was worth the time.

Great post.

Belief chains can be corrupted by power hungry individuals and opportunists. ... Unfortunately, with the advance of the information age, savvy manipulators have also gotten access to more information and are much smarter and better at evading detection.

This is what I keep telling people about the information traces they leave behind.

This is why I think that we do already have a lot of Unfriendly Natural Intelligence perverting (sorry for the strong word) our complex-but-not-quite-information-age-ready values.

I think the examples you use are overly political and may make some readers feel uncomfortable.

The psychological stuff (in particular the woman who was attracting bullies) made me feel uncomfortable, but that's not a good enough reason for Shannon to not include such material.

If the political material (which I didn't think was inflammatory) actually leads to the comments blowing up, then I'll concede that you had a point.

When talking about the impacts of complex systems, it is useful to pick one that people know, so as to not have to spend a whole lot of words giving background explanation.

I could not think of an example to use for this that was not at all political. I do not think it being slightly political outweighs the value of the description.

Do you prefer only examining elements on a small enough scale that you can get close to perfection in comfort and freedom from error (and what margin of error is acceptable to you, since perfection is generally not achievable)? Or do you prefer to consider some things that are uncomfortable, if examining those areas might pay off highly in improving your rationality skills?