Belief Chains
A belief is an acceptance that a statement is true or that something exists. As aspiring rationalists, we strive for our beliefs to be true, accurate, and minimally biased.
You seldom see a single belief floating around; beliefs tend to group into clusters and chains. For example, if I believe that I am turning my thoughts into written words right now, that is not an isolated belief. My belief chain might look something like this:
I have sight -> The image coming into my eyes is of something that is metallic with bright lights and little boxes -> It is similar to things that have been called “computers” before -> I am wiggling my fingers to make patterns -> this is called typing -> I am typing on a computer -> the words I am thinking are being translated into writing.

Why does it matter whether I see my beliefs as chains or whether I simply look at the highest-level belief, such as “the words I am thinking are being translated into writing”?
It matters because at each link in the chain of belief, there is potential for falsehood to be introduced. The further I am away from the source of my high-level belief, the less likely my high-level belief is to be accurate.
Say, for example, that a three-year-old is typing on a toy computer that does not have the standard typing functionality of mine. They could still have the same logic chain that I used:
I have sight -> The image coming into my eyes is of something that is metallic with bright lights and little boxes -> It is similar to things that have been called “computers” before -> I am wiggling my fingers to make patterns -> this is called typing -> I am typing on a computer -> the words I am thinking are being translated into writing.
The chain is identical, but the conclusion is false: the falsehood slipped in at the link where the toy was matched to things that had been called “computers” before.
Belief chains can be corrupted in many ways. Here are a few:
1. Our intuitions tell us that the more interconnecting beliefs we have, and the more agreement between different beliefs, the more likely they are to be true, right? We can check them against each other and use them as confirming evidence for one another.
These interconnections can come from the beliefs we have accumulated in our own minds, and also from trust relationships with other people. We use interconnecting beliefs from other people just as we use interconnecting beliefs in our own minds. While not good or bad in and of itself, the downside of this system of validation is that it leaves us vulnerable to the various types of groupthink.
This is easiest to talk about with a diagram. In these diagrams, we are assuming that truth (yellow T circles) comes from a source at the bottom of the diagram. Beliefs not originating from truth are labeled with a (B). As aspiring rationalists, we want truth.
What is truth?
Truth is a description reflecting the underlying fundamental structure of reality. That reality does not change regardless of the perspective from which you view it. As an example, "I think, therefore I am" is something most people agree is obviously a truth. Most people agree that the laws of physics, in some version, are truths.
What is a source of truth?
A source of truth is the bottom level of stuff that composes whatever you're talking about. If you're programming, the data you're manipulating breaks down into binary 0s and 1s. But in order to let you handle it faster and more intuitively, it's assembled into layers upon layers of abstracted superstructures, until you're typing nearly English-like code into a preexisting program, or drawing a digital picture with a tablet pen in a very analog-feeling way. Working directly with the source all the time isn't a good idea - in fact, it's usually unfeasible - and most problems with a higher-level abstraction shouldn't be patched by going all the way down. But if you utterly disconnect from the fact that computers are in binary under their GUIs, or that no compass and paper can create a genuinely equation-perfect circle, or that physics isn't genuinely Newtonian under the hood - you'll have nowhere to backtrack to if it turns out there was a wrong turn in your reasoning. You won't be able to sanity-check if you tell yourself a long twisty story about human motivations and "shoulds" and then come up with an action to take on that basis.
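To make the layering concrete, here is a minimal sketch (my own illustration, not from the original post) that peels one abstraction layer off an ordinary string to show the 0s and 1s underneath:

```python
# A toy illustration: one step down from a high-level string toward the
# binary the computer actually stores (ASCII encoding assumed).
text = "circle"
code_points = [ord(ch) for ch in text]          # characters -> numbers
bits = [format(n, "08b") for n in code_points]  # numbers -> raw 0s and 1s

print(text)         # circle
print(code_points)  # [99, 105, 114, 99, 108, 101]
print(bits)         # ['01100011', '01101001', '01110010', ...]
```

The point is not that you should work at the binary level; it is that the lower layer is still there to backtrack to if your reasoning at a higher layer goes wrong.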
Below is a diagram of a healthy chain of pure true belief originating from a source of truth.
[Diagram: a chain of true beliefs (T) connected to a source of truth at the bottom.]
2. Belief chains can get disconnected from the source of truth. For example, say that there is a group which has based its philosophy on the understanding of a certain physicist. Say that the physicist dies, and that the group continues expanding on that same belief set, although it has never integrated one of the key links that connected the physicist's chain to a source of truth. In this case, you can end up with a cluster of beliefs that looks something like this:
[Diagram: a cluster of beliefs containing some truths (T) and ungrounded beliefs (B), no longer connected to a source of truth.]
You now have a cluster of beliefs that contains some truth but is no longer linked to a source of truth, and that fills in the gaps with ungrounded propositions. This is the sort of situation that leads to high levels of overconfidence, and it is what Alexander Pope referred to when he wrote: “A little learning is a dangerous thing."
What does this metaphor look like in real-world terms?
Information theory and the symmetry of updating beliefs
Contents:
1. The beautiful symmetry of Bayesian updating
2. Odds and log odds: a short comparison
3. Further discussion of information
Rationality is all about handling this thing called "information". Fortunately, we live in an era after the rigorous formulation of Information Theory by C.E. Shannon in 1948, a basic understanding of which can actually help you think about your beliefs, in a way similar but complementary to probability theory. Indeed, it has flourished as an area of research exactly because it helps people in many areas of science to describe the world. We should take advantage of this!
The information theory of events, which I'm about to explain, is about as difficult as high school probability. It is certainly easier than the information theory of multiple random variables (which right now is explained on Wikipedia), even though the equations look very similar. If you already know it, this can be a linkable source of explanations to save you writing time :)
So! To get started, what better way to motivate information theory than to answer a question about Bayesianism?
The beautiful symmetry of Bayesian updating
The factor by which observing A increases the probability of B is the same as the factor by which observing B increases the probability of A. This factor is P(A and B)/(P(A)·P(B)), which I'll denote by pev(A,B) for reasons to come. It can vary from 0 to +infinity, and allows us to write Bayes' Theorem succinctly in both directions:
P(A|B)=P(A)·pev(A,B), and P(B|A)=P(B)·pev(A,B)
What does this symmetry mean, and how should it affect the way we think?
A great way to think of pev(A,B) is as a multiplicative measure of mutual evidence, which I'll call mutual probabilistic evidence to be specific. If pev=1, the two events are independent; if pev>1, they make each other more likely; and if pev<1, they make each other less likely.
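If it helps to see the symmetry with concrete numbers, here is a small sketch using made-up probabilities (the values are mine, chosen only for illustration):

```python
# Made-up example probabilities for two events A and B.
p_a = 0.30    # P(A)
p_b = 0.20    # P(B)
p_ab = 0.12   # P(A and B)

# Mutual probabilistic evidence: pev(A,B) = P(A and B) / (P(A)·P(B)).
pev = p_ab / (p_a * p_b)   # = 2.0, so A and B make each other more likely

# Bayes' Theorem in both directions, written with pev.
p_a_given_b = p_ab / p_b   # = 0.6
p_b_given_a = p_ab / p_a   # = 0.4
assert abs(p_a_given_b - p_a * pev) < 1e-12   # P(A|B) = P(A)·pev(A,B)
assert abs(p_b_given_a - p_b * pev) < 1e-12   # P(B|A) = P(B)·pev(A,B)
```

Observing B doubles the probability of A (0.30 to 0.60), and observing A doubles the probability of B (0.20 to 0.40): the same factor, pev(A,B) = 2.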
But two ways to think are better than one, so I will offer a second explanation, in terms of information, which I often find quite helpful in analyzing my own beliefs:
My Failed Situation/Action Belief System
Note: This is a description pieced together many, many years after my younger self subconsciously created it. This is part of my explanation of how I ended up me. I highly doubt all of this was as neatly defined as I present it to you here. Just know: The me in this post is me between the age of self-awareness and 17 years old. I am currently 25.
An action-based belief system asks what to do when given a specific scenario. The input is Perceived Reality and the output is an Action. Most of my old belief system was built from such beliefs. A quick example: if the stop light is red, stop before the intersection.
These beliefs form a network of really complicated chains of conditionals:
- If the stop light is red
- And you are not stopped
- Stop in the next available space before the intersection
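Written out as code, one of these situation/action beliefs might look roughly like this (a sketch of my own, not the author's actual system): a function from perceived reality to an action.

```python
# A rough sketch of one situation/action belief: perceived reality in,
# action out. The scenario names are illustrative only.
def stop_light_belief(light_is_red: bool, already_stopped: bool) -> str:
    if light_is_red:
        if not already_stopped:
            return "stop in the next available space before the intersection"
        return "stay stopped"
    return "no action from this belief"

print(stop_light_belief(light_is_red=True, already_stopped=False))
```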