Cross-Posted on By Way of Contradiction
As you may know from my past posts, I believe that probabilities should not be viewed as uncertainty, but instead as weights on how much you care about different possible universes. This is a very subjective view of reality. In particular, it seems to imply that when other people have beliefs different from mine, there is no sense in which they can be wrong; they just care about the possible futures with different weights than I do. I will now try to argue that this is not a necessary conclusion.
First, let's be clear about what we mean by saying that probabilities are weights on values. Imagine I have an unfair coin which gives heads with probability 90%. I care 9 times as much about the possible futures in which the coin comes up heads as I do about the possible futures in which the coin comes up tails. Notice that this does not mean I want the coin to come up heads. What it means is that I would prefer getting a dollar if the coin comes up heads to getting a dollar if the coin comes up tails.
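To make the arithmetic concrete, here is a minimal sketch (the dollar payoff and the code itself are my illustration, not part of the original setup) of how the 9-to-1 caring weights make "a dollar if heads" preferable to "a dollar if tails" without making heads itself desirable:

```python
# Caring weights over the two kinds of futures, from the 90% coin.
care = {"heads": 0.9, "tails": 0.1}

def value_of_dollar_if(outcome):
    """How much I value a dollar that I only receive in the futures where
    `outcome` happens: the payoff is weighted by how much I care about
    those futures."""
    return care[outcome] * 1.0

print(value_of_dollar_if("heads"))  # 0.9
print(value_of_dollar_if("tails"))  # 0.1
# The heads-contingent dollar is worth 9 times as much to me, even though
# the coin landing heads is not, by itself, something I assign value to.
```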
Now, imagine that you are unaware of the fact that it is an unfair coin. By default, you believe that the coin comes up heads with probability 50%. How can we express, in the language of values, the fact that I have a correct belief and you have an incorrect one?
We will take advantage of the language of terminal and instrumental values. A terminal value is something that you try to get because you want it. An instrumental value is something that you try to get because you believe it will help you get something else that you want.
If you believe a statement S, that means that you care more about the worlds in which S is true. If you terminally assign a higher value to worlds in which S is true, we will call this belief a terminal belief. On the other hand, if you believe S because you think that S is logically implied by some other terminal belief, T, we will call your belief in S an instrumental belief.
Instrumental values can be wrong: you can be factually mistaken about whether the instrumental value actually helps achieve your terminal values. Similarly, an instrumental belief can be wrong if you are factually mistaken about whether it really is implied by your terminal belief.
Your belief that the coin will come up heads with probability 50% is an instrumental belief. You have a terminal belief in some form of Occam's razor. This causes you to believe that coins are likely to behave similarly to how coins have behaved in the past. In this case, that inference was not valid, because you did not take into consideration the fact that I chose the coin for the purpose of this thought experiment. Your instrumental belief is, in this case, wrong. If your belief in Occam's razor is terminal, then it is not possible for Occam's razor to be wrong.
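As a toy illustration of how this instrumental belief goes wrong (the two-hypothesis setup and all of the numbers below are my own invention, just to make the structure visible): the Occam-style terminal prior says the coins you run into are almost always ordinary, which yields roughly the 50% belief; conditioning on the fact that the coin was picked for the thought experiment breaks that derivation.

```python
# A toy sketch with made-up numbers: an instrumental belief derived from an
# Occam-style terminal prior, and how the derivation fails once you account
# for how the coin was chosen.

# Hypotheses about the coin's bias toward heads.
bias = {"ordinary coin": 0.5, "trick coin": 0.9}

# Terminal, Occam-style prior: coins encountered in everyday life are
# overwhelmingly ordinary.
occam_prior = {"ordinary coin": 0.99, "trick coin": 0.01}

# Instrumental belief derived from that prior alone.
p_heads = sum(occam_prior[h] * bias[h] for h in bias)
print(p_heads)  # ~0.504, roughly the 50% belief described above

# But the coin was chosen for the purpose of the thought experiment, which
# screens off the everyday base rate. Redoing the derivation with that fact
# taken into account gives a different answer.
selected_prior = {"ordinary coin": 0.0, "trick coin": 1.0}
p_heads_corrected = sum(selected_prior[h] * bias[h] for h in bias)
print(p_heads_corrected)  # 0.9 -- the instrumental belief was wrong, while
                          # the terminal (Occam) belief need not be.
```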
This is probably a distinction that you are already familiar with. I am talking about the difference between an axiomatic belief and a deduced belief. So why am I viewing it like this? I am trying to strengthen my understanding of the analogy between beliefs and values. To me, they appear to be two different sides of the same coin, and building up this analogy might allow us to translate some intuitions or results from one view into the other view.
Logical uncertainty still gets probabilities, just as it always has. Only indexical uncertainty gets pushed into the realm of values.
(At least for now, while I am thinking about the multiverse as Tegmark 4. I am very open to the possibility that eventually I will believe even logically inconsistent universes exist, and then they would get the same fate as indexical uncertainty.)
In one model I considered, I put Tegmark 4 as the set of universes weighted according to my values, and called the set of different counterfactual universes that other agents might care about Tegmark 5. This was mainly for the purposes of fiction, where it filled a role as a social convention among agents with very different values of this type, but it is an interesting idea of what the concept might look like.
These would, by the way, not necessarily be just quantitatively different weights over the same set of universes. For example, we can imagine that it turns out humans an...