Another month has passed and here is a new rationality quotes thread. The usual rules are:
- Please post all quotes separately, so that they can be upvoted or downvoted separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
- Do not quote yourself.
- Do not quote from Less Wrong itself, Overcoming Bias, or HPMoR.
- No more than 5 quotes per person per monthly thread, please.
Part 1:
The idea of having a "true probability" can be extremely misleading. If I flip a coin but don't look at it, I may call it a 50% probability of tails, but reality is sitting right there in my hand with probability 100%. The probability is not in the external world - the coin is already heads or tails. The probability is just 50% because I haven't looked at the coin yet.
What sometimes confuses people is that there can be things in the world that we often think of as probabilities, and those can have a true value. For example, if I have an urn with 30 black balls and 70 white balls, and I pull a ball from the urn, I'll get a black ball about 30 times out of 100. This isn't "because the true probability is 30%" - that explanation just posits a new fundamental property which itself needs explaining. It's because the urn is 30% black balls, and I hadn't looked at where all the balls were yet.
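A quick sketch of the urn point in Python (the 100,000-draw count and drawing with replacement are my choices for illustration): the long-run frequency falls out of the urn's composition, with no extra "true probability" property needed.

```python
import random

# Urn with 30 black and 70 white balls; draw with replacement many times.
urn = ["black"] * 30 + ["white"] * 70
draws = 100_000
black_draws = sum(1 for _ in range(draws) if random.choice(urn) == "black")

# The long-run frequency reflects the urn's composition - nothing in the
# code assigns a "true probability" to any individual draw.
frequency = black_draws / draws
print(f"fraction of black balls drawn: {frequency:.3f}")  # close to 0.30
```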
Using probabilities is an admission of ignorance, of incomplete information. You don't assign the coin a probability because it's magically probabilistic, you use probabilities because you haven't looked at the coin yet. There's no "true probability" sitting out there in the world waiting for you to discover it, there's only a coin that's either heads or tails. And sometimes there are urns with different mixtures of balls, though of course if you can look inside the urn it's easy to pick the ball you want.
Part 2:
Okay, so there's no "externally objective, realio trulio probability" to compare our priors to, so how about asking how much our probability will move after we get the next bit of information?
Let's use some examples. Say I'm taking a poll, and I want to know the probability that people will vote for the Purple Party. So I ask 10 people. Now, 10 is a pretty small sample size, but say 3 out of 10 will vote for the Purple Party. So I estimate that the probability is a little more than 3/10. Now, the next additional person I ask will cause me to change my probability by about 10% of its current value. But after I poll 1000 people, asking the next person barely changes my probability estimate. Stability!
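The shrinking-update claim can be sketched numerically. Here I use Laplace's rule of succession, estimate = (successes + 1) / (n + 2), as one concrete (and hypothetical - the comment doesn't specify it) choice of estimator:

```python
# Laplace's rule of succession: a smoothed frequency estimate.
def estimate(successes: int, n: int) -> float:
    return (successes + 1) / (n + 2)

# After 10 respondents, 3 of whom support the Purple Party,
# one more "yes" moves the estimate noticeably:
before_small = estimate(3, 10)
after_small = estimate(4, 11)
shift_small = abs(after_small - before_small)

# After 1000 respondents with 300 supporters, the same new "yes"
# barely moves it:
before_big = estimate(300, 1000)
after_big = estimate(301, 1001)
shift_big = abs(after_big - before_big)

print(shift_small, shift_big)  # the per-respondent shift shrinks roughly like 1/n
```

The exact estimator doesn't matter much; any reasonable rule gives updates that shrink on the order of 1/n, which is the stability being described.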
This actually works pretty well.
If you wanted to split up your hypothesis space about the poll results into mutually exclusive and exhaustive pieces (which is generally a good idea), you would have a million different hypotheses, because there are a million (well, 1,000,001) different possible numbers of Purple Party supporters. So for example there would be separate hypotheses for 300,000 Purple Party supporters vs. 300,001. Giving each of these hypotheses its own probability is sufficient to talk about the kind of stability you want. If the probabilities are concentrated on a few possible numbers, then your poll is really stable.
And a good thing that it works out, because the probabilities of those million hypotheses are all of the information you have about this poll!
Note that this happens without any mention of "true probability." We chose those million hypotheses because there are realio trulio a million different possible answers. A narrow distribution over these hypotheses represents certainty not about some true probability, but about the number of actual people out in the actual world, wearing actual purple.
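Here is a minimal sketch of that distribution-over-counts picture, scaled down to a toy population of 1,000 people instead of a million (the population size, sample size, and uniform prior are all my assumptions for illustration):

```python
from math import comb

# Hypotheses: "exactly s of the N people support the Purple Party",
# for s = 0 .. N. These are mutually exclusive and exhaustive.
N = 1000
prior = [1 / (N + 1)] * (N + 1)  # uniform prior over the N + 1 hypotheses

# Poll data: k = 30 supporters in a sample of n = 100 (sampling with
# replacement, a simplifying assumption). Likelihood under each hypothesis:
n, k = 100, 30
likelihood = [
    comb(n, k) * (s / N) ** k * (1 - s / N) ** (n - k) for s in range(N + 1)
]

# Bayesian update, then normalize.
posterior = [p * l for p, l in zip(prior, likelihood)]
total = sum(posterior)
posterior = [p / total for p in posterior]

# The posterior concentrates around 300 supporters: certainty about an
# actual count of actual people, with no "true probability" in sight.
mode = max(range(N + 1), key=lambda s: posterior[s])
print(mode)
```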
So thank goodness a probability distribution over the external possibilities is all ya' need, because it's all ya' got in this case.
Thanks, the "true probability" phrasing was misleading; I should've reread my comment before submitting. Probability is in the mind, etc. What I referred to was "the probability you'd eventually end up with, having incorporated all relevant information: the limit", which is still in your mind, but as close to "true" as you'll get.
So you can of course say Pr(Box is empty | I saw it's empty) = x and Pr(Box is empty | I saw it's empty and I got to examine its inner surfaces with my hand) = y, then list all similar hypotheses abou...