Less Wrong is a community blog devoted to refining the art of human rationality.
The estimated Bayesian probability has nothing to do with the coin itself. If it did, assigning a probability of 0.5 to one of the two possible outcomes would necessarily be incorrect, because one of the few things we know about the coin is that it's not fair.
The estimate is of our confidence in using that outcome as an answer. "How confident can I be that choosing this option will turn out to be correct?" We know that the coin is biased, but we don't know which outcome is more likely. As far as we know, then, guessing one way is as good as guessing the other.
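This symmetry can be sketched with a small simulation. The setup below is an assumption of mine, not something specified in the comment: it models "biased, but we don't know which way" as a bias drawn from a symmetric prior (uniform on [0, 1]) on each trial. Under any such symmetric prior, a fixed guess is correct about half the time.

```python
import random

random.seed(0)

trials = 100_000
heads = 0
for _ in range(trials):
    # Unknown-to-us bias; the uniform prior is symmetric around 0.5.
    bias = random.random()
    # One flip of a coin with that bias.
    if random.random() < bias:
        heads += 1

# Averaged over our uncertainty about the bias, "heads" comes up
# close to half the time, so 0.5 is the justified confidence in
# either guess, even though the coin itself is never fair.
print(heads / trials)
```

The printed frequency lands near 0.5: the 0.5 describes our state of knowledge about which guess will succeed, not a physical property of any particular coin.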
The sides of the coin do have an actual probability associated with them, which is why it's wrong to claim that any particular outcome is the more likely one. That's a statement about the coin itself, and the available data can't justify it. Without knowing more about the coin, we can't speak about it. We can only speak to our confidence and how well the data we have justifies it.
Incidentally, the assertion that uncertainty is not an aspect of reality goes far beyond what anyone can justify; it's an example of gross overconfidence in one's own opinions.