A sufficiently skeptical position is completely immune to criticism, or to any other form of argument. I don't see what anyone could hope to do about that, beyond not bothering arguing with people who profess such extreme skepticism.
(I am reminded of a little fable I think I saw in an old OB post. Human space travellers encounter an alien planet whose inhabitants have adopted an anti-inductive principle, with the unsurprising result that pretty much everything they do is miserably unsuccessful. The humans ask them "So why do you keep on doing this?" and they say "Well, it's never worked for us before...")
Are you familiar with Sextus Empiricus? If you like intransigent skepticism, you'll love him. And the SEP just published a new entry on him! While you are at it, you might want to look at this entry on a priori justification.
You are trying to answer Descartes' Evil Daemon argument. That is futile, because the whole point of the argument is to be unbeatable. But suppose you did come up with an argument against it; I can always come up with an even stronger "daemon" or whatnot that can defeat the argument. (There's always the classic "How d
This now has 90 comments, including 2 of my own. None of them are particularly enlightening, IMO, but this is still evidence that it's interesting or amusing. As such, I've retracted my downvote.
If the point is we can't derive the validity of probability from nothing, congratulations. You have rediscovered something significantly less useful than the wheel or fire.
So if you can't derive the validity of probability from nothing, what can you do?
1) Walk around in a self-induced fog of feigned ignorance, having slipped past the fact that you have put logical derivation on a throne dictating "truth" without ever questioning that operation. You certainly can't derive from nothing that logical derivation is the only source of truth.
2) Loo...
how are said axioms to be justified?
This is how I'd answer a sceptic:
If I put two apples into a bag that previously had two apples, I can take four apples out of the bag. Thus, I believe that the axioms on which basic arithmetic is based are "justified". By the same token I believe the axioms of probability, and I'm pretty sure you see a close approximation of a "fair coin" on a daily basis, not to mention more complex behaviors which probability theory predicts very well. If after that you're still skeptical of the correlation, I expect you to...
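The "fair coin" point can even be checked numerically. Here is a minimal sketch (the function name, seed, and flip counts are my own illustrative choices, not anything from the thread): probability theory predicts a heads frequency near 0.5, and simulation bears this out.

```python
import random

def flip_frequency(n_flips, seed=0):
    """Simulate n_flips of a fair coin and return the observed frequency of heads."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    heads = sum(rng.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

# The observed frequency drifts toward the predicted 0.5 as n grows,
# which is exactly the kind of correlation with reality being claimed.
for n in (100, 10_000, 1_000_000):
    print(n, flip_frequency(n))
```

Of course a simulation uses a pseudo-random generator rather than a physical coin, so this illustrates the prediction rather than proving anything to a determined skeptic.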
Why should I take the skeptic seriously?
I cannot picture how I would live my life without coping with uncertainty. And I know that probability follows from various plausible axiomatizations of uncertainty (e.g., Cox's theorem).
This makes me suspect strongly that the skeptic is playing terminological games, since there's no actual substantive thing I could do differently if they convinced me.
Belief in the axioms of probability theory is justified by the fact that someone with inconsistent beliefs can be Dutch-booked.
If you're willing to put money on your beliefs (i.e. bet on them), then you ought to accept the axioms in the first place; otherwise your opponent will always be able to come up with a combination of bets that will cause you to lose money.
This fact was proved by Bruno de Finetti in the 1930s. See e.g. AI: A Modern Approach for an accessible technical discussion.
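To make the Dutch-book point concrete, here is a minimal sketch (the function name and the specific 0.6/0.6 credences are illustrative assumptions of mine, not from de Finetti's proof). An agent who prices a bet at probability times stake, but whose credences in an event and its complement don't sum to 1, buys both bets at prices she considers fair and is guaranteed to lose:

```python
def dutch_book_loss(p_event, p_complement, stake=1.0):
    """
    An agent who treats price = probability * stake as fair will buy a bet
    on the event AND a bet on its complement at those prices. Exactly one
    of the two bets pays out `stake`, whatever happens, so the agent's net
    result is payout minus total price paid.
    Returns the agent's guaranteed profit (negative = guaranteed loss).
    """
    total_price = (p_event + p_complement) * stake
    payout = stake  # exactly one bet wins, regardless of the outcome
    return payout - total_price

# Coherent credences (sum to 1): no guaranteed loss.
print(dutch_book_loss(0.6, 0.4))
# Incoherent credences (sum to 1.2): a loss of 0.2 per unit stake,
# no matter how the event turns out.
print(dutch_book_loss(0.6, 0.6))
```

The sketch only covers the additivity axiom for an event and its complement, but the same style of argument extends to the other probability axioms.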
how you answer a skeptic about ... reality
The standard answer is a punch in the nose. I have yet to meet a claimant to skepticism willing to let me perform this experiment enough times to get a trustworthy result.
Lighter-weight skeptics (those willing to at least tentatively accept some postulates about reality being real, and the validity of predicting future experiences) generally have no problem with "I can't justify these from first principles, but I'm using them until I can think of better".
"the question isn’t how to arrive at the Truth, but rather how to eliminate error. Which sounds kind of obvious, until I meet yet another person who rails to me about how empirical positivism can’t provide its own ultimate justification, and should therefore be replaced by the person’s favorite brand of cringe-inducing ugh." -- Scott Aaronson
what is useful to believe and what is true have no necessary correlation
You seem to be referring to the distinction between instrumental and epistemic rationality. Yes, they are different things. The case I am trying to make does not depend on a conflation of the two, and works just fine if we confine ourselves to epistemic rationality, as I will attempt to show below.
OK, so I think your labeling system, which is clearly different from the one to which I am accustomed, looks like this:
rationality = a set of rules which reliably and necessarily determine truth
and
X is irrational = X does not follow rationality
If that's how you want to use the labels in this thread, fine. But it seems that an agent that believed only things that were known with infinite certainty would suffer from a severe truth deficiency. Even if such an agent managed to avoid directly accepting any falsehoods, she would fail to accept a vast number of correct beliefs. This is because much of the world is knowable--just not with absolute certainty. She would not have a very accurate picture of the world.
And this is not just because of "pragmatics"; even if the only goal is to maximize true beliefs, it makes no sense to filter out every non-provable proposition, because doing so would block too many true beliefs.
Perhaps an analogy with nutrition would be helpful. Imagine a person who refused to ingest anything that wasn't first totally proven to be nutritious. Whenever she was served anything (even if she had eaten the same thing hundreds of times before!), she had to subject it to a series of time-consuming, expensive, and painstaking tests.
Would this be a good idea, from a nutritional point of view? No. For one thing, it would take way too long--possibly forever. And secondly (and this is the aspect I'm trying to focus on), lots of nutritious things cannot be proven so. Is this bite of pasta going to be nutritious? What about the next one? And the one after that? A person who insisted on such a diet would not get many nutrients at all, because so many things would not pass the test (and because the person would spend so much time testing and so little time eating).
Now, how about a person's epistemic diet--does it make sense, from a purely epistemic perspective, for an agent to believe only what she can prove with absolute certainty? No. For one thing, it would take way too long--possibly forever. And secondly, lots of true things cannot be proven so, at least not with the kind of transcendent certainty you seem to be talking about. So an agent who insisted on such a filter would end up blocking much truth, thus "learning" a highly distorted map.
If the agent is interested in truth, she should ditch that filter and find a standard that lets her accept more correct claims about the world, even if they aren't totally proven.
By the way, have you read many of the Sequences? They are quite helpful and much better written than my comments. I'd say to start here. This one and this one also heavily impinge on our topic.
This assumes what the entire thread is about--that probability is a legitimate means for discussing reality. This presumes a lot of axioms of probability, such as that if you see X it is more likely to be real than an illusion, and that induction is valid.
The appeal to absence of many true beliefs is irrelevant, as you have no means to determine truth beyond skepticism.
I've raised arguments for philosophical scepticism before, which have mostly been met in a Popper-esque manner: arguing that even if we don't know anything with certainty, we can have legitimate knowledge of probabilities.
The problem with this, however, is how you answer a sceptic about the notion of probability having a correlation with reality. Probability depends upon the axioms of probability--how are said axioms to be justified? It can't be by definition, or it has no correlation to reality.