I present here two puzzles of rationality that you LessWrongers may think are worth dealing with. Maybe the first one looks more amenable to a simple solution, while the second has attracted the attention of a number of contemporary epistemologists (Cargile, Feldman, Harman) and does not look so simple when it comes to a solution. So, let's go to the puzzles!
Puzzle 1
At t1 I justifiably believe theorem T is true, on the basis of a complex argument I have just validly reasoned through from the likewise justified premises P1, P2 and P3.
So, in t1 I reason from premises:
(R1) P1, P2, P3
To the justified conclusion:
(T) T is true
At t2, Ms. Math, a well-known authority on the subject matter of which my reasoning and my theorem are just a part, tells me I’m wrong. She tells me the theorem is just false, and convinces me of that on the basis of valid reasoning with at least one false premise, the falsity of that premise being unknown to us.
So, in t2 I reason from premises (Reliable Math and Testimony of Math):
(RM) Ms. Math is a reliable mathematician, and an authority on the subject matter surrounding (T),
(TM) Ms. Math tells me T is false, and shows me how that is so, on the basis of valid reasoning from F, P1, P2 and P3,
(R2) F, P1, P2 and P3
To the justified conclusion:
(~T) T is not true
Some epistemologists would say that (~T) defeats my previous belief (T). Is it rational for me to proceed this way? Am I taking the correct direction of defeat? Wouldn’t it also be rational if (~T) were defeated by (T)? Why does (~T) defeat (T), and not vice versa? Is it just because (~T)’s justification was obtained at a later time?
Puzzle 2
At t1 I know theorem T is true, on the basis of a complex argument I have just validly reasoned through from the known premises P1, P2 and P3. So, in t1 I reason from known premises:
(R1) P1, P2, P3
To the known conclusion:
(T) T is true
Besides, I also reason from known premises:
(ME) If there is any evidence against something that is true, then it is misleading evidence (evidence for something that is false)
(T) T is true
To the conclusion (anti-misleading evidence):
(AME) If there is any evidence against (T), then it is misleading evidence
At t2 the same Ms. Math tells me the same thing. So in t2 I reason from premises (Reliable Math and Testimony of Math):
(RM) Ms. Math is a reliable mathematician, and an authority on the subject matter surrounding (T),
(TM) Ms. Math tells me T is false, and shows me how that is so, on the basis of valid reasoning from F, P1, P2 and P3,
But then I reason from:
(F*) F, RM and TM are evidence against (T), and
(AME) If there is any evidence against (T), then it is misleading evidence
To the conclusion:
(MF) F, RM and TM are misleading evidence
And then I continue to know T and I lose no knowledge, because I know/justifiably believe that the counter-evidence I have just encountered is misleading. Is it rational for me to act this way?
I know (T) and I know (AME) in t1 on the basis of valid reasoning. Then I am exposed to the misleading evidence (Reliable Math), (Testimony of Math) and (F). The evidentialist scheme (and maybe other schemes as well) supports the thesis that (RM), (TM) and (F) DEFEAT my justification for (T) instead, so that whatever I inferred from (T) is no longer known. However, given my previous knowledge of (T) and (AME), I could know that (MF): F is misleading evidence. Can it still be said that (RM), (TM) and (F) DEFEAT my justification for (T), given that (MF) DEFEATS my justification for (RM), (TM) and (F)?
I second Manfred's suggestion about the use of beliefs expressed as probabilities.
In puzzle (1) you essentially have a proof for T and a proof for ~T. We don't wish the order in which we're exposed to the evidence to influence us, so the correct conclusion is that you should simply be confused*. Thinking in terms of "Belief A defeats belief B" is a bit silly, because you then get situations where you're certain T is true, and the next day you're certain ~T is true, and the day after that you're certain again that T is true after all. So should beliefs defeat each other in this manner? No. Is it rational? No. Does the order in which you're exposed to evidence matter? No.
In puzzle (2) the subject is certain a proposition is true (even though he's still free to change his mind!). However, accepting contradicting evidence leads to confusion (as in puzzle 1), and to mitigate this the construct of "Misleading Evidence" is introduced that defines everything that contradicts the currently held belief as Misleading. This obviously leads to Status Quo Bias of the worst form. The "proof" that comes first automatically defeats all evidence from the future, therefore making sure that no confusion can occur. It even serves as a Universal Counterargument ("If that were true I'd believe it and I don't believe it therefore it can't be true"). This is a pure act of rationalization, not of rationality.
*) meaning that you're not completely confident of either T or ~T.
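To make the order-independence point concrete, here is a minimal sketch in odds form; the specific likelihood ratios are purely illustrative assumptions, not anything specified in the puzzles.

```python
# A minimal sketch of the order-independence point above.
# Beliefs are odds in favour of T; each piece of evidence multiplies the odds
# by its likelihood ratio. All numbers below are illustrative assumptions.

def update(odds, likelihood_ratio):
    """Update the odds on T by one piece of evidence, given its likelihood ratio."""
    return odds * likelihood_ratio

prior_odds = 1.0        # start indifferent about T (assumption)
my_proof = 20.0         # my own derivation: strong evidence for T (assumption)
expert_testimony = 0.1  # Ms. Math's testimony: strong evidence against T (assumption)

# Order 1: my proof first, then the testimony.
odds_1 = update(update(prior_odds, my_proof), expert_testimony)

# Order 2: the testimony first, then my proof.
odds_2 = update(update(prior_odds, expert_testimony), my_proof)

# Same posterior either way: neither piece of evidence "defeats" the other,
# and the order of exposure makes no difference.
assert abs(odds_1 - odds_2) < 1e-12
print(odds_1, odds_1 / (1 + odds_1))  # 2.0 odds, i.e. roughly 0.67 probability of T
```

With these (made-up) numbers, either order leaves you at about 0.67 confidence in T: confident of neither T nor ~T, which is exactly the "confused" state described in the comment above.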
Thank you, Zed. You are right: I didn't specify the meaning of 'misleading evidence'. It means evidence to believe something that is false (whether or not the cognitive agent receiving such evidence knows it is misleading). Now, maybe I'm missing something, but I don't see any silliness in thinking in terms of "belief A defeats belief B". On the basis of experiential evidence, I believe there is a tree in front of me. But then I discover I have been drugged with LSD (a friend of mine put it in my coffee earlier, unbeknownst to me). This ne...