Tyrrell_McAllister comments on How to Convince Me That 2 + 2 = 3 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
That you have certain mathematical beliefs has a lot to do with the experiences that you have had. This applies in particular to your beliefs about what the theorems of PA are.
Sorry, I edited the statement in question right before you posted that because I anticipated a similar reaction. However, you're still wrong. It bears only on my beliefs about the extent to which Peano arithmetic applies to reality, which is something completely different.
Edit: Ok, you're probably not wrong, but it rather seems we are talking about different things when we say "mathematical beliefs". Whether Peano arithmetic applies to reality is not a mathematical belief for me.
And another thing: it might be that if Peano arithmetic didn't apply to reality, I wouldn't have any beliefs about Peano arithmetic, because I might never have thought of it. However, there is no way I could establish the Peano axioms and then believe that SS0 + SS0 = SSS0 is true within Peano arithmetic. It's just not possible.
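The claim about SS0 + SS0 can be made concrete. Here is a minimal sketch (not from the thread) of Peano-style addition on successor numerals, assuming the usual recursive definition a + 0 = a and a + S(b) = S(a + b); the function name and string encoding are illustrative choices:

```python
# Peano numerals encoded as strings of "S" applied to "0",
# e.g. "SS0" is 2. Addition follows the standard recursion:
#   a + 0    = a
#   a + S(b) = S(a + b)

def add(a: str, b: str) -> str:
    """Add two Peano numerals written in successor notation."""
    if b == "0":                 # base case: a + 0 = a
        return a
    # b starts with "S", so b = S(b'); recurse on b' = b[1:]
    return "S" + add(a, b[1:])

print(add("SS0", "SS0"))  # "SSSS0" (i.e. 4), never "SSS0"
```

Given the axioms, the computation mechanically yields SSSS0; believing the result is SSS0 would require misapplying the recursion itself.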
Consider the experiences that you have had while reading and thinking about proofs within PA. (The experience of devising and confirming a proof is just a particular kind of experience, after all.) Are you saying that the contents of those experiences have had nothing to do with the beliefs that you have formed about what the theorems of PA are?
Suppose that those experiences had been systematically different in a certain way. Say that you consistently made a certain kind of mistake while confirming PA proofs, so that certain proofs seemed valid to you that in reality are not. Would you not have arrived at different beliefs about what the theorems of PA are?
That is the sense in which your beliefs about what the theorems of PA are depend on your experiences.
I'm not sure I 100% understand what you're saying, but the question "which beliefs will I end up with if logical reasoning itself is flawed" is of little interest to me.
Is the question "Which beliefs will I end up with if my faculty of logical reasoning is flawed" also of little interest to you?
Yes, because if I assume that my faculty of logical reasoning is flawed, no deductions of logical reasoning I do can be considered certain, in which case everything falls: Mathematics, physics, bayesianism, you name it. It is therefore (haha! but what if my faculty of logical reasoning is flawed?) very irrational to assume this.
But you know that your faculty of logical reasoning is flawed to some extent. Humans are not perfect logicians. We manage to find use in making long chains of logical deductions even though we know that they contain mistakes with some nonzero probability.
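The point about long deduction chains can be quantified with a toy model (the numbers here are illustrative assumptions, not from the thread): even a very reliable reasoner accumulates non-trivial error probability over many steps.

```python
# Toy model: each deduction step is correct independently with
# probability p_step. The chance that an n-step proof contains no
# mistake at all is p_step ** n, which decays exponentially in n.

p_step = 0.999   # assumed per-step reliability (illustrative)
steps = 1000     # length of the proof chain

p_proof_ok = p_step ** steps
print(p_proof_ok)  # ~0.368: a 1000-step proof is error-free barely 1 time in 3
```

So "nonzero probability of a mistake somewhere" is not a fringe worry; it dominates as chains get long, which is why mathematicians re-check and mechanize proofs.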
I don't know that. Can you prove that under the assumption you're making?
As I see it, my faculty of logical reasoning is not flawed in any way. The only thing that's flawed is my faculty of doing logical reasoning, i.e. I'm not always doing logical reasoning when I should be. But that's hardly the matter here.
I would be very interested in how you can come to any conclusion under the assumption that the logical reasoning you do to come to that conclusion is flawed. If my faculty of logical reasoning is flawed, I can only say one thing with certainty, which is that my faculty of logical reasoning is flawed. Actually, I don't think I could even say that.
Edit:
I don't consider this to be a problem of actual faculty of logical reasoning because if someone finds a logical mistake I will agree with them.
Sorry for not being clear. By "faculty of logical reasoning", I mean nothing other than "faculty of doing logical reasoning".
In that case I have probably answered your original question here.
So you don't consider mistakes in logical reasoning a problem because someone might point them out to you? What if it's an easy mistake to make, and a lot of other people make the same mistake? At this point, it seems like you're arguing about the definition of the words "problem with", not about states of the world. Can you clarify what disagreement you have about states of the world?
I don't consider these mistakes to be no problem at all. What I meant to say is that the existence of these noise errors doesn't reduce the reasonableness of my going around and using logical reasoning to draw deductions. It also means that if reality seems to contradict my deductions, then either there is an error within my deductions that I can in principle find, or there is an error within the line of thought that made me doubt my deductions (for example, eyes being inadequate tools for counting pebbles). To put it more generally: if I don't find errors within my deductions, then my perception of reality is not an appropriate measure of the truth of those deductions, unless the deductions themselves deal in some way with the applicability of other deductions to reality, or with reality in general, which mathematics does not.
It's not as if errors in perceiving reality weren't much more numerous and harder to detect than errors in anyone's faculty of doing logical reasoning.
I think the point is that mathematical reasoning is inherently self-correcting in this sense, and that this corrective force is intentionistic and Lamarckian - it is being corrected toward a mathematical argument which one thinks of as a timeless perfect Form (because come on, are there really any mathematicians who don't, secretly, believe in the Platonic realism of mathematics?), and not just away from an argument that's flawed.
An incorrect theory can appear to be supported by experimental results (with probability going to 0 as the sample size goes to infinity), and if you have the finite set of experimental results pointing to the wrong conclusion, then no amount of mind-internal examination of those results can correct the error (if it could, your theory would not be predictive; conservation of probability, you all know that). But mind-internal examination of a mathematical argument, without any further entangling (so no new information, in the Bayesian sense, about the outside world; only new information about the world inside your head), can discover the error, and once it has done so, it is typically a mechanical process to verify that the error is indeed an error and that the correction has indeed corrected that error.
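The "probability going to 0 as the sample size grows" claim can be sketched with a hedged toy calculation (the likelihood ratio and function name are illustrative assumptions, not from the thread): if each independent observation favors the true theory over the wrong one, the odds of the wrong theory surviving shrink geometrically.

```python
# Toy Bayesian model: start at even odds between the true theory and
# a wrong one. Each independent observation favors the true theory by
# a fixed likelihood ratio, so the wrong theory's posterior odds are
# divided by that ratio per observation.

def wrong_theory_odds(n_observations: int, likelihood_ratio: float = 2.0) -> float:
    """Posterior odds for the wrong theory after n observations,
    starting from even (1:1) odds."""
    return 1.0 / likelihood_ratio ** n_observations

print(wrong_theory_odds(10))  # 1/1024: already below a tenth of a percent
```

The wrong theory is never ruled out with certainty by finitely many observations; its probability just decays toward 0, which is exactly the contrast being drawn with mechanically verifiable proof.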
This remains true if the error is an error of omission (We haven't found the proof that T, so we don't know that T, but in fact there is a proof of T).
So you're not getting new bits from observed reality, yet you're making new discoveries and overthrowing past mistakes. The bits are coming from the processing; your ignorance has decreased by computation without the acquisition of bits by entangling with the world. That's why deductive knowledge is categorically different, and why errors in logical reasoning are not a problem with the idea of logical reasoning itself, nor do they exclude a mathematical statement from being unconditionally true. They just exclude the possibility of unconditional knowledge.
Can you conceive of a world in which, say, ⋀∅ is false? It's certainly a lot harder than conceiving of a world in which earplugs obey "2+2=3"-arithmetic, but is your belief that ⋀∅ unconditional? What is the absolutely most fundamentally obvious tautology you can think of, and is your belief in it unconditional? If not, what kind of evidence could there be against it? It seems to me that ¬⋀∅ would require "there exists a false proposition which is an element of the empty set". In order to make an error there, I'd have to have made an error in looking up a definition, in which case I'm not really talking about ⋀∅ when I assert its truth; nonetheless, the thing I am talking about is a tautological truth, and so one still exists. (I may have gained or lost a 'box' here, in which case things don't work.)
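The empty conjunction ⋀∅ being vacuously true has a direct, uncontroversial echo in Python's built-ins, which may make the intuition concrete:

```python
# The empty conjunction is vacuously true: "every proposition in the
# empty set is true" has no counterexample. Python's all() implements
# exactly this convention, and any() its dual.

print(all([]))  # True:  no element of [] is false, so the conjunction holds
print(any([]))  # False: no element of [] is true, so the disjunction fails
```

A falsifier for ⋀∅ would have to be a false proposition that is a member of the empty set, and there is none; that is the sense in which the tautology has no conceivable counterevidence.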
My mind is beginning to melt and I think I've drifted off topic a little. I should go to bed. (Sorry for rambling)