RichardKennaway comments on Rationality Quotes: February 2010 - Less Wrong
Then what are Dawkins and his opponents "equally wrong" about? What does it mean to say that complexity is "inherent in the laws of nature"? Or that it isn't? What does Landsburg mean by "complexity"? Is arithmetic "complex" because it contains deep truths, or is it "simple" because it can be captured in a small set of axioms?
I have yet to understand what is being claimed here.
RichardKennaway:
Arithmetic is complex because it cannot be captured in a small set of axioms. More precisely, it cannot be specified by any (small or large) set of axioms, because any set of (true) axioms about arithmetic applies equally well to other structures that are not arithmetic. Your favorite set of axioms fails to specify arithmetic in the same way that the statement "bricks are rectangular" fails to specify bricks; there are lots of other things that are also rectangular.
This is not true, for example, of euclidean geometry, which can be specified by a set of axioms.
Silas Barta's remarks notwithstanding, the question of which truths we can know has nothing to do with this; we can never know all the truths of euclidean geometry, but we can still specify euclidean geometry via a set of axioms. Not so for arithmetic.
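The claim that no set of first-order axioms pins down arithmetic can be made precise with the compactness theorem; here is the standard sketch (a textbook argument, not anything specific to this thread):

```latex
% Let T be any first-order theory true of the standard model N.
% Add a fresh constant c and, for each numeral \underline{n}, the axiom
% c > \underline{n}. Every finite subset of the expanded theory is
% satisfiable in N itself (interpret c as a large enough number), so by
% compactness the whole theory has a model:
T \cup \{\, c > \underline{n} \;:\; n \in \mathbb{N} \,\} \text{ has a model } M.
% M satisfies every axiom of T, yet contains an element greater than
% every standard natural number, so M is not isomorphic to N.
```

So whatever first-order axioms you write down, they are also true of these nonstandard structures.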
Here we go again.
Then the universe doesn't use that arithmetic in implementing physics, and it doesn't have the significance you claim it does. Like I said just above, it uses the kind of arithmetic that can be captured in a small set of axioms. And like I said in our many exchanges, it's true that modern computers can't answer every question about the natural numbers, but they don't need to. Neither does the universe.
Yes, but you only need finite space to specify bricks well enough to get the desired functionality of bricks. Your argument would imply that bricks are infinitely complex because we don't have a finite procedure for determining where an arbitrary object "really" is a brick, because of e.g. all the borderline cases. ("Do the stones in a stone wall count as bricks?")
> Then the universe doesn't use that arithmetic in implementing physics,
How do you know?
> Like I said just above, it uses the kind of arithmetic that can be captured in a small set of axioms.
What kind of arithmetic is that? It would have to be a kind of arithmetic to which Godel's and Tarski's theorems don't apply, so it must be very different indeed from any arithmetic I've ever heard of.
Mainly from the computability of the laws of physics.
Right -- meaning the universe doesn't use arithmetic (as you've defined it). You're getting tripped up on the symbol "arithmetic", for which you keep shifting meanings. Just focus on the substance of what you mean by arithmetic: Does the universe need that to work? No, it does not. Do computers need to completely specify that arithmetic to work? No, they do not.
By the way:
1) To quote someone here, use the greater-than symbol before the quoted paragraph, as described in the help link below the entry field for a comment.
2) One should be cautious about modding down someone one is in a direct argument with, as that tends to compromise one's judgment. I have not voted you down, though if I were a bystander to this, I would.
Silas:
First---I have never shifted meanings on the definition of arithmetic. Arithmetic means the standard model of the natural numbers. I believe I've been quite consistent about this.
Second---as I've said many times, I believe that the most plausible candidates for the "fabric of the Universe" are mathematical structures like arithmetic. And as I've said many times, obviously I can't prove this. The best I can do is explain why I find it so plausible, which I've tried to do in my book. If those arguments don't move you, well, so be it. I've never claimed they were definitive.
Third--you seem to think (unless I've misread you) that this vision of the Universe is crucial to my point about Dawkins. It's not.
Fourth---Here is my point about Dawkins; it would be helpful to know which part(s) you consider the locus of our disagreement:
a) the natural numbers---whether or not you buy my vision of them as the basis of reality---are highly complex by any reasonable definition (I am talking here about the actual standard model of the natural numbers, not some axiomatic system that partly describes them);
b) Dawkins has said, repeatedly, that all complexity---not just physical complexity, not just biological complexity, but all complexity---must evolve from something simpler. And indeed, his argument needs this statement in all its generality, because his argument makes no special assumption that would restrict us to physics or biology. It's an argument about the nature of complexity itself.
c) Therefore, if we buy Dawkins's argument, we must conclude that the natural numbers evolved from something simpler.
d) The natural numbers did not evolve from something simpler. Therefore Dawkins's argument can't be right.
It seems to me that the definition of complexity is the root of any disagreement here. It seems obvious to me that the natural numbers are not complex in the sense that a human being is complex. I don't understand what kind of complexity you could be talking about that places natural numbers on an equivalent footing with, say, the entire ecosystem of the planet Earth.
mattnewport: This would seem to put you in the opposite corner from Silas, who thinks (if I read him correctly) that all of physical reality is computably describable, and hence far simpler than arithmetic (in the sense of being describable using only a small and relatively simple fragment of arithmetic).
Be that as it may, I've blogged quite a bit about the nature of the complexity of arithmetic (see an old post called "Non-Simple Arithmetic" on my blog). In brief: a) no set of axioms suffices to specify the standard model of arithmetic (i.e. to distinguish it from other models). And b) we have the subjective reports of mathematicians about the complexity of their subject matter, which I think should be given at least as much weight as the subjective reports of ecologists. (There are a c), d) and e) as well, but in this short comment, I'll rest my case here.)
Your biggest problem here, and in your blog posts, is that you equivocate between the structure of the standard natural numbers (N) and the theory of that structure (T(N), also known as True Arithmetic). The former is recursive and (a reasonable encoding of) it has pretty low Kolmogorov complexity. The latter is wildly nonrecursive and has infinite K-complexity. (See almost any of Chaitin's work on algorithmic information theory, especially the Omega papers, for definitions of the K-complexity of a formal system.)
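The contrast can be made concrete with a short sketch (illustrative code, not from the thread): the *structure* N is a small computable object, while deciding arbitrary sentences of T(N) is provably impossible, which is the sense in which T(N) has infinite K-complexity.

```python
# A complete, computable presentation of the structure N: zero,
# successor, addition, multiplication. A few lines suffice, so a
# reasonable encoding of N has low Kolmogorov complexity.

def zero():
    return 0

def succ(n):
    return n + 1

def add(m, n):
    return m + n

def mul(m, n):
    return m * n

# Any quantifier-free question about specific numbers is settled by
# direct computation, e.g. mul(succ(succ(zero())), 3) evaluates 2 * 3.
#
# By contrast, no program decides arbitrary first-order sentences of
# T(N) (Goedel/Church/Turing): a hypothetical decide(sentence)
# returning True or False for every sentence about N cannot exist.
```

The point is that the short program above *is* the structure, for simulation purposes; the theory of that structure is what blows up.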
The difference between these two structures comes from the process of translating between them. Once explained properly, it's almost intuitive to a recursion theorist, or a computer scientist versed in logic, that there's a computable reduction from any language in the Arithmetic Hierarchy to the language of true statements of True Arithmetic. This implies that going from a description of N to a truth-enumerator or decision procedure for T(N) requires a hypercomputer with an infinite tower of halting, meta-halting, ... meta^n-halting ... oracles.
However, it so happens that simulating the physical world (or rather, our best physical 'theories', which in a mathematical sense are structures, not theories) on a Turing machine does not actually require T(N), only N. We only use theories, as opposed to models, of arithmetic, when we go to actually reason from our description of physics to consequences. And any such reasoning we actually do, just like any pure mathematical reasoning we do, depends only on a finite-complexity fragment of T(N).
Now, how does this make biology more complex than arithmetic? Well, to simulate any biological creature, you need N plus a bunch of biological information, which together has more K-complexity than just N. To REASON about the biological creature, at any particular level of enlightenment, requires some finite fragment of T(N), plus that extra biological information. To enumerate all true statements about the creature (including deeply-alternating quantified statements about its counterfactual behaviour in every possible circumstance), you require the infinite information in T(N), plus, again, that extra biological information. (In the last case it's of course rather problematic to say there's more complexity there, but there's certainly at least as much.)
Note that I didn't know all this until I read your blog argument with Silas and Snorri this morning; I thank all three of you for a discussion that greatly clarified my grasp on the levels of abstraction in play here.
(This morning I would have argued strongly against your Platonism as well; tonight I'm not so sure...)
Splat: Thanks for this; it's enlightening and useful.
The part I'm not convinced of is this:
A squirrel is a finite structure; it can be specified by a sequence of A's, C's, G's and T's, plus some rules for protein synthesis and a finite number of other facts about chemistry. (Or if you think that leaves something out, it can be described by the interactions among a large but finite collection of atoms.) So I don't see where we need all of N to simulate a squirrel.
Again, this word "complexity" is used in many ways. Complexity in the sense of "humans find this complicated" is a different concept from complexity in the sense of Kolmogorov complexity.
Don't worry guys, I didn't let you down. I addressed the issue from the perspective of Kolmogorov complexity in my first blog response. Landsburg initially replied with (I'm paraphrasing), "so what if you became an expert on information theory? That's not the only meaning of complexity."
Only later did he try to claim that he also meets the Kolmogorov definition.
(And FWIW, I'm not an expert on information theory -- it's just a hobby. I guess my knowledge just looked impressive to someone...)
Then what do you mean when you say "integers"^H^H "natural numbers", if no set of premises suffices to talk about it as opposed to something else?
Anyway, no countable set of first-order axioms works. But a finite set of second-order axioms works. So to talk about the natural numbers, it suffices merely to think that when you say "Any predicate that is true of zero, and is true of the successor of every number it is true of, is true of all natural numbers" you made sense when you said "any predicate".
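Written out formally, the induction principle quoted above is a single second-order axiom:

```latex
% Second-order induction: the quantifier \forall P ranges over all
% predicates (equivalently, all subsets of the domain).
\forall P\,\Bigl[\bigl(P(0) \wedge \forall n\,(P(n) \rightarrow P(S(n)))\bigr) \rightarrow \forall n\,P(n)\Bigr]
```

Together with the axioms that 0 is not a successor and that S is injective, this is categorical under full second-order semantics: any two models are isomorphic (Dedekind's theorem), which is the sense in which it singles out the natural numbers.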
It is this sort of minor-seeming yet important technical inaccuracy that separates "The Big Questions" from "Good and Real", I'm afraid.
Natural numbers, rather. (Minor typo.)
I think that you have to be careful about claims that second-order logic fixes a unique model. Granted, you can derive the statement "There exists a unique model of the axioms of arithmetic."
But, for example, what in reality does your "any predicate" quantifier range over? If, under interpretation, it ranges over subsets of the domain of discourse, well, what exactly constitutes a subset? This presumes that you have a model of some set theory in hand. How do you specify which model of set theory you're using? So far as I know, there's no way out of this regress.
[ETA: I'm not a logician. I'm definitely open to correction here.]
[ETA2: And now that I read more carefully, you were acknowledging this point when you wrote, "it suffices merely to think that . . . you made sense when you said 'any predicate'."
However, you didn't acknowledge this issue in your earlier comment. I think that it's too significant an issue to be dismissed with an "it suffices merely...". When an infinite regress threatens, it doesn't suffice to push the issue back a level and say "it suffices merely to show that that's the last level."]
No, it wouldn't -- he's saying basically the same thing I did. The laws of physics are computable. In describing observations, we use concepts from math. The reason we do so is that it allows simpler descriptions of the universe.
I think the system of natural numbers is pretty damn complex. But the system of natural numbers is an abstract object, and Dawkins likely either never meant for his argument to apply to abstract objects, thinks all abstract objects are constructed by intelligences, or denies the existence of abstract objects.
I think there is a good chance all abstract objects are constructed, and a better chance that the system of natural numbers was constructed (or at least the system, when construed as an object and not a structural analog, is constructed and not discovered. That is, numbers are more like adjectives than nouns, and adjectives aren't objects.)
Contrary to what SteveLandsburg says in his reply, I think you are exactly right. And this is how our disagreement originally started, by me explaining why he's wrong about complexity.
Scientists use math to compress our description of the universe. It wouldn't make much sense to use something infinitely complex for data compression!
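The compression point can be illustrated with a toy sketch (zlib as a stand-in for any general-purpose compressor; the specific numbers are illustrative, not claims about physics):

```python
import os
import zlib

# Data generated by a short mathematical rule (here, a periodic
# residue sequence) is highly compressible: the rule is the
# compression, so the compressed form is far shorter than the data.
lawful = "".join(str(n * n % 7) for n in range(10_000)).encode()
compressed_lawful = zlib.compress(lawful)

# Random data carries no exploitable regularity, so a general-purpose
# compressor cannot shrink it; zlib's output is roughly input-sized.
noise = os.urandom(10_000)
compressed_noise = zlib.compress(noise)

print(len(compressed_lawful), len(compressed_noise))
```

A finitely-describable rule compresses; an infinitely complex object, by definition, admits no such short description.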
So, to the extent he's talking about math or arithmetic in a way that does have such complexity, he's talking about something that isn't particularly relevant to our universe.
Right, I've explained before why your arguments are in error. We can talk more about that some other time.
No, I accept that they're separate errors.
Okay:
If what you describe here is what you mean by both "the natural numbers" and "the actual standard model of the natural numbers", then I will accept this definition for the purposes of argument, but I will argue that, using it consistently, it doesn't have the properties you claim.
Disagree with this. Dawkins has been referring to existing complexity in the universe and the context of every related statement confirms this. But even accepting it, the rest of your argument still doesn't follow.
Disagree. Again, let's keep the same definition throughout. Recall what you said the natural numbers were:
The model arose from something simpler (like basic human cognition of counting of objects). The Map Is Not The Territory.
Ah, but now I know what you're going to say: you meant the sort of Platonic-space model of those natural numbers, that exists independently of whatever's in our universe, has always been complex.
So, if you assume (like theists) that there's some sort of really-existing realm, outside of the universe, that always has been, and is complex, then you can prove that ... there's a complexity that has always existed. Which is circular.
Silas: I agree that if arithmetic is a human invention, then my counterexample goes away.
If I've read you correctly, you believe that arithmetic is a human invention, and therefore reject the counterexample.
On that reading, a key locus of our disagreement is whether arithmetic is a human invention. I think the answer is clearly no, for reasons I've written about so extensively that I'd rather not rehash them here.
I'm not sure, though, that I've read you correctly, because you occasionally say things like "The Map Is Not The Territory" which seems to presuppose some sort of platonic Territory. But maybe I just don't understand what you meant by this phrase.
[Incidentally, it occurs to me that perhaps you are misreading my use of the word "model". I am using this word in the technical sense that it's used by logicians, not in any of its everyday senses.]
Then you agree that your "counterexample" amounts to an assumption. If a Platonic realm exists (in some appropriate sense), and if Dawkins was haphazardly including that sense in the universe he is talking about when he describes complexity arising, then he is wrong that complexity always comes from simplicity.
If you assume Dawkins is wrong, he's wrong. Was that supposed to be insightful?
It's a false dispute, though. When you clarify the substance of what these terms mean, there are meanings for which we agree, and meanings for which we don't. The only error is to refuse to "cash out" the meaning of "arithmetic" into well-defined predictions, but instead keep it boxed up into one ambiguous term, which you do here, and which you did for complexity. (And it's kind of strange to speak for hundreds of pages about complexity, and then claim insights on it, without stating your definition anywhere.)
One way we'd agree, for example, is if we take your statements about the Platonic realm to be counterfactual claims about phenomena isomorphic to certain mathematical formalisms (as I said at the beginning of the thread).
The definitions aren't incredibly different, which is why we have the same term for both of them. If you spell out that definition more explicitly, the same problems arise, or different ones will pop up.
(By the way, this doesn't surprise me. This is the fourth time you've had to define a term within a definition you gave in order to avoid being wrong. It doesn't mean you changed that "subdefinition". But genuine insights about the world don't look this contorted, where you have to keep saying, "No, I really meant this when I was saying what I meant by that.")
Silas: This is really quite frustrating. I keep telling you exactly what I mean by arithmetic (the standard model of the natural numbers); I keep using the word to mean this and only this, and you keep claiming that my use of the word is either ambiguous or inconsistent. It makes it hard to imagine that you're actually reading before you're responding, and it makes it very difficult to carry on a dialogue. So for that reason, I think I'll stop here.
That doesn't sound right. Can you point me to, for example, a Wikipedia page about this?
First-order logic can't distinguish between different sizes of infinity. Any finite or countable set of first-order statements with an infinite model has models of every infinite cardinality (the Löwenheim–Skolem theorems).
However, if you take second-order logic at face value, it's actually quite easy to uniquely specify the integers up to isomorphism. The price of this is that second-order logic is not complete - the full set of semantic implications, the theorems which follow, can't be derived by any finite set of syntactic rules.
So if you can use second-order statements - and if you can't, it's not clear how we can possibly talk about the integers - then the structure of integers, the subject matter of integers, can be compactly singled out by a small set of finite axioms. However, the implications of these axioms cannot all be printed out by any finite Turing machine.
Appropriately defined, you could state this as "finitely complex premises can yield infinitely complex conclusions" provided that the finite complexity of the premises is measured by the size of the Turing machine which prints out the axioms, yielding is defined as semantic implication (that which is true in all models of which the axioms are true), and the infinite complexity of the conclusions is defined by the nonexistence of any finite Turing machine which prints them all.
However this is not at all the sort of thing that Dawkins is talking about when he talks about evolution starting simple and yielding complexity. That's a different sense of complexity and a different sense of yielding.
That makes more sense, thanks.
Any recommended reading on this sort of thing?
Decidability of Euclidean geometry.
I don't know where Landsburg gets the claim that we can know all the truths of arithmetic.
Richard Kennaway:
> I don't know where Landsburg gets the claim that we can know all the truths of arithmetic.
I don't know where you got the idea that I'd ever make such a silly claim.
I misinterpreted this: "we can never know all the truths of euclidean geometry, but we can still specify euclidean geometry via a set of axioms. Not so for arithmetic."
Richard: Gotcha. Sorry if it was unclear which part the "not so" referred to.
Note that Landsburg is thus also incorrect in saying "we can never know all the truths of euclidean geometry".
Eliezer: There are an infinite number of truths of euclidean geometry. How could our finite brains know them all?
This was not meant to be a profound observation; it was meant to correct Silas, who seemed to think that I was reading some deep significance into our inability to know all the truths of arithmetic. My point was that there are lots of things we can't know all the truths about, and this was therefore not the feature of arithmetic I was pointing to.
A decision procedure is a finite specification of all truths of euclidean geometry; I can use that finite fact anywhere I could use any truth of geometry. I suppose there is a difference, but even so, it's the wrong thing to say in a Godelian discussion.
Yes, it was. When I and several others pointed out that arithmetic isn't actually complex, you responded by saying that it is infinitely complex, because it can't be finitely described, because to do so ... you'd have to know all the truths.
Am I misreading that response? If so, how do you reconcile arithmetic's infinite complexity with the fact that scientists in fact use it to compress descriptions of the world? An infinitely complex entity can't help to compress your descriptions.
What is this "it"? There are some who claim that when we think about arithmetic, we are thinking about a specific model of the usual axioms for arithmetic, which appears to be your view here. Every statement of arithmetic is either true or false in that model. But what reason is there to make this claim? We cannot directly intuit the truth of arithmetical statements, or mathematicians would not have to spend so much effort on proving theorems. We may observe that we have a belief that we are indeed thinking about a definite model of the axioms, but why should we believe that belief?
To say that we intuit a thing is no more than to say we believe it but do not know why.