If you're going to write a book hundreds of pages long in which you crucially rely on the concept of complexity, you need to explicitly define it. That's just how it works. If you know what concept of complexity is "the" right one here, you need to spell it out yourself.
Well, Silas, what I actually did was write a book 255 pages long of which this whole Dawkins/complexity thing occupies about five pages (29-34) and where complexity is touched on exactly once more, in a brief passage on pages 7-8. From the discrepancy between your description and reality, I infer that you haven't read the book, which would help to explain why your comments are so bizarrely misdirected.
Oh, and I see that you're still going on about axiomatic descriptions of squirrels, as if that were relevant to something I'd said. (Hint: A simulation is not an axiomatic system. That's 48 bajillion and one.)
Splat:
1)
The problem you encounter here is that these substructures and near-substructures, once they reach a certain size, actually require more information to specify than N itself.
This depends on what you mean by "specify". To distinguish N from other mathematical structures requires either an infinite (indeed non-recursive) amount of information or a second-order specification including some phrase like "all predicates". Are you referring to the latter? Or to something else I don't know about? (I've sketched what I mean by the second-order version after point 5 below.)
2) I do not know Chaitin's definition of the K-complexity of a structure. I'll try tracking it down, though if it's easy for you to post a quick definition, I'll be grateful. (I do think I know how to define the K-complexity of a theory, and I've hazarded a guess at what you might mean after point 5 below.) I presume that if I knew this, I'd know your answer to question 1).
3) Whatever the definition, the question remains whether K-complexity is the right concept here. Dawkins's argument does not define complexity; he treats it as "we know it when we see it". My assertion has been that Dawkins's argument applies in a context where it leads to an incorrect conclusion, and therefore can't be right. To make this argument, I need to use Dawkins's intended notion of complexity, which might not be the same as Chaitin's or Kolmogorov's. And for this, the best I can do is to infer from context what Dawkins does and does not see as complex. (It is clear from context that he sees complexity as a general phenomenon, not just a biological one.)
4) The natural numbers are certainly an extremely complex structure in the everyday sense of the word; after thousands of years of study, people are learning new and surprising things about them every day, and there is no expectation that we've even scratched the surface. This is, of course, a manifestation of the "wildly nonrecursive" nature of T(N), all of which is reflected in N itself. And this, again, seems pretty close to the way Dawkins uses the word.
5) I continue to be most grateful for your input. I see that Silas is back to insisting that you can't simulate a squirrel with a simple list of axioms, after having been told forty-eight bajillion times (here and elsewhere) that nobody's asserting any such thing; my claim is that you can simulate a squirrel in the structure N, not in any particular axiomatic system. Whether or not you agree, it's a pleasure to engage with someone who's not obsessed with pummelling straw men.
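To put something concrete on the table for points 1) and 2) (this is only my reading of the standard definitions, so correct me if it's not what you intend): the second-order specification I have in mind is the induction axiom quantified over all predicates,

    \forall P \, [ \, P(0) \wedge \forall n \, (P(n) \rightarrow P(n+1)) \;\rightarrow\; \forall n \, P(n) \, ],

which, together with the usual axioms for zero and successor, pins down N up to isomorphism, whereas no first-order set of axioms can do this. And the usual definition for a finite string x, relative to a fixed universal machine U, is

    K(x) = \min \{ \, |p| : U(p) = x \, \}

(Chaitin's version restricts to prefix-free programs); my guess is that the K-complexity of a structure like N would then be the length of the shortest program that enumerates some standard presentation of the structure, but I'd still rather see Chaitin's actual definition.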
Splat: Thanks for this; it's enlightening and useful.
The part I'm not convinced of is this:
to simulate any biological creature, you need N plus a bunch of biological information
A squirrel is a finite structure; it can be specified by a sequence of A's, C's, G's and T's, plus some rules for protein synthesis and a finite number of other facts about chemistry. (Or if you think that leaves something out, it can be described by the interactions among a large but finite collection of atoms.) So I don't see where we need all of N to simulate a squirrel.
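To make the finiteness point concrete, here is a toy sketch (the data are invented placeholders, and nothing hangs on the details). The only point is that the whole specification serializes to a single finite string, so its description length is finite; nothing like the full structure N appears anywhere in it.

    # Toy sketch only: a wildly simplified "finite specification" of an organism.
    # All data below are invented placeholders.
    import json

    squirrel_spec = {
        "genome": "ACGTGATTACA",                    # a finite string over A, C, G, T
        "protein_synthesis": {"ACG": "protein_1"},  # a finite codon-to-protein table
        "other_chemistry": ["fact_1", "fact_2"],    # finitely many further facts
    }

    # The whole specification is one finite string, so its description length
    # (and hence its descriptive complexity, up to a constant) is finite.
    print(len(json.dumps(squirrel_spec)))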
Bo102010: Thanks for the kind words. I'm not sure what the community standards are here, but I hope it's not inappropriate to mention that I post to my own blog almost every weekday, and of course I'll be glad to have you visit.
The only error is to refuse to "cash out" the meaning of "arithmetic" into well-defined predictions, but instead keep it boxed up into one ambiguous term,
Silas: This is really quite frustrating. I keep telling you exactly what I mean by arithmetic (the standard model of the natural numbers); I keep using the word to mean this and only this, and you keep claiming that my use of the word is either ambiguous or inconsistent. It makes it hard to imagine that you're actually reading before you're responding, and it makes it very difficult to carry on a dialogue. So for that reason, I think I'll stop here.
Eliezer: There are an infinite number of truths of Euclidean geometry. How could our finite brains know them all?
This was not meant to be a profound observation; it was meant to correct Silas, who seemed to think that I was reading some deep significance into our inability to know all the truths of arithmetic. My point was that there are lots of things we can't know all the truths about, and this was therefore not the feature of arithmetic I was pointing to.
Silas: I agree that if arithmetic is a human invention, then my counterexample goes away.
If I've read you correctly, you believe that arithmetic is a human invention, and therefore reject the counterexample.
On that reading, a key locus of our disagreement is whether arithmetic is a human invention. I think the answer is clearly no, for reasons I've written about so extensively that I'd rather not rehash them here.
I'm not sure, though, that I've read you correctly, because you occasionally say things like "The Map Is Not The Territory", which seems to presuppose some sort of Platonic Territory. But maybe I just don't understand what you meant by this phrase.
[Incidentally, it occurs to me that perhaps you are misreading my use of the word "model". I am using this word in the technical sense that it's used by logicians, not in any of its everyday senses.]
mattnewport: This would seem to put you in the opposite corner from Silas, who thinks (if I read him correctly) that all of physical reality is computably describable, and hence far simpler than arithmetic (in the sense of being describable using only a small and relatively simple fragment of arithmetic).
Be that as it may, I've blogged quite a bit about the nature of the complexity of arithmetic (see an old post called "Non-Simple Arithmetic" on my blog). In brief: a) no set of axioms suffices to specify the standard model of arithmetic (i.e. to distinguish it from other models). And b) we have the subjective reports of mathematicians about the complexity of their subject matter, which I think should be given at least as much weight as the subjective reports of ecologists. (There are also a c), d), and e), but in this short comment, I'll rest my case here.)
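For anyone who wants the reason behind a), it's the standard compactness argument: add a new constant symbol c to the language of arithmetic, along with the infinitely many axioms

    c > 0, \quad c > 1, \quad c > 2, \ \ldots

Every finite subset of these (together with any axioms true in N that you like) is satisfied in N itself, so by compactness the whole set has a model. That model contains an element greater than every standard number, so it is not isomorphic to N, even though it satisfies every first-order axiom that N does.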
Silas:
First---I have never shifted meanings on the definition of arithmetic. Arithmetic means the standard model of the natural numbers. I believe I've been quite consistent about this.
Second---as I've said many times, I believe that the most plausible candidates for the "fabric of the Universe" are mathematical structures like arithmetic. And as I've said many times, obviously I can't prove this. The best I can do is explain why I find it so plausible, which I've tried to do in my book. If those arguments don't move you, well, so be it. I've never claimed they were definitive.
Third---you seem to think (unless I've misread you) that this vision of the Universe is crucial to my point about Dawkins. It's not.
Fourth---Here is my point about Dawkins; it would be helpful to know which part(s) you consider the locus of our disagreement:
a) the natural numbers---whether or not you buy my vision of them as the basis of reality---are highly complex by any reasonable definition (I am talking here about the actual standard model of the natural numbers, not some axiomatic system that partly describes them);
b) Dawkins has said, repeatedly, that all complexity---not just physical complexity, not just biological complexity, but all complexity---must evolve from something simpler. And indeed, his argument needs this statement in all its generality, because his argument makes no special assumption that would restrict us to physics or biology. It's an argument about the nature of complexity itself.
c) Therefore, if we buy Dawkins's argument, we must conclude that the natural numbers evolved from something simpler.
d) The natural numbers did not evolve from something simpler. Therefore Dawkins's argument can't be right.
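If it helps to locate the disagreement, the bare logical skeleton of a) through d) is just a universal claim plus a counterexample (this is only a schematic rendering; nothing turns on the formalism):

    (b)  \forall x \, [\mathrm{Complex}(x) \rightarrow \mathrm{EvolvedFromSimpler}(x)]
    (a)  \mathrm{Complex}(\mathbb{N})
    (c)  \mathrm{EvolvedFromSimpler}(\mathbb{N})         from (a) and (b)
    (d)  \neg \mathrm{EvolvedFromSimpler}(\mathbb{N})    so (b) cannot hold in full generality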
Splat:
Thanks again for bringing insight and sanity to this discussion. A few points:
1) Your description of the structure N presupposes some knowledge of the structure N; the program that prints out the structure needs a first statement, a second statement, etc. (a minimal sketch of such a program appears after point 6 below). This is, of course, unavoidable, and it's therefore not a complaint; I doubt that there's any way to give a formal description of the natural numbers without presupposing some informal understanding of the natural numbers. But what it does mean, I think, is that K-complexity (in the sense that you're using it) is surely the wrong measure of complexity here---because when you say that N has low K-complexity, what you're really saying is that "N is easy to describe provided you already know something about N". What we really want to know is how much complexity is embedded in that prior knowledge.
1A) On the other hand, I'm not clear on how much of the structure of N is necessarily assumed in any formal description, so my point 1) might be weaker than I've made it out to be.
2) It has been my position all along that K-complexity is largely a red herring here in the sense that it need not capture Dawkins's meaning. Your observation that a pot of boiling water is more K-complex than a squirrel speaks directly to this point, and I will probably steal it for use in future discussions.
3) When you talk about T(N), I presume you mean the language of Peano arithmetic, together with the set of all true statements in that language. (Correct me if I'm wrong.) I would hesitate to call this a theory, because it's not recursively axiomatizable, but that's a quibble. In any event, we do know what we mean by T(N), but we don't know what we mean by T(squirrel) until we specify a language for talking about squirrels---a set of constant symbols corresponding to tail, head, etc., or one for each atom, or..., and various relations, etc. So T(N) is well defined, while T(squirrel) is not. But whatever language you settle on, a squirrel is still going to be a finite structure, so T(squirrel) is not going to share the "wild nonrecursiveness" of T(N) (which is closely related to the difficulty of giving an extrinsic characterization). That seems to me to capture a large part of the intuition that the natural numbers are more complex than a squirrel.
4) You are probably right that Dawkins wasn't thinking about mathematical structures when he made his argument. But because he does claim that his argument applies to complexity in general, not just to specific instances, he's stuck (I think) either accepting applications he hadn't thought about or backing off the generality of his claim. It's of course hard to know exactly what he meant by complexity, but it's hard for me to imagine any possible meaning consistent with Dawkins's usage that doesn't make arithmetic (literally) infinitely more complex than a squirrel.
5) Thanks for trying to explain to Silas that he doesn't understand the difference between a structure and an axiomatic system. I've tried explaining it to him in many ways, at many times, in many forums, but have failed to make any headway. Maybe you'll have better luck.
6) If any of this seems wrong to you, I'll be glad to be set straight.
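Postscript to point 1): for concreteness, here is my guess at the sort of "program that prints out the structure" you have in mind. It is certainly short, which is the sense in which N has low K-complexity; my point is just that the loop already presupposes the successor operation and the order of the statements it emits.

    # A sketch (my guess only) of the short program that "prints out N":
    # it emits the numerals 0, 1, 2, ... forever.
    def enumerate_N():
        n = 0
        while True:
            print(n)   # the "next statement" of the description
            n += 1     # successor is already built into the language we use

    # enumerate_N()  # never halts, by design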