zaph comments on Open Thread: November 2009 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (539)
Just another example of an otherwise-respectable (though not by me) economist spouting nonsense. I thought you guys might find it interesting, and it seemed too short for a top-level post.
Steven Landsburg has a new book out and a blog for it. In a post about arguments for/against God, he says this:
So how many whoppers is that? Let's see: the maximally compressed encoding of the human genome is not enough data to describe the workings of human life. The natural numbers and the operations on them are extremely simple, because it takes very little to describe how they work. That low descriptive complexity is not the same as the complexity of a specific model implemented with the natural numbers.
His description of it as emerging all at once is just confused: yes, people use natural numbers to describe nature, but this is not the same as saying that the modeling usefulness emerged all at once, which is the sense in which he was originally using the term.
What's scary is he supposedly teaches more math than economics.
Disclosure: Landsburg's wife banned me from econlog.econlib.org a few years ago.
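For what it's worth, the "very little to describe how they work" point can be made concrete in a few lines of code. This is just my own sketch (the function names are mine, not anything from Landsburg's post): the naturals and their basic operations fall out of a single successor primitive plus two short recursive rules, which is exactly why their descriptive complexity is so low.

```python
def succ(n):
    """Successor: the one primitive that generates every natural number."""
    return n + 1

def add(a, b):
    """Addition defined only in terms of successor (Peano-style recursion)."""
    return a if b == 0 else add(succ(a), b - 1)

def mul(a, b):
    """Multiplication defined only in terms of addition."""
    return 0 if b == 0 else add(a, mul(a, b - 1))

print(add(2, 3))  # 5
print(mul(4, 6))  # 24
```

Contrast that with a genome: no comparably tiny set of rules regenerates it, which is the sense in which the specific data is the complex part.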
I'm probably exposing my ignorance here, but didn't zero have a historical evolution, so to speak? I'm going off vague memories of past reading and a current quick glance at Wikipedia, but it seems like there were separate developments of the placeholder, the concept of nothing, and the use of a symbol, which all eventually converged into the current zero. Seems like the evolution of a number to me. And it may be a just-so story, but I see it as eminently plausible that humans primarily work in base 10 because, for the most part, we have 10 digits, which again would be dictated by the evolutionary process.
On his human life point: if DNA encoding encompasses all of complex numbers (given that it needs that system in order to be described), isn't it then necessarily <i>more</i> complex, since it requires all of complex numbers plus its own set of rules and knowledge base as well?
The ban was probably for the best, Silas; you were probably confusing everyone with the facts.
It sounds like a true story (note etymology of the word "digit"). But lots of human cultures used other bases (some of them still exist). Wikipedia lists examples of bases 4, 5, 8, 12, 15, 20, 24, 27, 32 and 60. Many of these have a long history and are (or were) fully integrated into their originating language and culture. So the claim that "humans work in base 10 because we have 10 digits" is rather too broad - it's at least partly a historical accident that base 10 came to be used by European cultures which later conquered most of the world.
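To make the "any base works" point concrete, here's a quick sketch (the helper name is my own, not from the thread): the same quantity renders as digits in any of the bases listed above, so nothing mathematically privileges 10.

```python
def to_base(n, base):
    """Render a non-negative integer as a list of digits in `base`."""
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        digits.append(n % base)  # least significant digit first
        n //= base
    return digits[::-1]  # reverse to most-significant-first order

# 1066 written in some of the bases Wikipedia mentions
for base in (4, 5, 8, 12, 20, 60):
    print(base, to_base(1066, base))
```

Which base a culture lands on is a fact about the culture (fingers, counting in dozens, Babylonian astronomy), not about the numbers.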
That's a good point, Dan. I guess we'd have to check how many base-10 systems there were versus systems overall. Though I would continue to see that as again demonstrating an evolution of complex number theory, as multiple strands joined together when systems interacted with one another. There were probably plenty of historical accidents at work, like you mention, to help bring about the current system of natural numbers.
Your recollection is correct: the understanding of math developed gradually. My criticism of Landsburg was mainly that he's not even using a consistent definition of math.
And as you note, under reasonable definitions of math, it did develop gradually.
Yes, exactly. That's why human life is more complex than the string representing the genome: you also have to know what that (compressed) genome specification refers to, the chemical interactions involved, etc.
:-)
Why does DNA encoding need complex numbers? I'm pretty sure simple integers are enough... Maybe you meant the "complexity of natural numbers" as quoted?
Sounds good to me (that's what I get for typing quickly at work).