AnlamK comments on Open Thread: November 2009 - Less Wrong

3 [deleted] 02 November 2009 01:18AM


Comment author: SilasBarta 03 November 2009 03:48:12AM 12 points [-]

Just another example of an otherwise-respectable (though not by me) economist spouting nonsense. I thought you guys might find it interesting, and it seemed too short for a top-level post.

Steven Landsburg has a new book out and a blog for it. In a post about arguments for/against God, he says this:

the most complex thing I’m aware of is the system of natural numbers (0,1,2,3, and all the rest of them) together with the laws of arithmetic. That system did not emerge, by gradual degrees, from simpler beginnings.

If you doubt the complexity of the natural numbers, take note that you can use just a small part of them to encode the entire human genome. That makes the natural numbers more complex than human life.

So how many whoppers is that? Let's see: even the maximally compressed encoding of the human genome is insufficient data to describe the workings of human life. The natural numbers and the operations on them are extremely simple, because it takes very little to describe how they work. That complexity is not the same as the complexity of a specific model implemented with the natural numbers.
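
To illustrate the point that "it takes very little to describe how they work": a Peano-style sketch of the natural numbers and arithmetic fits in a few lines. (This is my own illustration, with made-up names, not anything from Landsburg's post.)

```python
# A minimal sketch of Peano-style arithmetic: the naturals and their
# operations are generated from a successor function and two recursion
# rules each. (Illustrative only; it leans on Python's own integers.)

def succ(n):
    """Successor: the single primitive that generates every natural."""
    return n + 1

def add(a, b):
    """Addition from succ: a + 0 = a, a + S(b) = S(a + b)."""
    return a if b == 0 else succ(add(a, b - 1))

def mul(a, b):
    """Multiplication from add: a * 0 = 0, a * S(b) = (a * b) + a."""
    return 0 if b == 0 else add(mul(a, b - 1), a)
```

The rules are short even though the objects they generate are unbounded, which is exactly the distinction between the complexity of the system and the complexity of things encoded with it.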

His description of it as emerging all at once is just confused: yes, people use natural numbers to describe nature, but this is not the same as saying that the modeling usefulness emerged all at once, which is the sense in which he was originally using the term.

What's scary is that he supposedly teaches more math than economics.

Disclosure: Landsburg's wife banned me from econlog.econlib.org a few years ago.

Comment author: AnlamK 05 November 2009 11:19:30PM *  0 points [-]

What is the notion of complexity in question? It could, for instance, be the length of the (hypothetically) shortest program needed to produce a given object, i.e. Kolmogorov complexity.

In that case, the natural numbers would have infinite complexity, which would be much greater than any finite quantity - e.g. a human life.

I may be missing something because the discussion to my eyes seems trivial.

Comment author: RobinZ 05 November 2009 11:31:42PM *  3 points [-]

The complexity doesn't count the amount of data storage required, only the length of the executable code.

n = 1
while n > 0:
    print(n)
    n += 1

looks simple to me.

Comment author: AnlamK 05 November 2009 11:49:03PM *  0 points [-]

Yes, but how are you going to represent 'n' under the hood? You will eventually need infinitely many bits to represent it - I guess this is what you mean by storage. I should confess that I don't know enough about algorithmic information theory, so I may be in deeper waters than I can swim. I think you are right though...

I had something more in mind like the number of bits required to represent an arbitrary natural number n, which is obviously log(n) (or maybe 2 log log(n) with some clever tricks, I think) - and if n can get as big as possible, then the complexity log(n) also gets arbitrarily big.

So maybe the problem of producing every natural number consecutively has a different complexity from producing some arbitrary natural number. Interesting...
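
That growth is easy to check directly: naming a specific n costs about log2(n) bits, and the cost is unbounded, while the program that enumerates all the naturals stays a fixed size. A quick sketch (the helper name is my own):

```python
# Writing down a specific natural n takes ~log2(n) bits (its binary
# numeral), and this grows without bound as n grows - unlike the
# fixed-size program that enumerates every natural in order.

def bits_to_name(n):
    """Bits in the binary numeral for n, i.e. roughly log2(n)."""
    return n.bit_length()

for label, n in [("10", 10), ("10^6, ", 10**6), ("10^100", 10**100)]:
    print(label, "->", bits_to_name(n), "bits")
```

So the complexity of "print this one huge number" can exceed the complexity of "print them all", which is the distinction being groped at here.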

Comment author: RobinZ 06 November 2009 01:28:04AM *  2 points [-]

Someone else should be the one to say this (do we have an information theorist in the house?), but my understanding is that Kolmogorov complexity does not account for memory usage (e.g. by using Turing machines with infinite tape). And thus producing a single specific sufficiently large arbitrary natural number is more complex than producing the entire list - because "sufficiently large" in this case means "longer than the program which produces the entire list".

Comment author: Emile 03 March 2010 03:28:14PM *  3 points [-]

Yup, Kolmogorov complexity is only concerned with the length of the shortest algorithm. There are other measures (more rarely used, it seems) that take into account things like memory used or time (number of steps), though I can't remember their names just now.

Note that usually the complexity of X is the size of the program that outputs X exactly, not the program that outputs a lot of things including X. Otherwise you could write a quite short program that outputs, say, all possible ascii texts, and claim that its size is an upper bound on the complexity of the Bible. Actually, the information needed to generate the Bible is the same as the information needed to locate the Bible among all those texts.

Example in Python:

chars = '0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ' + \
        '"!#$%&\'()*+,-./:;<=>?@[\\]^_`{|}~ \t\n\r'

def iter_strings_of_size(n):
    if n <= 0:
        yield ''
    else:
        for string in iter_strings_of_size(n - 1):
            for char in chars:
                yield string + char

def iter_all_strings():
    n = 0
    while True:
        for string in iter_strings_of_size(n):
            yield string
        n = n + 1

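To see the "locating" point concretely, here is a usage sketch in the same style (the `index_of` helper and the tiny two-letter alphabet are my own additions, standing in for the full character set above): the index of a string in the enumeration grows roughly exponentially in its length, so writing down the index takes about as many bits as writing down the string itself.

```python
# Self-contained variant of the enumeration above, shrunk to a
# two-letter alphabet so the indices stay small and checkable.
chars = 'ab'

def iter_strings_of_size(n):
    if n <= 0:
        yield ''
    else:
        for string in iter_strings_of_size(n - 1):
            for char in chars:
                yield string + char

def iter_all_strings():
    n = 0
    while True:
        for string in iter_strings_of_size(n):
            yield string
        n += 1

def index_of(target):
    """Position of target in the enumeration - the 'location information'."""
    for i, s in enumerate(iter_all_strings()):
        if s == target:
            return i
```

The enumeration runs '', 'a', 'b', 'aa', 'ab', ..., so for example index_of('ab') is 4; for length-k strings the index is on the order of 2^k, i.e. about k bits to specify.
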
Comment author: Liron 17 March 2010 08:14:58AM 1 point [-]

This has improved my understanding of the Python "yield" statement.

Comment author: Emile 17 March 2010 10:14:43AM 1 point [-]

Glad I could be of use !

Understanding the power of yield was a great step forward for me. Afterwards I was horrified to reread some old code that was riddled with horrible classes like DoubleItemIterator and IteratorFilter that my C++-addled brain had cooked up, and realized half my classes were useless and the rest could have their line count divided by ten.

And some people still count "lines of code" as a measure of productivity. Sob.

Comment author: Mitchell_Porter 17 March 2010 11:30:44AM 3 points [-]

And some people still count "lines of code" as a measure of productivity.

So that's why my self-enhancing AI keeps getting bogged down!

Comment author: NancyLebovitz 17 March 2010 03:30:20PM 1 point [-]

Voted up for being really funny.

Comment author: Hook 03 March 2010 04:35:10PM *  1 point [-]

"Actually, the information needed to generate the Bible is the same as the information to locate the Bible in all those texts."

Or to locate it in "The Library of Babel".

Comment author: AnlamK 06 November 2009 09:11:57AM 0 points [-]

I actually took information theory, but this is more of an issue of algorithmic information theory - something I have not studied all that much. Still, I think you are probably right, since Kolmogorov complexity refers to the descriptive complexity of an object, and here you can give a much shorter description of all the consecutive natural numbers.

This is very interesting to me because, intuitively, both are problems involving infinity, and hence I lazily assumed they would have the same complexity.