This thread is for the discussion of Less Wrong topics that have not appeared in recent posts. Feel free to rid yourself of cached thoughts by doing so in Old Church Slavonic. If a discussion gets unwieldy, celebrate by turning it into a top-level post.
If you're new to Less Wrong, check out this welcome post.
I'm probably exposing my ignorance here, but didn't zero have a historical evolution, so to speak? I'm going off vague memories of past reading and a current quick glance at Wikipedia, but it seems like there were separate developments of using placeholders, the concept of nothing, and the use of a symbol, which all eventually converged onto the current zero. Seems like the evolution of a number to me. And it may be a just-so story, but I see it as eminently plausible that humans primarily work in base 10 because, for the most part, we have 10 digits, which again would be dictated by the evolutionary process.
On his human-life point: if DNA encoding encompasses all of complex numbers (being that it needs that system in order to be described), isn't it then necessarily more complex, since it requires all of complex numbers plus its own set of rules and knowledge base as well?
The ban was probably for the best, Silas; you were probably confusing everyone with the facts.
Why does DNA encoding need complex numbers? I'm pretty sure simple integers are enough. Maybe you meant the "complexity of natural numbers," as quoted?