Followup to: What's a "natural number"?
While thinking about how to make machines understand the concept of "integers", I accidentally derived a tiny little math result that I haven't seen before. Not sure if it'll be helpful to anyone, but here goes:
You're allowed to invent an arbitrary scheme for encoding integers as strings of bits. Whatever encoding you invent, I can give you an infinite input stream of bits that will make your decoder hang and never give a definite answer like "yes, this is an integer with such-and-such value" or "no, this isn't a valid encoding of any integer".
To clarify, let's work through an example. Consider a unary encoding: 0 is 0, 1 is 10, 2 is 110, 3 is 1110, etc. In this case, if we feed the decoder an infinite sequence of 1's, it will remain forever undecided as to the integer's value. The result says we can find such pathological inputs for any other encoding system, not just unary.
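To make this concrete, here is a minimal sketch of such a decoder in Python (the function name and the iterator-based stream representation are my own, just for illustration). It reads bits one at a time; on the all-ones stream it never returns, because no finite prefix of 111... pins down a value.

```python
from itertools import repeat

def decode_unary(bits):
    """Decode the unary code 0 -> 0, 10 -> 1, 110 -> 2, ...
    Reads bits from an iterator until it sees the terminating 0."""
    n = 0
    for b in bits:
        if b == 0:
            return n      # terminating 0 seen: the value is decided
        n += 1            # each leading 1 bumps the value by one
    raise ValueError("stream ended before a terminating 0")

print(decode_unary(iter([1, 1, 1, 0])))   # -> 3
# decode_unary(repeat(1)) would loop forever: the decoder stays
# undecided on the all-ones stream, never answering yes or no.
```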
The proof is obvious. (If it isn't obvious to you, work it out!) But it seems to strike at the heart of why we can't naively explain to computers what a "standard integer" is, what a "terminating computation" is, etc. Namely, if you try to define an integer in terms of some observable interface (get first bit, get last bit, get CRC, etc.), then you inevitably invite some "nonstandard integers" into your system.
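Here is one way to picture those "nonstandard integers" sneaking in through the interface. This is a hypothetical sketch; the class and method names (StreamInteger, get_bit) are invented for illustration, not anything from the post. Anything that answers the interface's queries is accepted, including an object whose answers are consistent with no finite value.

```python
class StreamInteger:
    """An "integer" defined only by an observable interface:
    you may query its i-th bit under the unary encoding."""
    def __init__(self, bit_at):
        self.bit_at = bit_at          # function: position -> bit

    def get_bit(self, i):
        return self.bit_at(i)

# A standard integer, e.g. 3, whose unary encoding is 1110 (then 0s):
three = StreamInteger(lambda i: 1 if i < 3 else 0)

# A "nonstandard" inhabitant of the same interface: every bit is 1.
# It answers every query the interface allows, yet its answers are
# consistent with no standard integer at all.
omega = StreamInteger(lambda i: 1)

print([three.get_bit(i) for i in range(6)])   # [1, 1, 1, 0, 0, 0]
print([omega.get_bit(i) for i in range(6)])   # [1, 1, 1, 1, 1, 1]
```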
This idea must already be well-known and have some standard name; any pointers would be welcome!
Gödel was working under a philosophical system that assumed everything could be deconstructed into first-order logic. He disagreed, and essentially wrote "this statement is false" as a proof that it cannot be done. However, he also points out in his proof that there are infinitely many such statements and that they do not have to be self-referential. Nor was he exploiting an interesting property of the integers themselves; he was exploiting an interesting property of the system that creates the integers. This decoder can tell whether something is an integer and tell one what integer it is, so it is able to evaluate everything needed for Gödel's proof, and therefore it must have not just one but infinitely many cases on which it hangs.
You're wrong. You're arguing from surface similarity rather than detailed internal workings, which is a big no-no in maths. I could have spent about five paragraphs breaking down your argument point by point, but you've made my life easy with that final line:
This is not the case; there is a simple counter-example. Unary code, in which 0 is 0, 1 is 10, 2 is 110, 3 is 1110, 15 is 1111111111111110, hangs on the string 111111... and on no other. The fact that your argument led to this conclusion is a demonstration of how completely wrong it is.