Followup to: What's a "natural number"?
While thinking about how to make machines understand the concept of "integers", I accidentally derived a tiny little math result that I haven't seen before. Not sure if it'll be helpful to anyone, but here goes:
You're allowed to invent an arbitrary scheme for encoding integers as strings of bits. Whatever encoding you invent, I can give you an infinite input stream of bits that will make your decoder hang and never give a definite answer like "yes, this is an integer with such-and-such value" or "no, this isn't a valid encoding of any integer".
To clarify, let's work through an example. Consider a unary encoding: 0 is 0, 1 is 10, 2 is 110, 3 is 1110, etc. In this case, if we feed the decoder an infinite sequence of 1's, it will remain forever undecided as to the integer's value. The result says that such pathological inputs exist for every encoding scheme, not just the unary one.
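To make the example concrete, here's a minimal Python sketch (the function name and stream construction are my own, just for illustration) of the unary decoder above. Fed an endless run of 1's, the loop never reaches the terminating 0, so it never decides:

```python
from itertools import chain, repeat

def decode_unary(bits):
    """Unary scheme from the example: n is encoded as n ones followed by a zero.
    Returns n once the terminating zero arrives; on an endless run of ones,
    it simply never returns."""
    n = 0
    for bit in bits:
        if bit == 1:
            n += 1
        else:
            return n
    # An infinite stream never exits the loop, so we never reach this point.

# Terminates: the prefix 1110 decodes to 3.
print(decode_unary(chain([1, 1, 1, 0], repeat(0))))   # -> 3

# Stays forever undecided: (don't actually run this line)
# decode_unary(repeat(1))
```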
The proof is obvious. (If it isn't obvious to you, work it out!) But it seems to strike at the heart of why we can't naively explain to computers what a "standard integer" is, what a "terminating computation" is, etc. Namely, if you try to define an integer as some observable interface (get first bit, get last bit, get CRC, etc.), then you inevitably invite some "nonstandard integers" into your system.
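Here's a toy sketch of that last point, again in Python with hypothetical class names of my own. Any definition that only probes the "observable interface" finitely many times is satisfied just as well by an object that answers every probe but corresponds to no standard integer:

```python
class FiniteBitString:
    """A standard integer presented through an observable interface:
    you may ask for the bit at any position."""
    def __init__(self, n):
        self.n = n
    def get_bit(self, i):
        return (self.n >> i) & 1

class AllOnes:
    """Answers every query the interface demands, yet encodes no
    standard integer: every bit is 1, forever."""
    def get_bit(self, i):
        return 1

def looks_like_an_integer(x):
    # Any finite battery of interface probes is passed by both objects.
    return all(x.get_bit(i) in (0, 1) for i in range(64))

print(looks_like_an_integer(FiniteBitString(42)))  # True
print(looks_like_an_integer(AllOnes()))            # True -- a "nonstandard" guest
```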
This idea must already be well-known and have some standard name; any pointers would be welcome!
You've thought more about explaining things to machines than I have, but I'm not sure what you consider to count as having successfully explained a concept to a machine. For example, if you tell the machine that any string with both a beginning and an end is finite, then you've given the machine at least some sort of explanation of finiteness - one which I presume you consider to be inadequate.

But when I try to imagine what you might find to be inadequate about it, my naive guess is that you're thinking something like this: "even if you tell a machine that a finite string has an end, that won't help the machine to decide whether a given input has reached its end". In other words, the definition doesn't give the machine the ability to identify infinite strings as such, or to identify finite strings as such in a time shorter than the length of the string itself.

But my response to this is that our own human understanding of finiteness doesn't give us that ability either. Just for starters, we don't know whether the universe goes on forever in all directions or closes in on itself as a four-dimensional ball - and we might never know.
I want it to be able to do some of the things that humans can do, e.g. arrive at the conclusion that the Paris-Harrington theorem is "true" even though it's independent of PA. Humans manage that by having a vague, unformalized concept of "standard integers" over which it is true, and then inventing axioms that fit their intuition. So I'm kicking around different ways to make the "standard integers" clearer.