Those are statements that fall under what Godel proved, that is, they are statements that are unprovable in ZF. So even though his statement doesn't include self-reference, it can still fall under Godel's proof if his decoder is strong enough to determine what is and is not an integer. Self-reference has nothing to do with it at all.
The existence of specific undecidable statements in ZF or ZF - AC is a different sort of result from what Godel showed. That, for example, the continuum hypothesis is undecidable in ZFC is interesting because the continuum hypothesis is interesting. However, Godel's theorems show that any consistent, axiomatizable system, with all valid proofs recursively enumerable, that is strong enough to model a large chunk of N must be incomplete. Exhibiting results like Cohen's about choice and the continuum hypothesis doesn't give you the full result; it just shows specific things about the system ZF. Indeed, if one didn't know Godel's theorems and knew only Cohen's forcing results, one might be tempted to throw in AC and CH as additional axioms (if they weren't there already) and then wonder whether that system is complete.
It is very funny to me that my most downvoted comment isn't about religion but about Godel's proof, and that no one gave a decent refutation of what I said.
People here have a very complicated set of attitudes. Even though many don't want to bother talking about the irrational aspects of religion, or engaging with religious individuals who are convinced that their religious view is somehow different from all the other religious views, people who are religious are that way due to a variety of very strong cognitive biases. So there's some sympathy there. Getting math wrong and then insisting one is correct is simple arrogance, or at best Dunning-Kruger. There's much less sympathy for that. And yes, people have tried to explain why you were wrong, albeit fairly succinctly.
Dunning-Kruger.
Possibly.
However, what throws me is "we can find such pathological inputs for any other encoding system," which to me implies that a stronger system is being contemplated, one that would hang on some inputs because it falls under Godel's proof.
Followup to: What's a "natural number"?
While thinking about how to make machines understand the concept of "integers", I accidentally derived a tiny little math result that I haven't seen before. Not sure if it'll be helpful to anyone, but here goes:
You're allowed to invent an arbitrary scheme for encoding integers as strings of bits. Whatever encoding you invent, I can give you an infinite input stream of bits that will make your decoder hang and never give a definite answer like "yes, this is an integer with such-and-such value" or "no, this isn't a valid encoding of any integer".
To clarify, let's work through an example. Consider a unary encoding: 0 is 0, 1 is 10, 2 is 110, 3 is 1110, etc. In this case, if we feed the decoder an infinite sequence of 1's, it will remain forever undecided as to the integer's value. The result says we can find such pathological inputs for any other encoding system, not just unary.
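To make the example concrete, here is a minimal sketch of that unary decoder in Python (the function name and the bit-stream representation are my own choices, not anything from the post). We can't literally feed it the infinite all-1s stream without hanging, but any finite prefix of that stream leaves it undecided, no matter how long:

```python
from itertools import repeat, islice

def unary_decode(bits):
    """Decode the unary scheme from the post: n is encoded as n ones
    followed by a terminating zero.  On the infinite all-1s stream
    this loop would run forever, never reaching an answer."""
    count = 0
    for b in bits:
        if b == 1:
            count += 1
        else:
            return count          # hit the terminating 0: decided
    return None                   # stream exhausted while still undecided

# Finite, valid encodings decode fine:
assert unary_decode([0]) == 0
assert unary_decode([1, 1, 1, 0]) == 3

# A long prefix of the pathological input: still no verdict.
assert unary_decode(islice(repeat(1), 100000)) is None
```

Here `None` stands for "ran out of input while still undecided"; on the true infinite stream the function would simply never return.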
The proof is obvious. (If it isn't obvious to you, work it out!) But it seems to strike at the heart of why we can't naively explain to computers what a "standard integer" is, what a "terminating computation" is, etc. Namely, if you try to define an integer as some observable interface (get first bit, get last bit, get CRC, etc.), then you inevitably invite some "nonstandard integers" into your system.
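For readers who want to check their work on the exercise, here is one way the argument can go (a sketch, assuming the scheme must give every integer some valid encoding):

```latex
\textbf{Sketch.} Suppose for contradiction that the decoder $D$ halts
on every infinite input stream. Let $T$ be the tree of finite bit
strings on which $D$ has not yet halted. $T$ is binary-branching, and
by assumption it has no infinite branch, so by K\"onig's lemma $T$ is
finite: there is some $N$ such that $D$ always halts after reading at
most $N$ bits. But then $D$'s answer is determined by a prefix of
length at most $N$, of which there are fewer than $2^{N+1}$, so $D$
can decode only finitely many distinct integers, contradicting the
assumption that every integer has a valid encoding. Hence some
infinite stream keeps $D$ undecided forever. $\square$
```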
This idea must already be well-known and have some standard name; any pointers would be welcome!