You're right that those statements do satisfy the conclusion, and I don't see why you're being downvoted. The difference between them and the original self-referential statement is generality. For example:
You: I know that ZF set theory is incomplete, because the axiom of choice can neither be proved nor disproved within it.
Some other guy (SOG): Okay then, how about we add the axiom of choice as another axiom? Maybe this new system will be complete.
You: Nope, it still can't prove or disprove the continuum hypothesis.
SOG: So I'll add that in as another axiom; maybe that will finally patch it?
You: Nope, still doesn't work because ...
SOG: But if I add that as an axiom as well...
Hopefully you can see why this might keep going for quite a while, and since it's quite difficult to prove a statement undecidable, you would eventually run out of examples. There's nothing you can do to convince him that those undecidable statements are symptoms of a general problem rather than just one-off flaws.
Compare to this:
Gödel: ZF set theory is not complete, because I have this self-referential construction of an unprovable statement.
SOG: But what if I add your statement as another axiom?
Gödel: Then my proof still applies to the new system you've created, and I can construct another, similar statement.
SOG: But what if I...
Gödel: There's no point in continuing; whatever system you create, my theorem will always apply.
SOG: Drat! Foiled again!
Followup to: What's a "natural number"?
While thinking about how to make machines understand the concept of "integers", I accidentally derived a tiny little math result that I haven't seen before. Not sure if it'll be helpful to anyone, but here goes:
You're allowed to invent an arbitrary scheme for encoding integers as strings of bits. Whatever encoding you invent, I can give you an infinite input stream of bits that will make your decoder hang and never give a definite answer like "yes, this is an integer with such-and-such value" or "no, this isn't a valid encoding of any integer".
To clarify, let's work through an example. Consider a unary encoding: 0 is 0, 1 is 10, 2 is 110, 3 is 1110, etc. In this case, if we feed the decoder an infinite sequence of 1's, it will remain forever undecided as to the integer's value. The result says we can find such pathological inputs for any encoding scheme, not just unary.
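For concreteness, here's a minimal sketch of that decoder in Python (the function name and the 1000-bit cutoff are my own illustration, not part of the result):

    from itertools import islice, repeat

    def decode_unary(bits):
        """Decode one unary-coded integer: n is written as n 1-bits
        followed by a terminating 0-bit."""
        n = 0
        for b in bits:
            if b == 0:
                return n   # terminator seen: the value is decided
            n += 1         # one more leading 1-bit
        return None        # input ran out while still undecided

    print(decode_unary(iter([1, 1, 1, 0])))       # 3
    # An infinite stream of 1's, truncated so the demo terminates;
    # after any finite prefix the decoder has decided nothing:
    print(decode_unary(islice(repeat(1), 1000)))  # None

However large a prefix of the all-ones stream we hand over, the decoder can neither name a value nor reject the input.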
The proof is obvious. (If it isn't obvious to you, work it out!) But it seems to strike at the heart of why we can't naively explain to computers what a "standard integer" is, what a "terminating computation" is, and so on. Namely, if you try to define an integer in terms of some observable interface (get first bit, get last bit, get CRC, etc.), then you inevitably invite some "nonstandard integers" into your system.
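If you'd rather see the argument spelled out: the proof I have in mind goes through König's lemma. If the decoder answered on every infinite stream, the tree of still-undecided prefixes would have no infinite branch and hence be finite, so some N bits would always suffice for an answer, and the decoder could only ever output finitely many distinct integers. So that tree is infinite, and an infinite branch through it is exactly the pathological stream. Here's a runnable sketch; the length-prefixed scheme and both helper functions are my own illustrative choices, not anything canonical:

    def decode_step(prefix):
        """Black-box view of a decoder: given a finite bit string (a
        tuple), return the decoded integer, or None if the decoder is
        still undecided. Illustrative scheme: a unary length header
        (k 1-bits then a 0-bit) followed by a k-bit binary payload."""
        if 0 not in prefix:
            return None                  # header not finished yet
        k = prefix.index(0)              # payload length from header
        payload = prefix[k + 1:]
        if len(payload) < k:
            return None                  # payload not finished yet
        return int("".join(map(str, payload[:k])) or "0", 2)

    def undecided_prefixes(depth):
        """All bit strings of the given length on which the decoder has
        answered neither on them nor on any shorter prefix."""
        frontier = [()]
        for _ in range(depth):
            frontier = [p + (b,)
                        for p in frontier
                        for b in (0, 1)
                        if decode_step(p + (b,)) is None]
        return frontier

    for d in (1, 5, 20):
        ps = undecided_prefixes(d)
        print(f"depth {d}: {len(ps)} undecided prefixes, e.g. {ps[0]}")

The tree of undecided prefixes never dies out, and for this particular scheme the all-ones stream (an ever-growing length header) is the infinite branch König's lemma promises.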
This idea must already be well known and have some standard name; any pointers would be welcome!