AustinLorenz

I finally had time to read this, and I must say, that's an extremely creative premise. I'm puzzled by the proof of Theorem 3.1, however. It claims: "By inspection of FairBot, PA ⊢ □(FB(FB)=C) → FB(FB)=C."

However, by inspection of FairBot, the antecedent of the conditional should be FB(FB)=C rather than □(FB(FB)=C). The code says "if it's a theorem of PA that X cooperates with me, then cooperate." It doesn't say "if it's a theorem of PA that it's provable in PA that X cooperates with me, then cooperate." So I don't believe you can appeal to Löb's theorem in this case.
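For concreteness, here is how I'm reading FairBot's code. This is only my own sketch in Python-flavored pseudocode, with a hypothetical provable_in_PA oracle standing in for the paper's proof search; it is not the authors' actual definition:

```python
def provable_in_PA(sentence: str) -> bool:
    """Hypothetical oracle: True iff PA proves `sentence`.
    Not implementable in general; stubbed here for illustration only."""
    raise NotImplementedError

def FairBot(X) -> str:
    # Cooperate iff PA proves that X cooperates against FairBot.
    # Note the provability check is applied to "X(FairBot) = C" itself,
    # not to "it is provable in PA that X(FairBot) = C".
    if provable_in_PA(f"{X.__name__}(FairBot) = C"):
        return "C"
    return "D"
```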

I'm not sure if this post is obsolete, but it was linked to by a recent post, so I will provide feedback.

The translation to the formal version goes from "if a theory T has a short proof that any proof of T's inconsistency must be long, then T is inconsistent" to "if T proves in n symbols that 'T can't prove a falsehood in f(n) symbols', then T is inconsistent."

But inconsistency is not the same as proving a falsehood. Assuming that PA is consistent, the theory PA + ~Con(PA) is consistent, since PA cannot prove its own consistency (Gödel's second incompleteness theorem), yet this consistent theory proves the false statement ~Con(PA).
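To spell out the chain of implications I'm relying on (all standard facts, stated only to make the step explicit):

```latex
\begin{align*}
\mathrm{PA}\ \text{consistent}
  &\;\Longrightarrow\; \mathrm{PA} \nvdash \mathrm{Con}(\mathrm{PA})
     \quad \text{(G\"odel's second incompleteness theorem)} \\
  &\;\Longrightarrow\; \mathrm{PA} + \neg\mathrm{Con}(\mathrm{PA})\ \text{is consistent} \\
  &\;\Longrightarrow\; \text{a consistent theory proves } \neg\mathrm{Con}(\mathrm{PA}),
     \ \text{which is false in the standard model.}
\end{align*}
```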

Also, by Tarski's undefinability theorem, there exists no truth predicate Tru(n) within any consistent extension of PA; otherwise, we could formulate a liar paradox. Hence the notion of a "falsehood" is not expressible in our formal system.
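For completeness, the liar-paradox argument I have in mind is the standard diagonal-lemma one (nothing here is specific to the post):

```latex
\begin{align*}
&\text{Diagonal lemma: there is a sentence } L \text{ with }
   T \vdash L \leftrightarrow \neg\mathrm{Tru}(\ulcorner L \urcorner).\\
&\text{If } \mathrm{Tru} \text{ defined truth, we would also have }
   T \vdash \mathrm{Tru}(\ulcorner L \urcorner) \leftrightarrow L.\\
&\text{Combining the two: } T \vdash L \leftrightarrow \neg L,
   \text{ so } T \text{ is inconsistent.}
\end{align*}
```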

In the proof, we define a program R which enumerates the theorems of the theory T that can be proved in g(n) symbols or fewer. Such a program must describe the formal system T in order to use its axioms and rules of inference. Assume that T is describable in t bits. Then R needs about log g(n) bits to encode the bound g(n) (its halting condition) and t bits to describe T, so we can write the program R using roughly log g(n) + t bits.
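To make the description-length claim concrete, here is a rough Python sketch of such an R. The proof checker for T is a hypothetical stub, and all the names are mine rather than the post's; the only point is that R's source code consists of a description of T plus the numeral for the bound g(n):

```python
from itertools import product

ALPHABET = "abcdefghij"   # stand-in for T's finite symbol alphabet
BOUND = 10 ** 6           # g(n): the only part of R that grows with n,
                          # and it takes only ~log g(n) bits to write down

def checks_proof_in_T(candidate: str) -> bool:
    """Hypothetical proof checker for T: True iff `candidate` encodes a
    valid T-proof. The ~t-bit description of T lives here."""
    raise NotImplementedError

def conclusion_of(candidate: str) -> str:
    """Hypothetical helper: the theorem that `candidate` proves."""
    raise NotImplementedError

def R():
    # Enumerate every string of at most BOUND symbols, print the theorems
    # of T that have proofs within that length, then halt.
    for length in range(1, BOUND + 1):
        for candidate in map("".join, product(ALPHABET, repeat=length)):
            if checks_proof_in_T(candidate):
                print(conclusion_of(candidate))
```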

This means that when R searches for proofs about its own behavior of length less than g(n), some of those g(n) symbols must be spent describing R itself, at a cost of about log g(n) + t bits. So if R does not find a proof, it has only established that T has no proof of whether R halts within roughly g(n) - (log g(n) + t) symbols; there may well exist a proof in T of whether R halts that uses up to g(n) symbols.

Now T is supposed to prove in n symbols that "T can't prove a falsehood in f(n) symbols," but n << g(n) and exp g(n) << f(n). This implies that log f(n) > g(n) >> n, which means that n symbols do not suffice to write down a number of size f(n). Thus it is impossible to prove that "T can't prove a falsehood in f(n) symbols" using n, or slightly more than n, symbols.
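Spelling out the asymptotics as I'm reading them:

```latex
\[
  n \;\ll\; g(n) \;=\; \log\!\big(\exp g(n)\big) \;<\; \log f(n).
\]
```

Since writing the numeral for f(n) in any fixed-base notation takes on the order of log f(n) symbols, a statement of only about n symbols has no room to contain the bound f(n) explicitly.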

Actually, "not proving a falsehood" is not the same as being consistent; assuming that PA is consistent, the theory PA+~Con(PA) is also consistent, but proves the false statement ~Con(PA). Consistency is the weaker condition of not proving both a formula and its negation.