J.-Y. Girard, Y. Lafont, and P. Taylor (1989). Proofs and Types. Cambridge University Press, New York, NY, USA. (PDF)

I found the introductory description of a number of ideas at the beginning of this book intuitively clear, and these ideas should be relevant to our discussion, preoccupied as we are with the meaning of meaning. Though the book itself is quite technical, the first chapter should be accessible to many readers.

From the beginning of the chapter:

Let us start with an example. There is a standard procedure for multiplication, which yields for the inputs 27 and 37 the result 999. What can we say about that?

A first attempt is to say that we have an equality

27 × 37 = 999

This equality makes sense in the mainstream of mathematics by saying that the two sides denote the same integer and that × is a function in the Cantorian sense of a graph.

This is the denotational aspect, which is undoubtedly correct, but it misses the essential point:

There is a finite computation process which shows that the denotations are equal. It is an abuse (and this is not cheap philosophy — it is a concrete question) to say that 27 × 37 equals 999, since if the two things we have were the same then we would never feel the need to state their equality. Concretely we ask a question, 27 × 37, and get an answer, 999. The two expressions have different senses and we must do something (make a proof or a calculation, or at least look in an encyclopedia) to show that these two senses have the same denotation.

Concerning ×, it is incorrect to say that this is a function (as a graph) since the computer in which the program is loaded has no room for an infinite graph. Hence we have to conclude that we are in the presence of a finitary dynamics related to this question of sense.

Whereas denotation was modelled at a very early stage, sense has been pushed towards subjectivism, with the result that the present mathematical treatment of sense is more or less reduced to syntactic manipulation. This is not a priori in the essence of the subject, and we can expect in the next decades to find a treatment of computation that would combine the advantages of denotational semantics (mathematical clarity) with those of syntax (finite dynamics). This book clearly rests on a tradition that is based on this unfortunate current state of affairs: in the dichotomy between infinite, static denotation and finite, dynamic sense, the denotational side is much more developed than the other.
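To make the point about finitary dynamics concrete, here is a minimal Haskell sketch of my own (the names and encoding are not from the book): multiplication defined as a terminating recursive procedure on Peano numerals, rather than as an infinite graph of argument/result pairs.

    -- Peano naturals: a number is either zero or a successor.
    data Nat = Zero | Succ Nat

    add :: Nat -> Nat -> Nat
    add Zero     n = n
    add (Succ m) n = Succ (add m n)

    -- Multiplication as a finite, terminating rewrite procedure,
    -- not as an infinite graph of argument/result pairs.
    mul :: Nat -> Nat -> Nat
    mul Zero     _ = Zero
    mul (Succ m) n = add n (mul m n)

    toNat :: Integer -> Nat
    toNat 0 = Zero
    toNat k = Succ (toNat (k - 1))

    fromNat :: Nat -> Integer
    fromNat Zero     = 0
    fromNat (Succ m) = 1 + fromNat m

    -- fromNat (mul (toNat 27) (toNat 37)) reduces to 999 in finitely
    -- many steps: that computation is the "sense" behind the
    -- equality 27 × 37 = 999.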

So, one of the most fundamental distinctions in logic is that made by Frege: given a sentence A, there are two ways of seeing it:

  • as a sequence of instructions, which determine its sense, for example A ∨ B means "A or B", etc.
  • as the ideal result found by these operations: this is its denotation.

    "Denotation", as opposed to "notation", is what is denoted, and not what denotes. For example the denotation of a logical sentence is t (true) or f (false), and the denotation of A ∨ B can be obtained from the denotations of A and B by means of the truth table for disjunction.

Two sentences which have the same sense have the same denotation, that is obvious; but two sentences with the same denotation rarely have the same sense. For example, take a complicated mathematical equivalence A⇔B. The two sentences have the same denotation (they are true at the same time) but surely not the same sense, otherwise what is the point of showing the equivalence?
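As an illustration of mine (the datatype and names are not Girard's): the syntax tree of a sentence plays the role of its sense, while a truth-table evaluator computes its denotation; two different trees can denote the same truth value.

    -- Sense: the syntactic structure of a sentence.
    data Sentence = T | F
                  | Or Sentence Sentence
                  | And Sentence Sentence
                  | Not Sentence

    -- Denotation: the truth value computed by the truth tables.
    denote :: Sentence -> Bool
    denote T         = True
    denote F         = False
    denote (Or a b)  = denote a || denote b
    denote (And a b) = denote a && denote b
    denote (Not a)   = not (denote a)

    -- Different senses, same denotation:
    -- denote (Or T F) == denote (Not F)   (both True)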

This example allows us to introduce some associations of ideas:

  • sense, syntax, proofs;
  • denotation, truth, semantics, algebraic operations.

That is the fundamental dichotomy in logic. Having said that, the two sides hardly play symmetrical roles!

Read the whole chapter (or the first three chapters). Download the book here (PDF).

Comments:

There is a field called philosophy of language. Have you heard of it? Here are some key papers/links:

On Sense and Reference by Frege

On Denoting by Russell

Reference and Definite Descriptions by Donnellan

SEP Entry on Reference

Kripke's Naming and Necessity Lectures (Woohoo, I didn't know this was freely available... I might reread it now...)

A.P. Martinich's Standard Philosophy of Language Anthology

Now you are an educated man...

Downvoted for sarcasm.

Finding Girard's book useful and recommendable doesn't mean that one is unacquainted with its intellectual antecedents. But it is recommendable in part because it serves as a good introduction to some of that prior work. At least, it is a good introduction for those with a certain background and certain needs.

I agree that this is one of the best books (freely!) available from one of the most respected researchers in the area.

To me, the Curry-Howard correspondence is one of the most intriguing aspects of type theory, but it requires a lot of meditation and getting used to before one can think about it productively.
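For readers who haven't met the correspondence, a minimal Haskell sketch (an illustration with names of my choosing): types read as propositions, and programs of those types read as their proofs.

    -- Propositions as types: implication is the function arrow,
    -- conjunction is the pair type; a program is a proof.

    -- A proof of A -> A: the identity function.
    identity :: a -> a
    identity x = x

    -- A proof of (A and B) -> A: the first projection.
    projL :: (a, b) -> a
    projL (x, _) = x

    -- Modus ponens: from A -> B and A, conclude B.
    -- Under the correspondence it is just function application.
    modusPonens :: (a -> b) -> a -> b
    modusPonens f x = f x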

Sophisticated type theory is obviously a very important stepping stone toward developing provably correct programs, and it may also be one of the crucial ingredients for generating new programs automatically, and therefore for self-modifying AI. An interesting (but not the first) paper in this direction is:

http://www.site.uottawa.ca/~afelty/dist/gecco08.pdf

which uses System F and the Curry-Howard correspondence to achieve reasonable performance at program generation in simple, limited cases.
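For a taste of what System F terms look like, here is a small sketch using Haskell's RankNTypes in place of F's universal quantifier (an illustration only, not the encoding used in the paper): Church numerals and their multiplication.

    {-# LANGUAGE RankNTypes #-}

    -- A System F numeral has type ∀X. (X -> X) -> X -> X;
    -- RankNTypes lets us write that quantifier explicitly.
    type Church = forall x. (x -> x) -> x -> x

    zero :: Church
    zero _ z = z

    suc :: Church -> Church
    suc n s z = s (n s z)

    -- Multiplication is composition of iterators.
    mult :: Church -> Church -> Church
    mult m n s = m (n s)

    toInt :: Church -> Int
    toInt n = n (+ 1) 0
    -- toInt (mult (suc (suc zero)) (suc (suc (suc zero)))) == 6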

Maybe it's just me, but I found the paper a total fail. Impatient readers should skip to sections 5.1 and 5.2 right away. Basically, the authors are trying to "evolve" simple functions by combining and recombining typed terms. First they "evolve" boolean-xor, sum-of-list and product-of-list, tweaking the "evolution" parameters until they get the right answer. Then they attempt to "evolve" increment-all-elements-of-list-by-one and fail.

Results like this are why I left computer science and started programming for money.

Results like this are why I left computer science and started programming for money.

Which is what you would have had to do anyway if you'd stayed in computer science. So, good call.

Yet more endorsement of Frege. Not a man to be sold short.

Okay, I just tried to read through the thing, and it looks much too difficult for what it's trying to do. If anyone's really interested in the topic, I'd recommend starting with something like the Wikipedia page for System F and assimilating more material as needed; that'll certainly be easier for any programmer like me.

Pfft:

It's a great book, but in my experience, it's more useful as a motivator than as a source of information. That is, after reading Girard's cryptic and tantalizing off-hand remarks, you are dying to know what on earth he is talking about.

I think part of the reason is that the book is based on notes from a series of lectures Girard gave; in my experience oral lectures tend to be a lot more sketchy than things which are written down (others disagree...). The other part of the reason is that this is Girard. :)

For readers who are new to lambda calculus or proof theory but are interested to learn, I'd recommend supplementing it with a gentler text like Sørensen and Urzyczyn's Lectures on the Curry-Howard Isomorphism (book, shorter free online version), which contains actual explanations. The first few chapters of it cover the same material as the first few chapters of Girard's book. (They diverge a bit towards the end.)

It depends on your background. I read a lot on program semantics before opening this book, but never systematically on logic. It was genuinely helpful (and counting -- I haven't finished it yet).

If the goal is good understanding of "foundations" (however Girard himself may mock the concept), I see the book as being at just the right level.

I slogged through the first 30 pages. Section 2.2.1 "Interpretation of the rules" was surprisingly unclear. I could make sense of it only after using the Curry-Howard isomorphism given later on (because I'm already familiar with typed functional programming).

Section 3.3 on "arbitrary pluggings" and "variables and values as dual aspects of the same plugging phenomenon" is both informal and unclear.

I also didn't see any definition of "abstraction", "closed", or "strictly simpler" in the corollary (p. 19) in section 3.4 on (head) normal form. "B strictly simpler (type) than A" probably means that A is built on top of B, in one step, by one of the following (see the sketch after this list):
  • A = B × X
  • A = X × B
  • A = B → X
  • A = X → B
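Here is a sketch of that reading as the immediate-subterm relation on types (my formalization, with made-up names):

    -- Simple types over base atoms, built from products and arrows.
    data Ty = Atom String | Prod Ty Ty | Arrow Ty Ty
      deriving (Eq, Show)

    -- Immediate subterms of a type: B is "strictly simpler" than A
    -- when it occurs in simpler A, or transitively in a member of it.
    simpler :: Ty -> [Ty]
    simpler (Atom _)    = []
    simpler (Prod a b)  = [a, b]
    simpler (Arrow a b) = [a, b]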

The material is either not self-contained, or out of order (I didn't read the whole thing).

The book assumes a bit of familiarity with the subject and isn't bureaucratically formal. Many ideas are hard to present formally, or aren't even understood formally yet. I find the informal discussion in this book very valuable: it lets you see the motivation behind logic, beyond text-crunching, right from the beginning.

A closed term is one in which all variables are bound (it has no free variables). An abstraction is a lambda-expression, i.e. a term of the form λx.t.
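In code, this amounts to the following standard definitions (sketched in Haskell, names of my choosing):

    import Data.List (delete, nub)

    -- Untyped lambda terms: variables, applications, abstractions.
    data Term = Var String | App Term Term | Lam String Term

    -- Free variables: a Lam binds its variable within its body.
    freeVars :: Term -> [String]
    freeVars (Var x)   = [x]
    freeVars (App s t) = nub (freeVars s ++ freeVars t)
    freeVars (Lam x t) = delete x (freeVars t)

    -- A term is closed exactly when it has no free variables.
    closed :: Term -> Bool
    closed = null . freeVars

    -- closed (Lam "x" (Var "x")) == True
    -- closed (App (Var "f") (Var "x")) == False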