Stanford Encyclopedia of Philosophy

The Development of Proof Theory

First published Wed Apr 16, 2008; substantive revision Mon Oct 13, 2014

The development of proof theory can be naturally divided into: the prehistory of the notion of proof in ancient logic and mathematics; the discovery by Frege that mathematical proofs, and not only the propositions of mathematics, can (and should) be represented in a logical system; Hilbert's old axiomatic proof theory; the failure of Hilbert's aims through Gödel's incompleteness theorems; Gentzen's creation of the two main types of logical systems of contemporary proof theory, natural deduction and sequent calculus (see the entry on automated reasoning); and applications and extensions of natural deduction and sequent calculus, up to the computational interpretation of natural deduction and its connections with computer science.

1. Prehistory of the notion of proof

Proof theory can be described as the study of the general structure of mathematical proofs, and of arguments with demonstrative force as encountered in logic. The idea of such demonstrative arguments, i.e., ones whose conclusion follows necessarily from the assumptions made, is central in Aristotle's Analytica Posteriora: a deductive science is organised around a number of basic concepts that are assumed understood without further explanation, and a number of basic truths or axioms that are seen as true immediately. Defined concepts and theorems are reduced to these two, the latter through proof. Aristotle's account of proof as demonstrative argument fits very well the structure of ancient geometry as axiomatized in Euclid. The specific form of Aristotle's logic, the theory of syllogism, has instead, so it seems, almost nothing to do with proofs in Euclidean geometry. These proofs remained intuitive for more than two thousand years.

Before the work of Frege in 1879, no one seems to have maintained that there could be a complete set of principles of proof, in the sense expressed by Frege when he wrote that in his symbolic language,

all that is necessary for a correct inference is expressed in full, but what is not necessary is generally not indicated; nothing is left to guesswork. (Begriffsschrift, p. 3)

(One might contend that Boole is an exception as far as classical propositional logic is concerned.) Frege's step ahead was decisive for the development of logic and foundational study. The contrast to the ancients is great: Aristotle gave a pattern for combining arguments, but the idea of a finite closed set of rules was, philosophically, beyond the dreams of anyone before Frege, with the possible exception of Leibniz.

As we know today, Frege's principles of proof are complete for classical predicate logic.

Around 1890, Giuseppe Peano gave a formalization of logical inference, with the aim of representing formally proofs in arithmetic. His seminal paper Arithmetices principia, nova methodo exposita, written originally in Latin, is included in English translation in the collection From Frege to Gödel (1967) that Jean van Heijenoort edited. Unfortunately, the editor failed to recognize what Peano did with formal inference, and the view spread that Peano formalized only the language of logic and arithmetic, not its principles of proof. If Peano's proofs are read with even a little care, it transpires that they are purely formal derivations that use two principles:

  1. Axioms imply their instances. Such implications can be written as lines in proofs.
  2. An implication and its antecedent together imply the consequent.

Peano is very careful to list, on every line of his derivations, what the formal ground for writing the line is.

Russell took up Frege's logic, but used the notation and formal rules of proof of Peano, in a paper of 1906 with the title “The theory of implication.” Its formal machinery is exactly the same as Peano's. In later work, Russell changed the axiomatic system, and the one of Principia Mathematica (Whitehead and Russell 1910–13) became standard. Russell's philosophical idea, and here he followed Frege, was that the axioms express basic logical truths, and other logical truths are derived from these through modus ponens and universal generalization, the two principles Frege had identified. Mathematics was to be reduced to logic, so that its proofs would become presented in the same axiomatic pattern.

Frege's and Peano-Russell's approach to logic became the universally accepted one, especially through the influence of Hilbert and his co-workers in the 1920s. In the 19th century, Frege was a marginal figure, and the algebraic approach to logic, as in Boole and especially Ernst Schröder, was the dominant one. It is clear that there was a good understanding of the principles of predicate logic in this tradition, for how could there have been a Löwenheim-Skolem theorem otherwise? Skolem found out about Frege's logic through Principia Mathematica only after having worked out the theorem in his paper of 1920. The first section of that paper, widely read because of the English translation in Van Heijenoort's From Frege to Gödel, marks the end of the algebraic tradition of logic, which merged silently with lattice theory. Other sections of the paper contain a remarkable beginning of the analysis of proofs in lattice theory and in projective geometry: Skolem considered the axioms of a mathematical theory from a purely combinatorial and formal point of view, as means for producing derivations of a formula from given formulas used as assumptions. It was found out in the early 1990s that the part on lattice theory contains the solution of a decision problem, called the word problem for freely generated lattices, whose known solution stemmed from 1988! Skolem's terminology and notation in lattice theory are those of Schröder, and that is part of the reason why his work was a lost opportunity for proof theory.

2. Hilbert's old axiomatic proof theory

Hilbert's book Grundlagen der Geometrie of 1899 set the stage for the central foundational problems of mathematics of the early decades of the 20th century. We can list these problems as follows:

  1. The formalization of a mathematical theory. This includes a choice of its basic objects and relations, and a choice of the axioms.
  2. A proof of the consistency of the axioms.
  3. The question of the mutual independence and completeness of the axioms.
  4. The decision problem: is there a method for answering any question that could be raised within the theory?

As to Hilbert's geometry, its attempted formalization fell short of the ideal to which it gave birth. Hilbert found a much more important field to which his “metamathematics” was to be applied, namely arithmetic and analysis. The groundwork was a study of the four foundational problems in axiomatic formulations of pure logic. Propositional logic was thus formalized, and found to be consistent, complete, and decidable. The first results about predicate logic are from 1915, when Leopold Löwenheim gave his version of what later became the Löwenheim-Skolem theorem for predicate logic (see the entry on classical logic). He also solved special cases of the decision problem. This development was independent of the Frege-Russell tradition, and was instead based on the algebraic approach to logic of Ernst Schröder. Around 1920, the “Hilbert style” axiomatic approach, as it is often called, was known to everyone and dominated the logical scene; the algebraic approach merged almost without notice with lattice theory. By 1928, in Hilbert and Ackermann's Grundzüge der theoretischen Logik, an axiomatic formal system for predicate logic was presented, together with the problem of its completeness. The latter was solved by Gödel in 1929, and published a year later (Gödel 1930). The fourth foundational problem, the decision problem for predicate logic, was shown to have a negative solution in a short paper by Church in 1936, as a corollary to Gödel's incompleteness theorem.

Hilbert and his school, with Bernays, Ackermann, and von Neumann foremost among them, as well as the young Herbrand in France, pursued the metamathematical study of arithmetic in the latter part of the 1920s. Hilbert developed a method for the study of consistency problems, called the epsilon substitution method, to deal with the quantifiers. He felt that indirect inferences with quantifiers in cases with an infinity of objects were the crucial point of consistency proofs and needed a justification. Say, if the assumption that all natural numbers have the property P leads to an impossibility, the existence of a number with the contrary property not-P can be inferred. The central problem was thus to justify the use of classical logic in mathematical proofs, arithmetical ones in the first place. Ackermann was very close to a solution towards the end of the 1920s, and optimism reigned in the Hilbert school. Then, of course, the unexpected happened when Gödel proved the impossibility of a complete formalization of elementary arithmetic and, as it was soon interpreted, the impossibility of proving the consistency of arithmetic by finitary means, the only ones judged “absolutely reliable” by Hilbert.

3. The unprovability of consistency

After Gödel had made public the incompleteness of arithmetic in September 1930, von Neumann found that the consistency of arithmetic would be among the Gödelian unprovable propositions. Alas, Gödel had made the same discovery, so von Neumann never published his proof. He did, however, conjecture in correspondence with Gödel the unprovability of the consistency of arithmetic, and therefore of mathematics as a whole, in some absolute sense. Von Neumann was the key character in the reception of Gödel's results: he interrupted his lectures on Hilbert's proof theory in Berlin in the fall of 1930 to explain the new discoveries. These events created an enormous excitement among the mathematicians, as witnessed by Carl Hempel's testimony:

I took a course there with von Neumann which dealt with Hilbert's attempt to prove the consistency of classical mathematics by finitary means. I recall that in the middle of the course von Neumann came in one day and announced that he had just received a paper from… Kurt Gödel who showed that the objectives which Hilbert had in mind and on which I had heard Hilbert's course in Göttingen could not be achieved at all. Von Neumann, therefore, dropped the pursuit of this subject and devoted the rest of the course to the presentation of Gödel's results. The finding evoked an enormous excitement. (Hempel 2000, p. 13)

In 1932–33, Gödel and Gentzen found, independently of each other, a translation from classical Peano arithmetic to intuitionistic Heyting arithmetic. Specifically, it shows that if a contradiction is provable in the former, it is provable in the latter. Then the consistency of intuitionistic arithmetic would guarantee also the consistency of classical arithmetic. This result was a surprise: as mentioned, Hilbert had thought that the “transfinite” indirect existence proofs would be the part of arithmetic that needs to be secured against contradiction. By Gödel's and Gentzen's result, already intuitionistic arithmetic contained principles that went beyond finitary reasoning. A letter Gentzen wrote to Heyting on 25 February 1933 summarizes the situation as follows:

A consistency proof by finite means has … not succeeded so far, so that this original aim of Hilbert has not been achieved. If, on the other hand, one admits the intuitionistic position as a secure basis in itself, i.e., as a consistent one, the consistency of classical arithmetic is secured by my result. If one wished to satisfy Hilbert's requirements, the task would still remain of showing intuitionistic arithmetic consistent. This, however, is not possible by even the formal apparatus of classical arithmetic, on the basis of Gödel's result in combination with my proof. Even so, I am inclined to believe that a consistency proof for intuitionistic arithmetic, from an even more evident position, is possible and also desirable. (Menzler-Trott 2007, p. 38)

The last-mentioned was the goal Gentzen had set himself early in 1932, when in a letter to his old teacher Hellmuth Kneser he wrote:

I have set as my specific task to find a proof of the consistency of logical deduction in arithmetic… The task becomes a purely mathematical problem through the formalization of logical deduction. The proof of consistency has been so far carried out only for special cases, for example, the arithmetic of the integers without the rule of complete induction. I would like to proceed further at this point and to clear at least arithmetic with complete induction. I have been working on this for almost a year and hope to finish soon, and would then present this work as my dissertation (with Prof. Bernays). (Menzler-Trott 2007, p. 31)

4. Natural deduction and sequent calculus

In pursuing his consistency program, Gentzen set as his first task the analysis of purely logical deduction, to be extended later to arithmetic and analysis. In his thesis (1934–1935), Gentzen states that he set as his task the analysis of mathematical proofs as they occur in practice. The first observation is that actual proofs are not based on axioms expressed in a logical language, as in Hilbert's axiomatic proof theory. The most typical feature is instead that theorems make their claims under some assumptions. The assumptions are analyzed into parts, and the conclusion is also analyzed into parts, until these two analyses meet and a proof can be synthesized. The latter analysis proceeds by what Gentzen called introduction rules: they give sufficient conditions for deriving a proposition of a given form. For example, to derive a conjunction A & B, it is sufficient to derive the conjuncts A and B separately. The inference is given formally as in the rule

A   B
------ &I
A & B

Assumptions, instead, are analyzed into their components through elimination rules that give, by and large, immediate consequences of the assumptions. For example, a conjunction used as an assumption can be decomposed into its constituents, as in the rules

A & B
------ &E1
A

A & B
------ &E2
B

Gentzen developed and studied the system of natural deduction during 1932, and had arrived by September 1932 at a calculus of natural deduction (ND for short) that is standard today. By this time, he had noticed that if an introduction, say a derivation of A & B from A and B separately, is followed by the corresponding elimination, say a derivation of A, the formula A & B constitutes a local maximum, a “hillock”, that can be eliminated. He also called such hillocks “detours”, and what is now called a detour conversion removes such unnecessary pairs of introduction-elimination steps. The result of the steps of “normalization” is a derivation in “normal form”.
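The detour conversion for & can be sketched in code. The following is our illustrative reconstruction, not Gentzen's notation: proofs of the &-fragment are represented as terms, and the conversion deletes an &-introduction immediately followed by an &-elimination.

```python
# A hypothetical encoding of ND proofs for the &-fragment as terms, with
# detour conversion removing an &I step followed by an &E step.

from dataclasses import dataclass

@dataclass
class Hyp:            # an open assumption, named by a label
    name: str

@dataclass
class AndI:           # &I: from proofs of A and B, a proof of A & B
    left: object
    right: object

@dataclass
class AndE1:          # &E1: from a proof of A & B, a proof of A
    pair: object

@dataclass
class AndE2:          # &E2: from a proof of A & B, a proof of B
    pair: object

def convert(p):
    """Remove detours: an &I immediately followed by the corresponding
    &E is a 'hillock' and reduces to the subproof that was paired in."""
    if isinstance(p, AndE1):
        q = convert(p.pair)
        return q.left if isinstance(q, AndI) else AndE1(q)
    if isinstance(p, AndE2):
        q = convert(p.pair)
        return q.right if isinstance(q, AndI) else AndE2(q)
    if isinstance(p, AndI):
        return AndI(convert(p.left), convert(p.right))
    return p

# Deriving A & B from two assumptions and then projecting A is a detour;
# its normal form is just the assumption a itself:
detour = AndE1(AndI(Hyp("a"), Hyp("b")))
print(convert(detour))
```

Normal proofs in this fragment are exactly the terms in which no projection is applied to a pairing, mirroring Gentzen's “normal form”.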

Implication is perhaps more typical of ND than conjunction: to derive A ⊃ B, one temporarily assumes A, then tries to derive B. If this succeeds, the temporary assumption is closed or “discharged” when the conclusion A ⊃ B is made, as in the schematic derivation

[A]
B
------ ⊃I
A ⊃ B

In the other direction, A ⊃ B can be eliminated if a derivation of A has been found, for then B can be concluded:

A ⊃ B    A
----------- ⊃E
B

If rule ⊃I is followed by ⊃E, there is a non-normality that is removed by a detour conversion: a derivation of B (and what follows after it) is constructed by taking the derivation of the minor premiss A of the elimination rule and the derivation of B from the assumption A in the introduction. These two pieces are combined together into a derivation of B that does not have the detour formula A ⊃ B. In Gentzen's thesis, all assumptions are in the end closed by implication introductions, but nowadays one considers also derivations that leave a collection of formulas as open assumptions.
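The ⊃-detour conversion just described can also be sketched through proof terms. In this illustrative encoding (ours, not Gentzen's), ⊃I builds an abstraction that records the discharged assumption, ⊃E combines two derivations, and the conversion substitutes the derivation of the minor premiss for the closed assumption; variable capture is ignored in this sketch.

```python
# A hypothetical proof-term sketch of the detour conversion for implication:
# a ⊃I step immediately followed by ⊃E reduces by substituting the minor
# premiss's derivation for the discharged assumption.

from dataclasses import dataclass

@dataclass
class Var:            # an open assumption
    name: str

@dataclass
class ImpI:           # ⊃I: discharge assumption `hyp` in derivation `body`
    hyp: str
    body: object

@dataclass
class ImpE:           # ⊃E: combine a derivation of A ⊃ B with one of A
    fun: object
    arg: object

def subst(p, name, q):
    """Replace the open assumption `name` in p by the derivation q
    (capture avoidance is ignored in this sketch)."""
    if isinstance(p, Var):
        return q if p.name == name else p
    if isinstance(p, ImpI):
        return p if p.hyp == name else ImpI(p.hyp, subst(p.body, name, q))
    return ImpE(subst(p.fun, name, q), subst(p.arg, name, q))

def convert(p):
    """One detour conversion at the root: ⊃I followed by ⊃E."""
    if isinstance(p, ImpE) and isinstance(p.fun, ImpI):
        return subst(p.fun.body, p.fun.hyp, p.arg)
    return p

# [A] ... A, then A ⊃ A by ⊃I, then eliminated against an assumption a of A:
detour = ImpE(ImpI("x", Var("x")), Var("a"))
print(convert(detour))
```

Readers who know the λ-calculus will recognize this as β-reduction, the point taken up in Section 6 under the Curry-Howard correspondence.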

Looking at the rules of conjunction and implication, one notes that the premisses (the formulas immediately above the inference line) are subformulas of the conclusion in the I-rules, whereas it is the other way around in the E-rules. Gentzen noticed that in normal derivations, this property of single steps is inherited by the whole derivation, in the sense that all formulas are subformulas of the conclusion. This result gave as a byproduct a decision method for intuitionistic propositional logic. Another corollary was a syntactic proof of consistency: if a contradiction is provable, anything is provable, but an atomic formula, say, has no proof: if it has a proof, it has a normal proof, but no E-rules apply to an atomic formula, and no I-rule concludes it either.

Gentzen's idea was to extend natural deduction to a system of arithmetic by the addition of a rule that corresponds to the principle of complete induction. Consistency would then follow from the normalization of derivations and the subformula property. By early 1933, Gentzen realized that this proof strategy would not go through: the induction rule is schematic and has an infinity of instances, with no bound on the complexity of the induction formulas. It would be impossible to restrict these formulas in advance, so no subformula property can hold. After this failure, Gentzen took the translation from classical to intuitionistic arithmetic verbatim out of his early thesis manuscript and submitted it as a paper in March 1933, but withdrew the paper after hearing of Gödel's publication of the result.

Gentzen wrote that he was unable to prove a normalization theorem for a classical system of ND. He therefore invented another logical calculus that he called sequent calculus (Sequenzenkalkül, literally “calculus of sequences”) and made it the central topic of his thesis. The name of the calculus comes from the representation of the assumptions of a derivation as a list. The word “sequent” used as a noun is a suggestion of Kleene's in his Introduction to Metamathematics (1952: 441), taken up in many languages in the form of purely invented words.

Sequent calculus, SC for short, can be seen as a formal representation of the derivability relation in natural deduction. A sequent consists of a list Γ of formulas, an arrow (so in Gentzen; later also other markers have been used), and one formula as a conclusion. The list gives the assumptions the conclusion depends on in a derivation, in a local notation, where in ND they are found in the leaves of a derivation tree. Gentzen also generalized sequents so that they have, instead of one conclusion, a list of possible cases after the arrow. This novelty led to the first satisfactory formulation of a proof system for classical logic. Gentzen's SC rules for conjunction and implication are, with commas separating elements in lists:

Conjunction

Γ → Δ, A    Γ → Δ, B
--------------------- R&
Γ → Δ, A & B

A, Γ → Δ
-------------- L&1
A & B, Γ → Δ

B, Γ → Δ
-------------- L&2
A & B, Γ → Δ

Implication

A, Γ → Δ, B
-------------- R⊃
Γ → Δ, A ⊃ B

Γ → Θ, A    B, Δ → Λ
--------------------- L⊃
A ⊃ B, Γ, Δ → Θ, Λ

This is not the place to explain the details of ND and SC (but see the entry on automated reasoning). Gentzen formulated the latter, denoted LK, so that it gave an intuitionistic calculus, denoted LJ, as a special case, the one in which the conclusion is a list of at most one case. He then proved the analogue of the normalization theorem for the classical calculus, with the calculus and the proof carefully formulated so that the result for the intuitionistic calculus was a special case of the one for the classical calculus. In LJ and LK, L stands for “logistic”, a term by which Gentzen refers to the axiomatic calculi of logic of Frege, Russell, and Hilbert and Bernays. In such calculi, each line in a derivation is correct in itself, i.e., a logical truth, whence the term. The letters K and J come from the German words klassisch and intuitionistisch. (The latter should thus be an upper case “I”, but older German uses upper case “J” for upper case “I”.)

Gentzen called the analogue of normalization by the unimaginative name of Hauptsatz, “main theorem”. The standard terminology today is “cut elimination theorem”. All of the logical rules of SC have the subformula property in a very immediate sense: each formula in a premiss is a formula or subformula in the conclusion. The rule for combining derivations, analogous to the one explained above for the case of detour conversions in ND, is called “cut”. In it, a formula A appears as a case in a first premiss and as an assumption in a second premiss. In the conclusion, this formula has disappeared and the assumptions of the two premisses are collected together:

Γ → A    A, Δ → C
------------------ Cut
Γ, Δ → C

Thus, cut is the only rule that makes a formula “disappear” in a derivation. Gentzen showed that instances of the rule of cut can be eliminated from derivations by permuting them upward until they reach the points at which the derivation starts. In ND, the starting points are assumptions; in SC, they are “initial sequents” of the form A → A in which the assumption formula A is at the same time the conclusion. A cut with such a sequent as one premiss has the other premiss equal to the conclusion, and can therefore be deleted.
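The base case of this elimination, a cut against an initial sequent, can be illustrated in a few lines. The encoding is ours, not Gentzen's: a sequent is a pair of an assumption list and a single conclusion, as in the single-succedent cut rule displayed above.

```python
# A minimal sketch of the base case of cut elimination: cutting against an
# initial sequent A -> A simply returns the other premiss unchanged.

def cut(left, right, formula):
    """Cut: from Γ → A and A, Δ → C conclude Γ, Δ → C.
    Sequents are (assumption list, conclusion) pairs."""
    gamma, a = left
    a_delta, c = right
    assert a == formula and a_delta[0] == formula
    return (gamma + a_delta[1:], c)

def initial(a):
    """An initial sequent A → A."""
    return ([a], a)

# Cutting Γ → A with the initial sequent A → A changes nothing:
premiss = (["B", "C"], "A")
assert cut(premiss, initial("A"), "A") == premiss

# The same holds with the initial sequent as the left premiss:
premiss2 = (["A", "D"], "C")
assert cut(initial("A"), premiss2, "A") == premiss2
```

The non-trivial cases of Gentzen's proof, permuting a cut above the logical rules and reducing the cut formula, are of course not captured by this fragment.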

After the proof of cut elimination, Gentzen had no use for the proof of normalization for intuitionistic natural deduction. He gave the first handwritten version of his thesis, with the detailed proof of normalization (equivalent to some 13 printed pages), to Bernays, but the latter seems never to have realized what he had in his hands. The proof, among the papers of Bernays in Zurich, was discovered by the present author in February 2005 and is now available in an English translation (Gentzen 1933 [2008]).

5. The consistency of arithmetic and analysis

After his thesis work on ND and SC for pure logic, Gentzen continued his plan of proving the consistency of arithmetic. The result was ready by December 1934. What this very first proof was is not known in detail. However, a letter to Bernays from 1938 indicates that the proof that Gentzen wrote down by the summer of 1935 was not this original one, but a second proof (see Menzler-Trott 2001, 79). This second proof was criticized by Bernays and Gödel, who discussed it during their Atlantic voyage to Princeton in September 1935. Gentzen's idea in the proof was as follows: first, take the conjunction-negation-universal quantification fragment of natural deduction as the logic used in the formalization of arithmetic. Then write each rule instance so that the premisses and conclusion have the open assumptions listed at left, with an arrow separating the conclusion, i.e., as sequents. This variant of ND is now called ND in SC style. Consider a sequent Γ → C. If its conclusion is an atomic formula, it is an equation between numbers. In the worst case it is false, so then consider the list of assumptions. If one assumption is a conjunction, replace it by a conjunct of your choice; if a universal quantification, by an instance. If it is a negation ¬A, replace the conclusion by A. If at any stage of this “reduction process” the conclusion of a sequent is a compound formula, you have to consider any conjunct or any instance of universal quantification as a possible conclusion. In case of a negation ¬A as a conclusion, move A to the assumption part and replace the conclusion by 0 = 1. Gentzen shows that by proceeding in this way under the assumption that the sequent in question is derivable, either a true equation is found as a conclusion, or a false equation as an assumption. Thus, there are no derivable sequents with all assumptions true and the conclusion false.

It was unclear to Gödel and Bernays what the proof assumed; they thought it assumed what is known in intuitionistic mathematics as the fan theorem, but this was false. Termination of Gentzen's reduction procedure can be proved instead by induction on well-founded trees (“bar induction”), a principle that was used by Gentzen on intuitive grounds. Anyway, the result of the criticism was that Gentzen changed without further ado the proof into a third proof that uses the now famous principle of transfinite induction up to the first epsilon-number. This induction was presented through a coding that used decimal numbers. The concrete result of the changes for Gentzen's paper published in 1936 was not good, however: the logical calculus was changed midway in an article of seventy-odd pages that became very difficult to read. Therefore Gentzen gave yet another, by the present count fourth, proof of the consistency of arithmetic in 1938 (at the Bernays archives of the ETH Zurich), this time based on the classical sequent calculus LK of 1933. As mentioned, correspondence with Bernays indicates that he thereby returned to the proof method that had led to success in 1934. The use of transfinite induction is made clearly visible in the 1938 paper through an ordinal notation. Such induction principles on Cantor's “second number class” are discussed in detail in Hilbert's 1925 lecture “Über das Unendliche” (“On the infinite”, published 1926), a paper to which Gentzen referred.

One would have thought that that was that, but Gentzen had reason to produce even a fifth proof of the consistency of arithmetic, in his last paper, published in 1943 but written before the war in 1939. He extended Peano arithmetic through transfinite ordinals and made the transfinite induction principle part of this extended calculus. Then he showed directly that transfinite induction up to the first epsilon-number ε0 is expressible but not provable in the system. Gödel's incompleteness theorem is thus proved in a completely different way. The idea of the proof is, in brief terms, as follows: first it is laid down what it means to derive transfinite induction to a specific ordinal number in the system. Secondly, ordinal numbers below ε0 are associated to derivations. These are called “values”. It is then shown that if transfinite induction to an ordinal number is derivable, this ordinal number cannot be greater than the value of the derivation. Therefore transfinite induction to ε0 is not derivable.

Since the induction principle can be expressed but not proved in ordinary arithmetic, a formula unprovable in Peano arithmetic is found. An easy consequence of Gentzen's version of the incompleteness theorem is the consistency of Peano arithmetic, because anything would be provable in an inconsistent system. Contrary to Gödel's “artificial” unprovable formula, which was obtained through a coding of the arithmetized provability predicate, Gentzen's transfinite induction principle is a principle of “ordinary” mathematics.

Gentzen's last proof determined the “proof-theoretic ordinal” of Peano arithmetic, namely the one that is needed to prove consistency, with the property that nothing less would suffice. The work marked the beginning of ordinal proof theory. It was without doubt the most remarkable foundational achievement in arithmetic after Gödel's incompleteness theorems, but it is still largely unknown: one can find many books about Gödel's theorems that do not even mention Gentzen.

Gödel had, it seems, not thought of giving a consistency proof of arithmetic through the use of non-finitary but still constructive principles. In the late thirties, at least from 1938 on, he developed, as a response to Gentzen's proof, his own special interpretation of intuitionistic logic and arithmetic, what came to be known as the Dialectica interpretation. It uses computable functionals to interpret the proofs of intuitionistic arithmetic. Gödel published the interpretation only in 1958, even though he had presented it in lectures in 1941. It is unknown whether he discussed the matter when he met Gentzen in December 1939.

At the request of Bernays, Ackermann reproduced Gentzen's proof in terms of Hilbert's epsilon-calculus in 1940. Ackermann's paper was the starting point of Kreisel's 1951 “no-counterexample” interpretation of arithmetic. It was a surprise when the publication of Gödel's collected papers brought to light his “Zilsel lecture”, given in Vienna in 1938: he outlines there this interpretation as a reformulation of Gentzen's 1935 proof. (The matter is discussed in great detail in Tait (2005), who himself had worked on the no-counterexample interpretation and its extension to analysis in the 1960s.)

The next obvious task in proof theory, after the proof of the consistency of arithmetic, was to prove the consistency of analysis, i.e., of the theory of real numbers. Gentzen did some work in this direction, but was then assigned to military service in the fall of 1939. (He observed and reported the type, number, and direction of aircraft that flew above the town of Brunswick, until he was hit by a nervous breakdown in early 1942.) From 1943 on he resumed the work on analysis, but the difficulties intrinsic to the topic were great, as were the practical difficulties of life caused by the war. Analysis was to be formulated as a system of second-order arithmetic, which means that quantification is extended over number-theoretic predicates or, equivalently, over sets of natural numbers. Second-order number theory is used in Gentzen's last paper, published in 1943, in which it is briefly shown that the principle of transfinite induction up to ε0 is derivable in second-order number theory.

More than half a century has passed with no constructive proof of the consistency of full second-order arithmetic in sight. Early pioneers in this field included Kurt Schütte and Gaisi Takeuti. The former created in 1951 an infinitary sequent calculus to present consistency proofs in a perspicuous way; the latter instead used a more traditional Gentzen-style calculus (see Takeuti 1987).

In the current research in the proof theory of second-order arithmetic, one studies what are known as subsystems of second-order arithmetic. The strongest results as of today are, in a very brief outline, the following: let X range over number-theoretic predicates. A formula such as X(x) states that x has the property expressed by X. We can now use first- and second-order logic to form compound formulas such as ∀X(Xx ∨ ¬Xx). The collection of natural numbers for which such a formula with one universal second-order quantifier holds is called a Π¹₁-set (in this case, the whole of the natural numbers). More generally, a comprehension axiom is of the form ∃X∀x(Xx ↔ B(x)). If the formula B has no second-order quantifiers, the axiom gives what is called arithmetic comprehension or ACA. If B can have the form ∀Y∃Z C(x) with no other second-order quantifiers, the special case of Π¹₂-comprehension is obtained. Consistency proofs for a subsystem of second-order arithmetic with Π¹₂-comprehension have been given by Toshiyasu Arai and Michael Rathjen in the mid-1990s (see Rathjen 1995 for these developments).

6. Later developments in natural deduction

At the time when Gentzen worked out his system of natural deduction, Stanislaw Jaskowski was also developing a logical system for reasoning with assumptions. Formulas in derivations are arranged in a linear succession, but Jaskowski's paper of 1934 remained fragmentary and without substantial results such as a subformula property. The linear variant of natural deduction is followed in many pedagogical expositions of elementary logic (sometimes called “Fitch systems”). Gentzen found Jaskowski's work by June 1936, when both were in Münster, and considered its linear arrangement of formulas an improvement, a “liberation from the straitjacket of the tree form”, into one that reflects “the linearity of thinking” (the former from unpublished notes, the latter from Gentzen's thesis).

The system of natural deduction lay mostly dormant for some thirty years, until the thesis of Dag Prawitz of 1965, Natural Deduction: A Proof-Theoretical Study. The order in which Prawitz presented the normalization theorem was different from the one in Gentzen's early thesis manuscript. Prawitz gave first a normalization theorem and subformula property for a system of natural deduction for classical logic. This system contains no disjunction or existence. In a second stage, he considered intuitionistic natural deduction for the full language of predicate logic and reduced its normalization to the deletion of detour convertibilities, as in the fragment of classical logic. When Gentzen's proof of normalization came to light in 2005, Prawitz said, in conversation with the present author, that it is clear that Gentzen knew the result, because the remarks in the printed thesis are so suggestive.

In the late 1960s, strong normalization became an issue: Prawitz, using previous work of William Tait and Jean-Yves Girard, proved in 1971 that non-normalities in a derivation can be converted in any order, with a terminating normalization process and a unique normal derivation as a result. Gentzen seems not to have noticed the latter, but seems rather to have thought the contrary, because of the failure of this property for the elimination of cuts in sequent calculus.

At about the same time as strong normalization was studied, the Curry-Howard correspondence emerged. Curry had observed, in his work on combinatory logic in the late 1950s, the analogy between implication elimination in natural deduction and functional application (Curry and Feys 1958). The idea was as old as intuitionistic logic: by the “BHK-explanation” of the connectives and quantifiers (for Brouwer-Heyting-Kolmogorov), the forms of propositions in intuitionistic logic express prescriptions on how to prove those propositions: a conjunction A & B is proved by proving A and B separately, a disjunction A ∨ B by proving one of A and B, and an implication A ⊃ B by showing how to convert any proof of A into some proof of B, and so on. These explanations come very close to the introduction rules of natural deduction, but it remains unknown what their effect on Gentzen's thought was.

The Curry-Howard correspondence, from a paper by William Howard written in 1969 but published only in 1980, is based on the “formulas-as-types” principle, or in another jargon, the “propositions-as-sets” principle. A proposition is thought of as the set of its proofs. Truth of a proposition corresponds to the non-emptiness of the set. Proofs of A ⊃ B are now functions from (proofs of) A to (proofs of) B, and A ⊃ B itself is the set of such functions. Thus, if f : A ⊃ B and a : A, then functional application gives f(a) : B. The converse, corresponding to the introduction of an implication, is captured by the principle of functional abstraction of Alonzo Church's λ-calculus.
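The correspondence can be made concrete in any typed programming language. The following Python sketch (illustrative only; the function names are this example's, not Howard's notation) reads implications as function types, implication elimination as application, and implication introduction as lambda abstraction:

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

def elim(f: Callable[[A], B], a: A) -> B:
    """Implication elimination (modus ponens):
    from f : A -> B and a : A, application gives f(a) : B."""
    return f(a)

def compose(f: Callable[[A], B]) -> Callable[[Callable[[B], C]], Callable[[A], C]]:
    """A proof of (A -> B) -> ((B -> C) -> (A -> C)),
    built by two lambda abstractions (implication introductions)."""
    return lambda g: (lambda a: g(f(a)))
```

For instance, elim(str, 42) inhabits type str, witnessing that application takes a proof of A ⊃ B and a proof of A to a proof of B; compose is a closed term, i.e., a proof of its type from no assumptions.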

The Curry-Howard correspondence has made intuitionistic natural deduction part of the computer science curriculum: it gives a computational semantics for intuitionistic logic in which computations, and the executions of programs more generally, are effected through normalization. A proof of an implication A ⊃ B, say, is a program that converts data of type A into an output of type B. The construction of an object (proof, function, program) f of the type A ⊃ B ends with an abstraction. When an object a of type A is fed into f as an argument, the resulting expression is not normal, but has a form that corresponds to an introduction followed by an elimination. Normalization now is the same as the execution of the program f. The use of intuitionistic logic is not tied to any intuitionistic philosophy of mathematics, but is just a systematic guarantee of the termination of the execution of computer programs.

7. Sequent Calculus: later developments/applications

Gentzen's doctoral thesis marked the birth of structural proof theory, as contrasted to the old axiomatic proof theory of Hilbert. A remarkable step ahead in the development of systems of sequent calculus was taken by Oiva Ketonen in his doctoral thesis of 1944. Ketonen, a student of mathematics and philosophy in Helsinki, went to Göttingen in 1938 to study proof theory with Gentzen—being the closest to a student the latter ever had. The connection seems to have been established by Ketonen's philosophy professor Eino Kaila, who had met Gentzen in 1936 in Münster. Ketonen recollected later that Gentzen was “a sympathetic young man of few words” who gave him an introduction to the proof-theoretical systems and results. Ketonen's best-known discovery is a sequent calculus for classical propositional logic the logical rules of which are all invertible, meaning that whenever a sequent is of a form that matches the conclusion of a logical rule, the corresponding premisses, defined uniquely from the given sequent and the rule, are also derivable. The converse is immediate (just apply the rule). The rules L& and L⊃, for example, are modified into

   A, B, Γ → Δ
  --------------- L&
  A & B, Γ → Δ

  Γ → Δ, A     B, Γ → Δ
  ----------------------- L⊃
      A ⊃ B, Γ → Δ

There is just one left rule for conjunction (and dually just one right rule for disjunction). The left implication rule has what are called “shared contexts”: the assumptions and cases in the conclusion, except for the formula with the connective, are repeated identically in both premisses. Ketonen's idea was to define a system of proof search: one starts from a given sequent to be derived, chooses a formula in it, and writes the premisses of a rule that can conclude the given sequent. By invertibility, the question of derivability is replaced by one or two equivalent questions of derivability on simpler sequents. The new rules are needed to ensure uniquely defined premisses in such “root-first” decomposition.

Ketonen's proof of the invertibility of the logical rules of his sequent calculus used the structural rule of cut. Later Kurt Schütte (1950) and Haskell Curry (1963) gave direct proofs of invertibility, the latter with the explicit result that the inversions are height-preserving: if a given sequent is derivable in at most n steps, the premisses in a rule that can conclude that sequent also have derivations in at most n steps.

How much of Ketonen's work stems from suggestions on the part of Gentzen remains unknown, because no correspondence has been found. Ketonen writes in the preface of his thesis that “Dr. G. Gentzen of Göttingen directed me towards the problem area of this work”. The thesis was Ketonen's only original work in logic, saved from oblivion by a long review that Bernays wrote of it forThe Journal of Symbolic Logic in 1945.

One person who knew Ketonen's calculus in the late 1940s was Evert Beth. When Beth later, in 1955, presented his well-known tableau calculus, he seems to have forgotten its origin as a reformulation of Ketonen's calculus, referring instead to Kleene's influential Introduction to Metamathematics of 1952. Kleene had taken up Ketonen's calculus from Bernays' review, and also treated intuitionistic sequent calculus, in which invertibility is more restricted than in the classical calculus. With Kleene's book, Gentzen's sequent calculi became generally known and accessible.

Kleene's work of the early 1950s also pioneered a remarkable development in sequent calculus, namely the “contraction-free” classical and intuitionistic calculi today denoted by G3c and G3i. These calculi have the property that none of Gentzen's original “structural rules” are needed. The rule of “weakening” permits the addition of superfluous cases and assumptions, and the rule of “contraction” the deletion of one copy of a formula if two were present in a list, as in

  Weakening                    Contraction

   Γ → Δ                       A, A, Γ → Δ
  ----------- Wk              -------------- Ctr
  A, Γ → Δ                     A, Γ → Δ

Analogous rules permit weakening and contraction in the right, succedent parts of sequents. Weakening is made an eliminable rule by letting initial sequents have the form A, Γ → Δ, A instead of Gentzen's A → A. Contraction is likewise made eliminable by a suitable formulation of the rules. The import is that in root-first proof search, no rules need be applied that would produce a duplication of a formula in a premiss. Without this result, termination of proof search would not follow.

The classical calculus has the property, mentioned above, of height-preserving invertibility of its logical rules. In the late 1970s, Albert Dragalin refined the calculus into one in which the structural rules are moreover “height-preserving admissible”, meaning that whenever the premiss of such a rule is derivable, the conclusion is derivable without the rule and with at most the same size (maximum number of rule instances in a derivation branch) of derivation. This property has profound effects on cut elimination: in permuting cut up, Gentzen had to restore the original contexts (the Γ's and Δ's) through weakenings and contractions. With the height-preserving admissibility of these rules, the size of a derivation does not increase when the rules are applied. Dragalin also gave an intuitionistic multisuccedent calculus with the same type of admissibility of the structural rules. Troelstra, finally, gave in the textbook Basic Proof Theory (2000, first ed. 1996) a single-succedent intuitionistic calculus with height-preserving admissibility of weakening and contraction. The contraction-free sequent calculi are powerful tools for the analysis of formal derivations. Many difficult research results in logic become mere exercises through the control over the structure of proofs that the G3-calculi permit.

The earliest application of sequent calculus in mathematics was in the proof theory of arithmetic, in Gentzen's thesis and, in a decisive way, in the 1938 proof of the consistency of arithmetic. Troelstra mentions Ketonen's work as

an early analysis of cutfree proofs in Gentzen calculi with axioms; but he considers the form of cutfree derivations in the pure calculus where axioms are present in the antecedent of the sequents derived. (Troelstra and Schwichtenberg 2000: 142)

The axioms that Ketonen considers are those of projective and affine geometry, the former taken from Skolem's 1920 paper discussed in the first section above. Ketonen wanted to formulate Skolem's formal rules of proof within sequent calculus. However, Ketonen's work was mostly known only through its review by Bernays, and only the logical part on sequent calculus was explained in detail there.

A second way to apply the sequent calculus is to let sequents that begin derivation branches have, in addition to initial sequents, also the form → A, in which A is an axiom, or an instance of a universal axiom. Now, by Gentzen's “extended Hauptsatz”, cuts in derivations can be permuted up until one of their premisses is an axiom, but these cuts on axioms remain. Another, newer method is to convert axioms into extra rules that are added to the logical rules of sequent calculus, with full cut elimination maintained (as explained in Negri and von Plato 2001, chapter 6, and in Troelstra and Schwichtenberg 2000, chapter 4.7).

8. The aims of proof theory

To what extent has proof theory achieved its original aims? For Hilbert, the aims were a complete clarification of the foundational problems through finitary proofs of consistency, etc., aims in which proof theory failed. Hilbert in his program was not interested in the study of mathematical proofs in themselves, but only in clearing the central foundational problems (and then forgetting about them). A recently found note by Hilbert gives a different picture: the note states that Hilbert wanted to add, as a 24th and last problem in his famous Paris list of open mathematical problems of 1900, the development of “a theory of proof methods in mathematics”. This was before his metamathematical program for the development of a proof theory emerged.

For Gentzen, the aims were, along with those of Hilbert, to understand the structure of mathematical proofs. This program was a complete success as far as pure logic and arithmetic are concerned. The methods of sequent calculus, especially, permit the analysis of proofs with profound results. The grand aim of proof theory, a proof of the consistency of analysis as in Hilbert's second Paris problem, has not been carried through but is not excluded, either.

Some understanding of the notion of proof is necessary for any mathematician, if for nothing else, then at least for the communicability of mathematical results: publication rests on the understanding that proofs can be made so explicit as to be routinely checkable for correctness. However, proof theory has so far not become a practical tool for the working mathematician; the applications in mathematics have been rather isolated cases. Recent work on formalizing mathematical proofs with computerized systems, called proof editors, may gradually change this picture.

Proof theory has created new aims outside traditional mathematics, especially in connection with computer science. Topics such as the verification of correctness of computer programs are an outgrowth of proof theory. Natural deduction has led to the Curry-Howard correspondence and to connections with functional programming, and sequent calculus is often used in systems of automatic proof search, as in logic programming.

Bibliography

Texts on Proof Theory

  • Buss, Sam (ed.), 1998, Handbook of Proof Theory, Amsterdam: Elsevier.
  • Negri, S. and J. von Plato, 2001, Structural Proof Theory, Cambridge: Cambridge University Press.
  • von Plato, J., 2013, Elements of Logical Reasoning, Cambridge: Cambridge University Press.
  • Takeuti, G., 1987, Proof Theory, 2nd edition, Amsterdam: North-Holland.
  • Troelstra, A. and H. Schwichtenberg, 2000, Basic Proof Theory, 2nd edition, Cambridge: Cambridge University Press.

Original Works and Their Reprints

  • Ackermann, W., 1940, “Zur Widerspruchsfreiheit der Zahlentheorie,” Mathematische Annalen, 117: 162–194.
  • Beth, E., 1955, Semantic Entailment and Formal Derivability (Mededelingen der Koninklijke Nederlandse Akademie van Wetenschappen, Afd. Letterkunde, Nieuwe reeks, deel 18, no. 13), Amsterdam: North-Holland.
  • Church, A., 1936, “A Note on the Entscheidungsproblem,” Journal of Symbolic Logic, 1: 40–41.
  • Curry, H. and R. Feys, 1958, Combinatory Logic (Studies in Logic and the Foundations of Mathematics, Vol. I), 1st edition, Amsterdam: North-Holland.
  • Curry, H., 1963, Foundations of Mathematical Logic, New York: McGraw-Hill; reprinted New York: Dover, 1977.
  • Dragalin, A., 1988, Mathematical Intuitionism: Introduction to Proof Theory, Providence, RI: American Mathematical Society.
  • Gentzen, G., 1934–1935, Untersuchungen über das logische Schliessen (Investigations into Logical Inference), Ph.D. thesis, Universität Göttingen. Published in Gentzen 1969: 68–131.
  • Gentzen, G., 1938, “Neue Fassung des Widerspruchsfreiheitsbeweises für die reine Zahlentheorie,” Forschungen zur Logik und zur Grundlegung der exakten Wissenschaften, Neue Folge 4, S. Hirzel, 19–44. Translated in Gentzen 1969: 252–286.
  • Gentzen, G., 1943, “Beweisbarkeit und Unbeweisbarkeit von Anfangsfällen der transfiniten Induktion in der reinen Zahlentheorie,” Mathematische Annalen, 119: 252–286. Translated in Gentzen 1969: 287–308.
  • Gentzen, G., 1969, The Collected Papers of Gerhard Gentzen, M. Szabo (ed.), Amsterdam: North-Holland.
  • Gentzen, G., 2008, “The normalization of derivations,” The Bulletin of Symbolic Logic, 14(1): 245–257.
  • Gödel, K., 1930, “Die Vollständigkeit der Axiome des logischen Funktionenkalküls,” Monatshefte für Mathematik und Physik, 37: 349–360.
  • Gödel, K., 1958, “Über eine bisher noch nicht benützte Erweiterung des finiten Standpunktes,” Dialectica, 12: 280–287.
  • Gödel, K., 1986–2003, Collected Works (Volumes I–V), S. Feferman et al. (eds.), Oxford: Oxford University Press.
  • van Heijenoort, J., 1967, From Frege to Gödel, Cambridge, MA: Harvard University Press.
  • Hilbert, D., 1899, Grundlagen der Geometrie, Leipzig: B.G. Teubner.
  • Hilbert, D., 1926, “Über das Unendliche” (“On the infinite”), Mathematische Annalen, 95: 161–190. [Lecture given in Münster, 4 June 1925.]
  • Hilbert, D. and W. Ackermann, 1928, Grundzüge der theoretischen Logik, Berlin: Springer.
  • Hilbert, D., 1931, “Die Grundlegung der elementaren Zahlenlehre,” Mathematische Annalen, 104: 484–494.
  • Howard, W., 1980 [1969], “The formulae-as-types notion of construction,” in J. Seldin and J. Hindley (eds.), To H. B. Curry: Essays on Combinatory Logic, Lambda Calculus and Formalism, London and New York: Academic Press, pp. 480–490.
  • Jaskowski, S., 1934, “On the rules of supposition in formal logic,” in S. McCall (ed.), Polish Logic 1920–1939, Oxford: Clarendon Press, 1967, pp. 232–258.
  • Ketonen, O., 1944, Untersuchungen zum Prädikatenkalkül (Annales Academiae Scientiarum Fennicae, Ser. A.I. 23), Helsinki.
  • Kleene, S., 1952, Introduction to Metamathematics, Amsterdam: North-Holland.
  • Kreisel, G., 1951, “On the interpretation of non-finitist proofs: Part I,” The Journal of Symbolic Logic, 16(4): 241–267.
  • Löwenheim, L., 1915, “Über Möglichkeiten im Relativkalkül,” Mathematische Annalen, 76(4): 447–470.
  • Menzler-Trott, E., 2001, Gentzens Problem, Berlin: Birkhäuser Verlag.
  • Menzler-Trott, E., 2007, Logic's Lost Genius: The Life and Work of Gerhard Gentzen, Providence, RI: American Mathematical Society.
  • Prawitz, D., 1965, Natural Deduction: A Proof-Theoretical Study, Stockholm: Almqvist & Wiksell; reprinted with a new preface, New York: Dover, 2006.
  • –––, 1971, “Ideas and results in proof theory,” in J. Fenstad (ed.), Proceedings of the Second Scandinavian Logic Symposium, Amsterdam: North-Holland, pp. 235–308.
  • Rathjen, M., 1995, “Recent advances in ordinal analysis: Π¹₂-CA and related systems,” The Bulletin of Symbolic Logic, 1(4): 468–485.
  • Schütte, K., 1950, “Schlussweisen-Kalküle der Prädikatenlogik,” Mathematische Annalen, 122: 47–65.
  • –––, 1951, “Beweistheoretische Erfassung der unendlichen Induktion in der Zahlentheorie,” Mathematische Annalen, 122: 369–389.
  • Skolem, T., 1920, “Logisch-kombinatorische Untersuchungen über die Erfüllbarkeit oder Beweisbarkeit mathematischer Sätze, nebst einem Theoreme über dichte Mengen,” translated and reprinted in Selected Works in Logic, J.E. Fenstad (ed.), Oslo: Universitetsforlaget, 1970, pp. 103–136.
  • Whitehead, A.N. and B. Russell, 1910–1913, Principia Mathematica, Cambridge: Cambridge University Press.

Secondary literature

  • Bernays, P., 1945, “Review: Oiva Ketonen, Untersuchungen zum Prädikatenkalkül,” The Journal of Symbolic Logic, 10(4): 127–130.
  • –––, 1965, “Betrachtungen zum Sequenzen-Kalkül,” in Contributions to Logic and Methodology in Honor of J. M. Bochenski, Amsterdam: North-Holland, pp. 1–44.
  • –––, 1979, “On the original Gentzen consistency proof for number theory,” in J. Myhill et al. (eds.), Intuitionism and Proof Theory, Amsterdam: North-Holland, pp. 409–417.
  • Feferman, S., 2000, “Highlights in proof theory,” in V. Hendricks et al. (eds.) 2000, pp. 11–31.
  • Hempel, C., 2000, “An intellectual autobiography,” in J. Fetzer (ed.), Science, Explanation, and Rationality, pp. 3–35.
  • Hendricks, V., et al. (eds.), 2000, Proof Theory: History and Philosophical Significance, Dordrecht: Kluwer.
  • Mancosu, P., 1999, “Between Berlin and Vienna: The immediate reception of Gödel's incompleteness theorems,” History and Philosophy of Logic, 20: 33–45.
  • von Plato, J., 2007, “In the shadows of the Löwenheim-Skolem theorem: early combinatorial analyses of mathematical proofs,” The Bulletin of Symbolic Logic, 13(2): 189–225.
  • –––, 2007, “From Hilbert's programme to Gentzen's programme,” in the Appendix to Menzler-Trott 2007.
  • –––, 2009, “Gentzen's logic,” in Handbook of the History and Philosophy of Logic (Volume 5), Amsterdam: Elsevier.
  • –––, 2012, “Gentzen's proof systems: byproducts in a program of genius,” The Bulletin of Symbolic Logic, 18(3): 313–367.
  • Smorynski, C., 2007, “Hilbert's programme,” in the Appendix to Menzler-Trott 2007.
  • Tait, W., 2005, “Gödel's reformulation of Gentzen's first consistency proof for arithmetic: the no-counterexample interpretation,” The Bulletin of Symbolic Logic, 11(2): 225–238.


Copyright © 2014 by
Jan von Plato


The Stanford Encyclopedia of Philosophy is copyright © 2023 by The Metaphysics Research Lab, Department of Philosophy, Stanford University

Library of Congress Catalog Data: ISSN 1095-5054
