Gödel initiated the program of finding and justifying axioms that effect a significant reduction in incompleteness, and he drew a fundamental distinction between intrinsic and extrinsic justifications. Reflection principles are the most promising candidates for new axioms that are intrinsically justified. Taking as our starting point Tait’s work on general reflection principles, we prove a series of limitative results concerning this approach. These results collectively show that general reflection principles are either weak or inconsistent. The philosophical significance of these results is discussed.
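As background for the notion at issue (a standard formulation, not part of the abstract itself), the first-order reflection schema of ZFC can be stated as follows; Tait’s general reflection principles extend this pattern to higher-order languages:

```latex
% First-order (Levy) reflection schema: for each formula \varphi(x_1,\dots,x_n)
% of the language of set theory, ZFC proves
\forall \beta \,\exists \alpha > \beta \;\;
  \forall x_1,\dots,x_n \in V_\alpha
  \bigl( \varphi(x_1,\dots,x_n) \leftrightarrow \varphi^{V_\alpha}(x_1,\dots,x_n) \bigr)
% where \varphi^{V_\alpha} is \varphi with its quantifiers relativized to V_\alpha.
% "General" reflection principles in Tait's sense allow higher-order formulas
% and parameters; the limitative results concern how far this strengthening
% can go before it becomes inconsistent.
```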
Gödel regarded the Dialectica interpretation as giving constructive content to intuitionism, which otherwise failed to meet reasonable conditions of constructivity. He founded his theory of primitive recursive functions of finite type, in which the interpretation is given, on the concept of computable function of finite type. I will (1) criticize this foundation, (2) propose a quite different one, and (3) note that essentially the latter foundation also underlies the Curry-Howard type theory, and hence Heyting's intuitionistic conception of logic. Thus the Dialectica interpretation (in so far as its aim was to give constructive content to intuitionism) is superfluous.
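As an illustration of the formal system in the background (a minimal sketch under my own choice of names and encoding, not drawn from the paper), Gödel’s theory T of primitive recursive functionals of finite type can be rendered as a tiny typed program in which the recursor is the sole scheme of definition beyond lambda abstraction:

```haskell
-- Minimal sketch of Goedel's T (names and encoding are my own choices):
-- finite types are built from Nat by ->, and the recursor `rec` is
-- available at every type. Terms of T denote the primitive recursive
-- functionals of finite type that the Dialectica interpretation draws on.

data Nat = Zero | Succ Nat

-- Recursor: defines a function Nat -> a by primitive recursion from a
-- base case and a step; `a` may itself be a function type.
rec :: a -> (Nat -> a -> a) -> Nat -> a
rec base _    Zero     = base
rec base step (Succ n) = step n (rec base step n)

-- Example at base type: addition.
add :: Nat -> Nat -> Nat
add m = rec m (\_ r -> Succ r)

-- Example at higher type: n-fold iteration, with rec used at type (a -> a).
iterN :: (a -> a) -> Nat -> a -> a
iterN f = rec id (\_ g -> f . g)
```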
The last section of “Lecture at Zilsel’s” [9, §4] contains an interesting but quite condensed discussion of Gentzen’s first version of his consistency proof for PA [8], reformulating it as what has come to be called the no-counterexample interpretation. I will describe Gentzen’s result (in game-theoretic terms), fill in the details (with some corrections) of Gödel’s reformulation, and discuss the relation between the two proofs.
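To fix ideas about the term (a standard formulation, not taken from the paper itself), the no-counterexample interpretation of a prenex arithmetical sentence can be stated as follows:

```latex
% Standard formulation (not from the paper). Let A be a sentence of
% arithmetic in prenex form, say
A \;\equiv\; \exists y_1\,\forall z_1\,\exists y_2\,\forall z_2\;
  A_0(y_1,z_1,y_2,z_2),
% with A_0 quantifier-free. A "counterexample" to A would be functions
% f_1, f_2 with \neg A_0(y_1, f_1(y_1), y_2, f_2(y_1,y_2)) for all y_1, y_2.
% The no-counterexample interpretation of A asserts that there are
% functionals Y_1, Y_2 defeating every such attempt: for all f = (f_1, f_2),
A_0\bigl(Y_1(f),\; f_1(Y_1(f)),\; Y_2(f),\; f_2(Y_1(f), Y_2(f))\bigr).
% Game-theoretically, Y_1, Y_2 encode a winning strategy for the defender
% of A against any opponent strategy (f_1, f_2); by Kreisel's analysis of
% Gentzen-style proofs, for theorems of PA such functionals can be chosen
% recursive in ordinals below \varepsilon_0.
```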
Ranging from Alan Turing’s seminal 1936 paper to the latest work on Kolmogorov complexity and linear logic, this comprehensive new work clarifies the relationship between computability on the one hand and constructivity on the other. The authors argue that even though constructivists have largely shed Brouwer’s solipsistic attitude to logic, there remain points of disagreement to this day. Focusing on the growing pains computability experienced as it was forced to address the demands of rapidly expanding applications, the content maps the developments following Turing’s ground-breaking linkage of computation and the machine, the resulting birth of complexity theory, the innovations of Kolmogorov complexity, and the resolution of the dissonances between proof-theoretic semantics and canonical proof feasibility. Finally, it explores one of the most fundamental questions concerning the interface between constructivity and computability: whether the theory of recursive functions is needed for a rigorous development of constructive mathematics. This volume contributes to the unity of science by overcoming disunities rather than offering an overarching framework. It posits that computability’s adoption of a classical, ontological point of view kept these imperatives separated. In studying the relationship between the two, it is a vital step forward in overcoming the disagreements and misunderstandings which stand in the way of a unifying view of logic.
This dissertation makes two primary contributions. The first three chapters develop an interpretation of Carnap's Meta-Philosophical Program which places stress upon his methodological analysis of the sciences over and above the Principle of Tolerance. Most important, I suggest, is that Carnap sees philosophy as contiguous with science—as a part of the scientific enterprise—and thus as utilizing the very same methods and subject to the same limitations. I argue that the methodological reforms he suggests for philosophy amount to philosophy as the explication of the concepts of science through the construction and use of suitably robust meta-logical languages. My primary interpretive claim is that Carnap's understanding of logic and mathematics as a set of formal auxiliaries is premised upon this prior analysis of the character of logico-mathematical knowledge, his understanding of its role in the language of science, and the methods used by practicing mathematicians. Thus the Principle of Tolerance, and so Carnap's logical pluralism, is licensed and justified by these methodological insights. This interpretation of Carnap's program contrasts with the popular Deflationary reading as proposed by Goldfarb & Ricketts. The leading idea they attribute to Carnap is a Logocentrism: that philosophical assertions are always made relative to some particular language, and that our choice of syntactical rules for a language is constitutive of its inferential structure and methods of possible justification. Consequently, Tolerance is considered the foundation of Carnap's entire program. My third chapter argues that this reading makes Carnap's program philosophically inert, and I present significant evidence that such a reading is misguided. The final chapter attempts to extend the methodological ideals of Carnap's program to the analysis of the ongoing debate between category- and set-theoretic foundations for mathematics. Recent criticism of category theory as a foundation charges that it is neither autonomous from set theory nor able to offer a suitable ontological grounding for mathematics. I argue that an analysis of concepts can be foundationally informative without requiring the construction of those concepts from first principles, and that ontological worries can be seen as methodologically unfruitful.