General-elimination harmony articulates Gentzen's idea that the elimination-rules are justified if they allow one to infer from an assertion no more than can already be inferred from the grounds for making it. Dummett described the rules as not only harmonious but stable if the E-rules allow one to infer no more and no less than the I-rules justify. Pfenning and Davies call the rules locally complete if the E-rules are strong enough to allow one to infer the original judgement. A method is given of generating harmonious general-elimination rules from a collection of I-rules. We show that the general-elimination rules satisfy Pfenning and Davies' test for local completeness, but question whether that is enough to show that they are stable. Alternative conditions for stability are considered, including equivalence between the introduction- and elimination-meanings of a connective, and recovery of the grounds for assertion, finally generalizing the notion of local completeness to capture Dummett's notion of stability satisfactorily. We show that the general-elimination rules meet the last of these conditions, and so are indeed not only harmonious but also stable.
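For orientation, a standard textbook illustration of the notions involved (not taken from the paper itself): for conjunction, the introduction rule and the corresponding general-elimination rule can be displayed as

$$
\frac{A \qquad B}{A \wedge B}\;(\wedge\mathrm{I})
\qquad\qquad
\frac{A \wedge B \qquad \begin{array}{c} [A]\;[B] \\ \vdots \\ C \end{array}}{C}\;(\wedge\mathrm{GE})
$$

Local completeness in Pfenning and Davies' sense is then witnessed by taking C to be $A \wedge B$ itself: from $A \wedge B$ one may assume $A$ and $B$, reintroduce $A \wedge B$ by $\wedge$I, and discharge the assumptions by $\wedge$GE, recovering the original judgement.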
The idea of an 'inversion principle', and the name itself, originated in the work of Paul Lorenzen in the 1950s, as a method to generate new admissible rules within a certain syntactic context. Some fifteen years later, the idea was taken up by Dag Prawitz to devise a strategy of normalization for natural deduction calculi (this being an analogue of Gentzen's cut-elimination theorem for sequent calculi). Later, Prawitz used the inversion principle again, attributing to it a semantic role. Still working in natural deduction calculi, he formulated a general type of schematic introduction rules to be matched, thanks to the idea supporting the inversion principle, by a corresponding general schematic elimination rule. This was an attempt to provide a solution to the problem suggested by the often quoted note of Gentzen, according to which "it should be possible to display the elimination rules as unique functions of the corresponding introduction rules on the basis of certain requirements". Many people have since worked on this topic, which can be appropriately seen as the birthplace of what are now referred to as 'general elimination rules', recently studied thoroughly by Sara Negri and Jan von Plato. In this study, we retrace the main threads of this chapter of proof-theoretical investigation, using Lorenzen's original framework as a general guide.
What is predicativity? While the term suggests that there is a single idea involved, what the history will show is that there are a number of ideas of predicativity which may lead to different logical analyses, and I shall uncover these only gradually. A central question will then be what, if anything, unifies them. Though early discussions are often muddy on the concepts and their employment, in a number of important respects they set the stage for the further developments, and so I shall give them special attention. NB. Ahistorically, modern logical and set-theoretical notation will be used throughout, as long as it does not conflict with original intentions.
In proof-theoretic semantics the meaning of an atomic sentence is usually determined by a set of derivations in an atomic system which contain that sentence as a conclusion (see, in particular, Prawitz, 1971, 1973). The paper critically discusses this standard approach and suggests an alternative account which proceeds in terms of subatomic introduction and elimination rules for atomic sentences. A simple subatomic normal form theorem, which underpins this account of the semantics of atomic sentences and of the terms from which they are composed, shows moreover that the proof-theoretic analysis of first-order logic can be pursued beneath the atomic level as well.
We present our calculus of higher-level rules, extended with propositional quantification within rules. This makes it possible to present general schemas for introduction and elimination rules for arbitrary propositional operators and to define what it means that introductions and eliminations are in harmony with each other. This definition does not presuppose any logical system, but is formulated in terms of rules themselves. We therefore speak of a foundational account of proof-theoretic harmony. With every set of introduction rules a canonical elimination rule, and with every set of elimination rules a canonical introduction rule is associated in such a way that the canonical rule is in harmony with the set of rules it is associated with. An example given by Hazen and Pelletier is used to demonstrate that there are significant connectives which are characterized by their elimination rules and whose introduction rule is the canonical introduction rule associated with these elimination rules. Due to the availability of higher-level rules and propositional quantification, the means of expression of the framework developed are sufficient to ensure that the construction of canonical elimination or introduction rules is always possible and does not lead out of this framework.
A general definition theory should serve as a foundation for the mathematical study of definitional structures. The central notion of such a theory is a precise explication of the intuitively given notion of a definitional structure. The purpose of this paper is to discuss the proof theory of partial inductive definitions as a foundation for this kind of a more general definition theory. Among the examples discussed is a suggestion for a more abstract definition of lambda-terms (derivations in natural deduction) that could provide a basis for a more systematic definitional approach to general proof theory.
Argument & Computation, Volume 3, Issue 2–3, Pages 83–86, June–September 2012.
Abstract: In this article we outline a proposal for developing a logic of fictions from the game-theoretical perspective of dialogical pragmatism. Focusing on one of the major criticisms of the classical approach to logic, namely the structural schizophrenia of its semantics (Lambert 2004: 142-143; 160), we survey the ontological commitments of the two major traditions in logic (Aristotle and Frege) in order to establish their possibilities and limits for the analysis of fictional discourse, and to show how these limits can be overcome from a game-theoretical pragmatic perspective. Keywords: logic; dialogical logic; square of opposition; schizophrenia; game-theoretical semantics; existential quantifier; proper names; intuitionism; excluded middle; double negation; fictions.
Kant's theory of arithmetic is not only a central element in his theoretical philosophy but also an important contribution to the philosophy of arithmetic as such. However, modern mathematics, especially non-Euclidean geometry, has placed much pressure on Kant's theory of mathematics. But objections against his theory of geometry do not necessarily correspond to arguments against his theory of arithmetic and algebra. The goal of this article is to show that at least some important details in Kant's theory of arithmetic can be picked up, improved by reconstruction and defended under a contemporary perspective: the theory of numbers as products of rule-following construction presupposing successive synthesis in time, and the theory of arithmetic equations, sentences or "formulas"—as Kant says—as synthetic a priori. In order to do so, two calculi in terms of modern mathematics are introduced which formalise Kant's theory of addition as a form of synthetic operation.
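As a bare illustration of the idea of number construction by successive synthesis (my own sketch under that reading, not one of the two calculi the paper introduces; the function names are illustrative only):

    # Addition treated as a rule-following construction: iterate the
    # successor step n times starting from m, rather than appealing to
    # a primitive "plus" fact. Names are illustrative, not the paper's.

    def successor(n):
        # One step of the counting construction.
        return n + 1

    def add(m, n):
        result = m
        for _ in range(n):
            result = successor(result)
        return result

    # Kant's own example of a synthetic a priori judgement: 7 + 5 = 12.
    assert add(7, 5) == 12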
The ancient dualism of a sensible and an intelligible world, important in Neoplatonic and medieval philosophy down to Descartes and Kant, would seem to be supplanted today by a scientific view of mind-in-nature. Here, we revive the old dualism in a modified form, and describe mind as a symbolic language, founded in linguistic recursive computation according to the Church-Turing thesis, constituting a world L that serves the human organism as a map of the Universe U. This methodological distinction of L vs. U helps to understand how and why structures of phenomena come to be opposed to their nature in human thought, a central topic in Heideggerian philosophy. U is uncountable according to Georg Cantor's set theory, but Language L, based on the recursive function system, is countable, and anchored in a Gray Area within U of observable phenomena, typically symbols, prelinguistic structures, genetic-historical records of their origins. Symbols, the phenomena most familiar to mathematicians, are capable of being addressed in L-processing. The Gray Area is the human Environment E, where we can live comfortably, that we manipulate to create our niche within hostile U, with L offering overall competence of the species to survive. The human being is seen in the light of his or her linguistic recursively computational mind. Nature U, by contrast, is the unfathomable abyss of being, infinite labyrinth of darkness, impenetrable and hostile to man. The U-man, biological organism, is a stranger in L-man, the mind-controlled rational person, as expounded by Saint Paul. Noumena can now be seen to reside in L, and are not fully supported by phenomena. Kant's noumenal cause is the mental L-image of only partly phenomenal causation. Mathematics occurs naturally in pre-linguistic phenomena, including natural laws, which give rise to pure mathematical structures in the world of L. Mathematical foundation within philosophy is reversed to where natural mathematics in the Gray Area of pre-linguistic phenomena can be seen to be a prerequisite for intellectual discourse. Lesser, nonverbal versions of L based on images are shared with animals.
"The confusion of a logical with a real predicate," according to the Critique of Pure Reason, "is almost beyond correction". Kant did not assert that existence is no predicate, but that it is only a "logical" one, and not a "real" one. Much the same thing has been said about identity, although Kant himself thought it is real and not logical. We have long lacked a rigorous criterion to distinguish real from logical predicates, and hence have not been able to say why the difference matters. This paper has two objects. First, it provides a demarcation between real and logical predicates that confirms Kant's dictum that existence is only "logical." Secondly, it states the theory of a "logical" relation of identity. Perhaps this is not the only identity relation. I show only that once it has been precisely defined in the right setting, there are definite answers to a number of disputed questions about identity. Maybe there are other concepts of identity for which different answers are to be given, but I shall not discuss that disagreeable prospect here. A third application concerns the ontological argument.
There is an ambiguity in the concept of deductive validity that went unnoticed until the middle of the twentieth century. Sometimes an inference rule is called valid because its conclusion is a theorem whenever its premises are. But often something different is meant: the rule's conclusion follows from its premises even in the presence of other assumptions. In many logical environments, these two definitions pick out the same rules. But other environments are context-sensitive, and in these environments the second notion is stronger. Sorting out this ambiguity has led to profound mathematical investigations with applications in complexity theory and computer science. The origins of this ambiguity and the history of its resolution deserve philosophical attention, because our understanding of logic stands to benefit from their details. I am eager to examine together with you, Crito, whether this argument will appear in any way different to me in my present circumstances, or whether it remains the same, whet...
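In the now-standard terminology (my gloss, not the paper's wording), the two notions contrasted here are usually called admissibility and derivability. For a single-premise rule from $A$ to $B$ they can be displayed as

$$
\text{admissible:}\quad {\vdash A} \;\Rightarrow\; {\vdash B}
\qquad\qquad
\text{derivable:}\quad \Gamma, A \vdash B \ \text{ for every } \Gamma
$$

In classical propositional logic, for instance, the two coincide, whereas in other environments derivability is the strictly stronger requirement.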
The main goal of quantum logic is the bottom-up reconstruction of quantum mechanics in Hilbert space. Here we discuss the question whether quantum logic is an empirical structure or a priori valid. There are good reasons for both possibilities. First, with respect to the possibility of a rational reconstruction of quantum mechanics, quantum logic follows a priori from quantum ontology and can thus not be considered as a law of nature. Second, since quantum logic allows for a reconstruction of quantum mechanics, self-referential consistency requires that the empirical content of quantum mechanics must be compatible with the presupposed quantum ontology. Hence, quantum ontology contains empirical components that are also contained in quantum logic. Consequently, in this sense quantum logic is also a law of nature.
We study a generalization of the standard syntax and game-theoretic semantics of logic, which is based on a duality between two players, to a multiplayer setting. We define propositional and modal languages of multiplayer formulas, and provide them with a semantics involving a multiplayer game. Our focus is on the notion of equivalence between two formulas, which is defined by saying that two formulas are equivalent if under each valuation, the set of players with a winning strategy is the same in the two respective associated games. We provide a derivation system which enumerates the pairs of equivalent formulas, both in the propositional case and in the modal case. Our approach is algebraic: we introduce multiplayer algebras as the analogue of Boolean algebras, and show, as the corresponding analogue of Stone's theorem, that these abstract multiplayer algebras can be represented as concrete ones which capture the game-theoretic semantics. For the modal case we prove a similar result. We also address the computational complexity of the problem whether two given multiplayer formulas are equivalent. In the propositional case, we show that this problem is co-NP-complete, whereas in the modal case, it is PSPACE-hard.
Protoscience and Reconstruction. A central concept of the constructivist philosophy of science is the term 'protoscience'. From an orthodox point of view, protosciences are bound to provide the so-called 'measurement-theoretical Apriori' for a science. Protophysics, for example, defines the quantities 'length', 'time', and 'mass'. Thereby it yields some basic physical laws, which usually are regarded as "laws of nature" but in fact follow already from the definitions of the basic quantities. The attempt to establish protodisciplines other than protophysics is traditionally regarded as not very promising, because other sciences do not, as physics does, build their main theories on certain basic quantities. Nevertheless, enterprises such as "protochemistry", "protobiology" and "protopsychology" have recently appeared on the scene. Does this mark a breakthrough in constructivist philosophy of science, or is this multiplication of protosciences no more than a promotion strategy? In the article it is shown that the orthodox definition of 'protoscience' is in fact far too narrow. An alternative definition is proposed which on the one hand preserves the classic tasks of protophysics but on the other hand allows for other protosciences as equally useful enterprises. A central concept within the complex topic of "protoscience" is that of 'reconstruction'. It can be shown that there is a certain ambiguity in the use of this critical concept. Therefore the article ends with a reconstruction of the term 'reconstruction'.
The article deals with Cantor's argument for the non-denumerability of the reals, somewhat in the spirit of Lakatos' logic of mathematical discovery. At the outset, Cantor's proof is compared with some other famous proofs, such as Dedekind's recursion theorem, showing that rather than being ordinary proofs they are resolutions to do things differently. Based on this I argue that there are "ontologically" safer ways of developing the diagonal argument into a full-fledged theory of the continuum, concluding eventually that the famous semantic paradoxes based on the diagonal construction are caused by a superficial understanding of what a name is.
The article deals with Cantor’s diagonal argument and its alleged philosophical consequences such as that there are more reals than integers and, hence, that some of the reals must be independent of language because the totality of words and sentences is always countable. My claim is that the main flaw of the argument for the existence of non-nameable objects or truths lies in a very superficial understanding of what a name or representation actually is.
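As a reminder of the construction both of these abstracts discuss, here is a generic sketch of the diagonal argument over infinite binary sequences (an illustration, not the papers' own analysis; the names `enumeration` and `diagonal` are mine):

    # Each infinite binary sequence is represented as a function from
    # an index to a bit. Given any enumeration of such sequences, the
    # diagonal sequence differs from the n-th one at position n.

    def diagonal(enumeration):
        """Return a sequence that disagrees with the n-th enumerated
        sequence at index n, for every n."""
        return lambda n: 1 - enumeration(n)(n)

    # Example enumeration: the n-th sequence is constantly n % 2.
    enumeration = lambda n: (lambda k: n % 2)
    d = diagonal(enumeration)

    # The diagonal sequence cannot occur anywhere in the enumeration,
    # since it differs from each listed sequence at the diagonal spot.
    assert all(d(n) != enumeration(n)(n) for n in range(100))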
The aim of this paper is to reconsider several proposals that have been put forward in order to develop a Proof-Theoretical Semantics, from the by now classical neo-verificationist approach provided by D. Prawitz and M. Dummett in the Seventies, to an alternative, more recent approach mainly due to the work of P. Schroeder-Heister and L. Hallnäs, based on clausal definitions. Some other intermediate proposals are very briefly sketched. Particular attention will be given to the role played by the so-called Fundamental Assumption. We claim that whereas, in the neo-verificationist proposal, the condition expressed by that Assumption is necessary to ensure the completeness of the justification procedure (from the outside, so to speak), within the definitional framework it is a built-in feature of the proposal. The latter approach, therefore, appears as an alternative solution to the problem which prompted the neo-verificationists to introduce the Fundamental Assumption.
The purpose of this brief note is to prove a limitative theorem for a generalization of the deduction theorem. I discuss the relationship between the deduction theorem and rules of inference. Often when the deduction theorem is claimed to fail, particularly in the case of normal modal logics, it is the result of a confusion over what the deduction theorem is trying to show. The classic deduction theorem is trying to show that all so-called 'derivable rules' can be encoded into the object language using the material conditional. The deduction theorem can be generalized in the sense that one can attempt to encode all types of rules into the object language. When a rule is encoded in this way I say that it is reflected in the object language. What I show, however, is that certain logics which reflect a certain kind of rule must be trivial. Therefore, my generalization of the deduction theorem does fail where the classic deduction theorem didn't.
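For reference, the classical deduction theorem alluded to here can be stated as follows (a standard formulation, not a quotation from the paper):

$$
\Gamma \cup \{A\} \vdash B \quad\Longleftrightarrow\quad \Gamma \vdash A \rightarrow B
$$

The left-to-right direction is what allows derivable rules to be reflected by the material conditional in the object language; the generalization discussed in the abstract asks whether the same can be done for other kinds of rules.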
What is a minimal proof-theoretical foundation of logic? Two different ways to answer this question may appear to offer themselves: reduce the whole of logic either to the relation of inference, or else to the property of incompatibility. The first way would involve defining logical operators in terms of the algebraic properties of the relation of inference, with conjunction $A \wedge B$ as the infimum of A and B, negation $\lnot A$ as the minimal incompatible of A, etc. The second way involves introducing logical operators in terms of the relation of incompatibility, such that X is incompatible with $\{\lnot A\}$ iff every Y incompatible with X is incompatible with $\{A\}$; and X is incompatible with $\{A \wedge B\}$ iff X is incompatible with $\{A, B\}$; etc. Whereas the first route leads us naturally to intuitionistic logic, the second leads us to classical logic. The aim of this paper is threefold: to investigate the relationship of the two approaches within a very general framework, to discuss the viability of erecting logic on such austere foundations, and to find out whether, in choosing one of the ways, we are inevitably led to a specific logical system.
Recent literature on dialogical logic discusses the case of tonk and the notion of harmony in the context of a rule-based theory of meaning. Since the publication of those papers, a dialogical version of constructive type theory has been developed. The aim of the present paper is to show that, from the dialogical point of view, the harmony of the CTT-rules is the consequence of a more fundamental level of meaning characterized by the independence of players. We hope that the following paper will contribute to a better understanding of the dialogical notion of meaning.