A consequence relation is strongly classical if it has all the theorems and entailments of classical logic as well as the usual meta-rules (such as Conditional Proof). A consequence relation is weakly classical if it has all the theorems and entailments of classical logic but lacks the usual meta-rules. The most familiar example of a weakly classical consequence relation comes from a simple supervaluational approach to modelling vague language. This approach is formally equivalent to an account of logical consequence according to which α1, ..., αn entails β just in case □α1, ..., □αn entails □β in the modal logic S5. This raises a natural question: If we start with a different underlying modal logic, can we generate a strongly classical logic? This paper explores this question. In particular, it discusses four related technical issues: (1) Which base modal logics generate strongly classical logics and which generate weakly classical logics? (2) Which base logics generate themselves? (3) How can we directly characterize the logic generated from a given base logic? (4) Given a logic that can be generated, which base logics generate it? The answers to these questions have philosophical interest. They can help us to determine whether there is a plausible supervaluational approach to modelling vague language that yields the usual meta-rules. They can also help us to determine the feasibility of other philosophical projects that rely on an analogous formalism, such as the project of defining logical consequence in terms of the preservation of an epistemic status.
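The generating definition described above can be written out as a display formula. The symbol for the generated consequence relation, here written as a subscripted g, is a label introduced purely for illustration, not notation from the paper itself:

```latex
\alpha_1, \ldots, \alpha_n \vDash_{g} \beta
\quad \text{iff} \quad
\Box\alpha_1, \ldots, \Box\alpha_n \vDash_{\mathsf{S5}} \Box\beta
```

On this definition the classical entailments are preserved while meta-rules can fail: with □ in the object language, α entails □α in the generated relation (since □α entails □□α in S5), yet α → □α is not a theorem of it (since □(α → □α) is not S5-valid), so Conditional Proof fails.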
This article offers an overview of inferential role semantics (IRS). We aim to provide a map of the terrain as well as to challenge some of the inferentialist’s standard commitments. We begin by introducing inferentialism and placing it into the wider context of contemporary philosophy of language. §2 focuses on what is standardly considered both the most important test case for and the most natural application of inferential role semantics: the case of the logical constants. We discuss some of the (alleged) benefits of logical inferentialism, chiefly with regard to the epistemology of logic, and consider a number of objections. §3 introduces and critically examines the most influential and most fully developed form of global inferentialism: Robert Brandom’s inferentialism about linguistic and conceptual content in general. Finally, in §4 we consider a number of general objections to IRS and consider possible responses on the inferentialist’s behalf.
Logic is usually thought to concern itself only with features that sentences and arguments possess in virtue of their logical structures or forms. The logical form of a sentence or argument is determined by its syntactic or semantic structure and by the placement of certain expressions called “logical constants.” Thus, for example, the sentences “Every boy loves some girl” and “Some boy loves every girl” are thought to differ in logical form, even though they share a common syntactic and semantic structure, because they differ in the placement of the logical constants “every” and “some”. By contrast, the sentences “Every girl loves some boy” and “Every boy loves some girl” are thought to have the same logical form, because “girl” and “boy” are not logical constants. Thus, in order to settle questions about logical form, and ultimately about which arguments are logically valid and which sentences logically true, we must distinguish the “logical constants” of a language from its nonlogical expressions.
There have been several different and even opposed conceptions of the problem of logical constants, i.e. of the requirements that a good theory of logical constants ought to satisfy. This paper is in the first place a survey of these conceptions and a critique of the theories they have given rise to. A second aim of the paper is to sketch some ideas about what a good theory would look like. A third aim is to draw from these ideas and from the preceding survey the conclusion that most conceptions of the problem of logical constants involve requirements of a philosophically demanding nature which are probably not satisfiable by any minimally adequate theory.
According to logical inferentialists, the meanings of logical expressions are fully determined by the rules for their correct use. Two key proof-theoretic requirements on admissible logical rules, harmony and separability, directly stem from this thesis. These are requirements, however, that standard single-conclusion and assertion-based formalizations of classical logic provably fail to satisfy (2011: 1035–1051). On the plausible assumption that our logical practice is both single-conclusion and assertion-based, it seemingly follows that classical logic, unlike intuitionistic logic, can’t be accounted for in inferentialist terms. In this paper, I challenge orthodoxy and introduce an assertion-based and single-conclusion formalization of classical propositional logic that is both harmonious and separable. In the framework I propose, classicality emerges as a structural feature of the logic.
a single life-span. Philosophers, then, do not see more or know more, and they do not see less or know less. They aim to see less detail and more of the abstract. Their details, if you like, are abstractions. Walking on God’s earth as a pedestrian, as a farmer working his fields or as a passer-by, one’s picture of one’s surroundings is every bit as intelligent as that of the pilot riding the sky. The views of the field are radically different, however. One sees only a specific field and in all lively detail: the exact pattern of the land, or even the exact outline of a given leaf, grasshopper, grain of sand even. Acquaintance with minute detail is not without its price: details may stand in the way of conjuring the big picture. It may be difficult to compare whichever field one happens to be in with far off fields, with respect to their size or shape or any other quality. One may wish to inquire if far off fields were already planted, harvested, or even if they exist. A pedestrian may find it hard or even impossible to do so. The pedestrian view contains fine points that the pilot’s map never would, but it does not necessarily contain more information, for it lacks the general context. After all, there are only so many items that one can observe and account for at a single glance, a single map, a single book, a single life-span.
This chapter focuses on alternative logics. It discusses a hierarchy of logical reform. It presents case studies that illustrate particular aspects of the logical revisionism discussed in the chapter. The first case study is of intuitionistic logic. The second case study turns to quantum logic, a system proposed on empirical grounds as a resolution of the antinomies of quantum mechanics. The third case study is concerned with systems of relevance logic, which have been the subject of an especially detailed reform program. Finally, the fourth case study is paraconsistent logic, perhaps the most controversial of serious proposals.
Logicism is, roughly speaking, the doctrine that mathematics is fancy logic. So getting clear about the nature of logic is a necessary step in an assessment of logicism. Logic is the study of logical concepts, how they are expressed in languages, their semantic values, and the relationships between these things and the rest of our concepts, linguistic expressions, and their semantic values. A logical concept is what can be expressed by a logical constant in a language. So the question “What is logic?” drives us to the question “What is a logical constant?” Though what follows contains some argument, limitations of space constrain me in large part to express my Credo on this topic with the broad brush of bold assertion and some promissory gestures.
K. R. Popper distinguished between two main uses of logic, the demonstrational one, in mathematical proofs, and the derivational one, in the empirical sciences. These two uses are governed by the following methodological constraints: in mathematical proofs one ought to use minimal logical means (logical minimalism), while in the empirical sciences one ought to use the strongest available logic (logical maximalism). In this paper I discuss whether Popper’s critical rationalism is compatible with a revision of logic in the empirical sciences, given the condition of logical maximalism. Apparently, if one ought to use the strongest logic in the empirical sciences, logic would remain immune to criticism and, thus, non-revisable. I will show that critical rationalism is theoretically compatible with a revision of logic in the empirical sciences. However, a question that remains to be clarified by the critical rationalists is what kind of evidence would lead them to revise the system of logic that underlies a physical theory, such as quantum mechanics. Popper’s falsificationist methodology will be compared with T. Williamson’s recently advocated extension of the abductive methodology from the empirical sciences to logic, since both arrive at the same conclusion concerning the status of classical logic.
Simon Evnine examines various epistemic aspects of what it is to be a person. Persons are defined as finite beings that have beliefs, including second-order beliefs about their own and others' beliefs, and are agents, capable of making long-term plans. It is argued that for any being meeting these conditions, a number of epistemic consequences obtain. First, all such beings must have certain logical concepts and be able to use them in certain ways. Secondly, there are at least two principles governing belief that it is rational for persons to satisfy, and that are such that nothing can be a person at all unless it satisfies them to a large extent. These principles are that one believe the conjunction of one's beliefs and that one treat one's future beliefs as, by and large, better than one's current beliefs. Thirdly, persons both occupy epistemic points of view on the world and show up within those views. This makes it impossible for them to be completely objective about their own beliefs. Ideals of rationality that require such objectivity, while not necessarily wrong, are intrinsically problematic for persons. This "aspectual dualism" is characteristic of treatments of persons in the Kantian tradition. In sum, these epistemic consequences support a traditional view of the nature of persons, one in opposition to much recent theorizing.
This paper deals with Popper's little-known work on deductive logic, published between 1947 and 1949. According to his theory of deductive inference, the meaning of logical signs is determined by certain rules derived from “inferential definitions” of those signs. Although strong arguments have been presented against Popper's claims (e.g. by Curry, Kleene, Lejewski and McKinsey), his theory can be reconstructed when it is viewed primarily as an attempt to demarcate logical from non-logical constants rather than as a semantic foundation for logic. A criterion of logicality is obtained which is based on conjunction, implication and universal quantification as fundamental logical operations.
The concept of “necessity of thought” plays a central role in Dag Prawitz’s essay “Logical Consequence from a Constructivist Point of View” (Prawitz 2005). The theme is later developed in various articles devoted to the notion of valid inference (Prawitz, 2009, forthcoming a, forthcoming b). In section 1 I explain how the notion of necessity of thought emerges from Prawitz’s analysis of logical consequence. I try to expound Prawitz’s views concerning the necessity of thought in sections 2, 3 and 4. In sections 5 and 6 I discuss some problems arising with regard to Prawitz’s views.
This volume is dedicated to Prof. Dag Prawitz and his outstanding contributions to philosophical and mathematical logic. Prawitz's eminent contributions to structural proof theory, or general proof theory, as he calls it, and inference-based meaning theories have been extremely influential in the development of modern proof theory and anti-realistic semantics. In particular, Prawitz is, alongside Gerhard Gentzen (who defined natural deduction in his PhD thesis published in 1934), the main author on natural deduction. The book opens with an introductory paper that surveys Prawitz's numerous contributions to proof theory and proof-theoretic semantics and puts his work into a somewhat broader perspective, both historically and systematically. Chapters include either in-depth studies of certain aspects of Dag Prawitz's work or address open research problems that are concerned with core issues in structural proof theory, and range from philosophical essays to papers of a mathematical nature. Investigations into the necessity of thought and the theory of grounds and computational justifications, as well as an examination of Prawitz's conception of the validity of inferences in the light of three “dogmas of proof-theoretic semantics”, are included. More formal papers deal with the constructive behaviour of fragments of classical logic and fragments of the modal logic S4, among other topics. In addition, there are chapters about inversion principles, the normalization of proofs, and the notion of proof-theoretic harmony, and other areas of a more mathematical persuasion. Dag Prawitz also writes a chapter in which he explains his current views on the epistemic dimension of proofs and addresses the question of why some inferences succeed in conferring evidence on their conclusions when applied to premises for which one already possesses evidence.
Karl Popper developed a theory of deductive logic in the late 1940s. In his approach, logic is a metalinguistic theory of deducibility relations that are based on certain purely structural rules. Logical constants are then characterized in terms of deducibility relations. Characterizations of this kind are also called inferential definitions by Popper. In this paper, we expound his theory and elaborate some of his ideas and results that in some cases were only sketched by him. Our focus is on Popper's notion of duality, his theory of modalities, and his treatment of different kinds of negation. This allows us to show how his works on logic anticipate some later developments and discussions in philosophical logic, pertaining to trivializing connectives, the duality of logical constants, dual-intuitionistic logic, the conservativeness of language extensions, the existence of a bi-intuitionistic logic, the non-logicality of minimal negation, and to the problem of logicality in general.
This paper deals with the question of the logicality of modal logics from a proof-theoretic perspective. It is argued that if Došen’s analysis of logical constants as punctuation marks is embraced, it is possible to show that all the modalities in the cube of normal modal logics are indeed logical constants. It will be proved that the display calculus for each displayable modality admits a purely structural presentation based on double-line rules which, following Došen’s analysis, allows us to claim that the corresponding modal operators are logical constants.
This special issue collects together nine new essays on logical consequence: the relation obtaining between the premises and the conclusion of a logically valid argument. The present paper is a partial, and opinionated, introduction to the contemporary debate on the topic. We focus on two influential accounts of consequence, the model-theoretic and the proof-theoretic, and on the seeming platitude that valid arguments necessarily preserve truth. We briefly discuss the main objections these accounts face, as well as Hartry Field’s contention that such objections show consequence to be a primitive, indefinable notion, and that we must reject the claim that valid arguments necessarily preserve truth. We suggest that the accounts in question have the resources to meet the objections standardly thought to herald their demise, and make two main claims: (i) that consequence, as opposed to logical consequence, is the epistemologically significant relation philosophers should be mainly interested in; and (ii) that consequence is a paradoxical notion if truth is.
The paper is a brief survey of some sequent calculi which do not strictly follow the shape of the sequent calculus introduced by Gentzen. We propose the following rough classification of all sequent calculi (SC): systems based on some deviation from the ordinary notion of a sequent are called generalised; the remaining ones are called ordinary. Among the latter we distinguish three types according to the proportion between the number of primitive sequents and rules. In particular, one of these types, called Gentzen’s type, contains a subtype of standard SC due to Gentzen. Hence by nonstandard SC we mean all those ordinary SC in which kinds of rules are applied other than those admitted in standard Gentzen sequent calculi. We briefly describe some of the most interesting or important nonstandard SC belonging to the three abovementioned types.
The aim of this essay is a criticism of reductionism, both in its ‘static’ interpretation (usually referred to as the layer model or level-picture of science) and in its ‘dynamic’ interpretation (as a theory of the growth of scientific knowledge), with emphasis on the latter, from the point of view of Popperian fallibilism and Feyerabendian pluralism, but without being committed to the idiosyncrasies of these standpoints. In both aspects of criticism, the rejection is based on the proposal of a global alternative. Hummell and Opp's research programme for the reduction of sociology to psychology is used as a starting-point and taken as the primary object of criticism. Following the introductory Section I, Section II analyses the three crucial notions of Hummell and Opp's research programme, namely their explications of the notions of ‘sociology’ and ‘psychology’ and the concept of reduction itself, and criticizes the authors’ deficient ‘logic of reduction’. Although the ‘local’ shortcomings of our authors’ ‘logic of reduction’ do not affect reductionism as such, i.e. logically sound versions of reductionism as devised by Kemeny, Nagel, Oppenheim, Putnam, Woodger et al., it is argued that the logical soundness of sophisticated reductionism cannot compensate for its additional epistemological and methodological deficiencies. Section III analyses the ‘dynamic’ interpretation of reductionism as a particular developmental pattern of scientific growth. It is argued that even reductionism at its best can produce only cumulative progress, thus ‘a priori’ excluding scientific revolutions, which are inevitably counter-inductive as well as counter-reductive. Section IV discusses the philosophical background of modern reductionism, and examines the effects both of reductionism and of anti-reductionistic pluralism on the autonomy of scientific fields. It is argued that pluralistic anti-reductionism undermines spurious claims for autonomy much more effectively than reductionism.
As a ‘local’ improvement of the reductionistic research programme, the replacement of the predominant one-way reductionism by a less restrictive many-way reductionism is proposed. It is argued that the appropriate treatment for an allegedly backward science (say sociology) is not its reduction to an allegedly more advanced science (say psychology) but its non-reductive replacement by new theories (of the same or of another field) that do not incorporate the older ones. As a ‘global’ alternative to the reduction of sociology to psychology, the frontier-crossing direct application of psychological theories to sociological phenomena is proposed. A plea is made for a pluralistic science without reduction, based on intra- and interscientific criticism as the proper method for the advancement of knowledge.
From the pre-Socratics to the present, one primary aim of philosophy has been to learn from arguments. Philosophers have debated whether we could indeed do this, but they have by and large agreed on how we would use arguments if learning from argument was at all possible. They have agreed that we could learn from arguments either by starting with true premises and validly deducing further statements which must also be true and therefore constitute new knowledge, or by starting from putative premises and validly deducing false consequences, thereby showing that our premises were false. Our aim in this paper is to suggest a third alternative: we can learn from plausible arguments through criticism of such arguments, which enables us to discover new problems.
Gregory Landini offers a new and an illuminating reading of Ludwig Wittgenstein’s idea about his own innovation: it is the invention of a notation that removes the mystery from all theorems of logic and of mathematics as it renders their proofs part of their wordings. This makes all theorems in principle as boring as “all four-legged animals are animals.” This idea is Wittgenstein’s doctrine of showing. It is worthless; yet, as Landini shows, every time Wittgenstein offered an elaboration on it, Russell checked it carefully and found it of no value. This, let me add, shows that Russell was in error in suggesting that intellectually there is no “advantage of theft over honest toil”: at times one may pay back and with high interest. Other cases may be due to misjudgment rather than to sloth.
The formulas-as-types isomorphism tells us that every proof and theorem, in the intuitionistic implicational logic $H_\rightarrow$, corresponds to a lambda term or combinator and its type. The algorithms of Bunder very efficiently find a lambda term inhabitant, if any, of any given type of $H_\rightarrow$ and of many of its subsystems. In most cases the search procedure has a simple bound based roughly on the length of the formula involved. Computer implementations of some of these procedures were done in Dekker. In this paper we extend these methods to full classical propositional logic as well as to its various subsystems. This extension has partly been implemented by Oostdijk.
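The inhabitant-search idea can be illustrated with a toy version. The following is a minimal sketch, not Bunder's actual algorithm; the function name `inhabit` and the term-printing conventions are assumptions made here for illustration. It uses the two rules of the implicational fragment: abstraction for an implication goal, and application of a hypothesis whose head matches an atomic goal, with a depth bound to guarantee termination.

```python
# Minimal sketch of type-inhabitant search for the implicational
# fragment H->; an illustration only, not Bunder's algorithm.
# Formulas are atoms (strings) or ("->", A, B); terms are strings.

def inhabit(goal, ctx=None, depth=8):
    """Return a lambda-term string inhabiting `goal` under the context
    ctx = [(variable, type), ...], or None if none is found in `depth`."""
    if depth == 0:
        return None
    ctx = ctx or []
    # ->-introduction: to inhabit A -> B, bind a fresh variable of
    # type A and inhabit B under the extended context.
    if isinstance(goal, tuple):
        _, a, b = goal
        v = f"x{len(ctx)}"
        body = inhabit(b, ctx + [(v, a)], depth - 1)
        return None if body is None else f"(\\{v}. {body})"
    # Atomic goal: try each hypothesis v : A1 -> ... -> An -> goal and
    # recursively inhabit the argument types A1 ... An.
    for v, ty in ctx:
        args, head = [], ty
        while isinstance(head, tuple):
            args.append(head[1])
            head = head[2]
        if head != goal:
            continue
        terms = []
        for a in args:
            t = inhabit(a, ctx, depth - 1)
            if t is None:
                break
            terms.append(f"({t})" if " " in t else t)
        else:
            return " ".join([v] + terms)
    return None

imp = lambda a, b: ("->", a, b)
print(inhabit(imp("A", "A")))             # the identity combinator's type
print(inhabit(imp("A", imp("B", "A"))))   # the K combinator's type
print(inhabit(imp("A", "B")))             # uninhabited: None
```

The depth bound stands in for the length-based bound mentioned in the abstract; extending such a search to full classical logic requires more than this, which is the point of the paper.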
Recent literature on dialogical logic discusses the case of tonk and the notion of harmony in the context of a rule-based theory of meaning. Since the publication of those papers, a dialogical version of constructive type theory (CTT) has been developed. The aim of the present paper is to show that, from the dialogical point of view, the harmony of the CTT rules is the consequence of a more fundamental level of meaning characterized by the independence of players. We hope that this paper will contribute to a better understanding of the dialogical notion of meaning.