The Bounds of Logic presents a new philosophical theory of the scope and nature of logic, based on a critical analysis of the principles underlying modern Tarskian logic and inspired by mathematical and linguistic developments. Extracting central philosophical ideas from Tarski’s early work in semantics, Sher questions whether these are fully realized by the standard first-order system. The answer lays the foundation for a new, broader conception of logic. By generally characterizing logical terms, Sher establishes a fundamental result in semantics. Her development of the notion of logicality for quantifiers and her work on branching are of great importance for linguistics. Sher outlines the boundaries of the new logic and points out some of the philosophical ramifications of the new view of logic for such issues as the logicist thesis, ontological commitment, the role of mathematics in logic, and the metaphysical underpinning of logic. She proposes a constructive definition of logical terms, reexamines and extends the notion of branching quantification, and discusses various linguistic issues and applications.
Currie’s (2010) argument that “i-desires” must be posited to explain our responses to fiction is critically discussed. It is argued that beliefs and desires featuring ‘in the fiction’ operators—and not sui generis imaginings (or "i-beliefs" or "i-desires")—are the crucial states involved in generating fiction-directed affect. A defense of the “Operator Claim” is mounted, according to which ‘in the fiction’ operators would also be required within fiction-directed sui generis imaginings (or "i-beliefs" and "i-desires"), were there such. Once we appreciate that even fiction-directed sui generis imaginings would need to incorporate ‘in the fiction’ operators, the main appeal of the idea that sui generis imaginings (or "i-beliefs" or "i-desires") are at work in fiction-appreciation dissipates. [This is Chapter 10 of Explaining Imagination (OUP, 2020).]
This paper explores the nature of the concept of truth. It does not offer an analysis or definition of truth, or an account of how it relates to other concepts. Instead, it explores what sort of concept truth is by considering what sorts of thoughts it enables us to think. My conclusion is that truth is a part of each and every propositional thought. The concept of truth is therefore best thought of as the ability to token propositional thoughts. I explore what implications this view has for existing accounts of concepts, and argue that truth is a concept unlike any other.
Bilateral proof systems, which provide rules for both affirming and denying sentences, have been prominent in the development of proof-theoretic semantics for classical logic in recent years. However, such systems provide a substantial amount of freedom in the formulation of the rules, and, as a result, a number of different sets of rules have been put forward as definitive of the meanings of the classical connectives. In this paper, I argue that a single general schema for bilateral proof rules has a reasonable claim to inferentially articulating the core meaning of all of the classical connectives. I propose this schema in the context of a bilateral sequent calculus in which each connective is given exactly two rules: a rule for affirmation and a rule for denial. Positive and negative rules for all of the classical connectives are given by a single rule schema, harmony between these positive and negative rules is established at the schematic level by a pair of elimination theorems, and the truth-conditions for all of the classical connectives are read off at once from the schema itself.
The principle of compositionality requires that the meaning of a complex expression remains the same after substitution of synonymous expressions. Alleged counterexamples to compositionality seem to force a theoretical choice: either apparent synonyms are not synonyms or synonyms do not syntactically occur where they appear to occur. Some theorists have instead looked to Frege’s doctrine of “reference shift” according to which the meaning of an expression is sensitive to its linguistic context. This doctrine is alleged to retain the relevant claims about synonymy and substitution while respecting the compositionality principle. Thus, Salmon (Philos Rev 115(4):415, 2006) and Glanzberg and King (Philosophers’ Imprint 20(2):1–29, 2020) offer occurrence-based accounts of variable binding, and Pagin and Westerståhl (Linguist Philos 33(5):381–415, 2010c) argue that an occurrence-based semantics delivers a compositional account of quotation. Our thesis is this: the occurrence-based strategies resolve the apparent failures of substitutivity in the same general way as the standard expression-based semantics do. So it is a myth that a Frege-inspired occurrence-based semantics affords a genuine alternative strategy.
Frege famously claimed that variations in the sense of a proper name can sometimes be ‘tolerated’. In this paper, we offer a novel explanation of this puzzling claim. Frege, we argue, follows Trendelenburg in holding that we think in language—sometimes individually and sometimes together. Variations in sense can be tolerated in just those cases where we are using language to coordinate our actions but are not engaged in thinking together about an issue.
Rigorous proof is supposed to guarantee that the premises invoked imply the conclusion reached, and the problem of rigor may be described as that of bringing together the perspectives of formal logic and mathematical practice on how this is to be achieved. This problem has recently prompted much discussion among philosophers of mathematics. We survey some possible solutions and argue that failure to understand its terms properly has led to misunderstandings in the literature.
Comparatively easy questions we might ask about creativity are distinguished from the hard question of explaining transformative creativity. Many have focused on the easy questions, offering no reason to think that the imagining relied upon in creative cognition cannot be reduced to more basic folk psychological states. The relevance of associative thought processes to songwriting is then explored as a means for understanding the nature of transformative creativity. Productive artificial neural networks—known as generative adversarial networks (GANs)—are a recent example of how a system’s ability to generate novel products can both be finely tuned by prior experience and grounded in strategies that cannot be articulated by the system itself. Further, the kinds of processes exploited by GANs need not be seen as incorporating something akin to sui generis imaginative states. The chapter concludes with reflection on the added relevance of personal character to explanations of creativity. [This is Chapter 12 of the book Explaining Imagination.]
The book offers a proposal on how to define truth in all its complexity, without reductionism, showing at the same time which questions a theory of truth has to answer and which questions, although related to truth, do not belong within the scope of such a theory. Just like any other theory, a theory of truth has its structure and limits. The semantic core of the position is that truth-ascriptions are pro-forms, i.e. natural language propositional variables. The book also offers an explanation of the syntactic behaviour of truth-terms, and the pragmatic roles of truth-acts. The theory of truth we present is a technical proposal, relatively uncontaminated by radical philosophical discussion. It does make philosophical points, but it is intended to be a conceptual analysis, as neutral as possible, of the aspects that may serve as a point of departure for more philosophically-laden destinations. Like the Fregean account of quantifiers, which is prior to and, to a large extent, independent of the metaphysical debate about existence and its forms, or the Kaplanian view on demonstratives, which is neutral regarding the debate about individuals and the possibilities we have to actually “reach” them in a referring act, we intend our proposal about truth to be a sophisticated and explicative setting that helps to situate other debates about truth accurately, far from the distorted and sometimes strongly ideological discussions that have turned the topic into a paradigm of philosophical impasse.
We re-examine the problem of existential import by using classical predicate logic. Our problem is: how to distribute the existential import among the quantified propositions in order for all the relations of the logical square to be valid? After defining existential import and scrutinizing the available solutions, we distinguish between three possible cases: explicit import, implicit non-import, and explicit negative import, and formalize the propositions accordingly. Then, we examine the 16 combinations between the 8 propositions having the first two kinds of import, the third one being trivial, and rule out the squares where at least one relation does not hold. This leads to the following results: (1) three squares are valid when the domain is non-empty; (2) one of them is valid even in the empty domain: the square can thus be saved in arbitrary domains; and (3) the aforementioned eight propositions give rise to a cube, which contains two more (non-classical) valid squares and several hexagons. A classical solution to the problem of existential import is thus possible, without resorting to deviant systems and merely relying upon the symbolism of first-order logic (FOL). Aristotle’s system then appears as a fragment of a broader system which can be developed by using FOL.
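The kind of combinatorial check described above can be illustrated with a small brute-force sketch. The formalization below is one illustrative distribution of import, not necessarily the paper's own: the affirmatives A and I carry explicit import, while E and O are defined as their contradictories and carry none. The sketch verifies that, under this distribution, all the square's relations hold over every interpretation in small domains, including the empty one.

```python
from itertools import combinations

def subsets(domain):
    """All subsets of a finite domain, as frozensets."""
    s = list(domain)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

# Illustrative import distribution (an assumption of this sketch):
# affirmatives carry existential import, negatives do not.
def A(S, P): return bool(S) and S <= P     # "All S are P" with import: S nonempty and S subset of P
def I(S, P): return bool(S & P)            # "Some S is P" with import: S and P overlap
def E(S, P): return not I(S, P)            # "No S is P", contradictory of I: no overlap
def O(S, P): return not A(S, P)            # "Some S is not P", contradictory of A

def square_holds(domain):
    """Check all square-of-opposition relations over every interpretation of S, P."""
    for S in subsets(domain):
        for P in subsets(domain):
            a, e, i, o = A(S, P), E(S, P), I(S, P), O(S, P)
            if a == o or e == i:                    # contradictories: opposite truth values
                return False
            if a and e:                             # contraries: never both true
                return False
            if not i and not o:                     # subcontraries: never both false
                return False
            if (a and not i) or (e and not o):      # subalternation: A entails I, E entails O
                return False
    return True

# Valid in the empty domain and in non-empty domains alike:
print(all(square_holds(set(range(n))) for n in range(4)))  # True
```

This matches the claim that at least one distribution of import validates the square even in the empty domain: with no individuals, A and I come out false while E and O come out true, and every relation survives.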
[This Invited Paper will be published in December 2016.]
The construction of a systematic philosophical foundation for logic is a notoriously difficult problem. In Part One I suggest that the problem is in large part methodological, having to do with the common philosophical conception of “providing a foundation”. I offer an alternative to the common methodology which combines a strong foundational requirement with the use of non-traditional, holistic tools to achieve this result. In Part Two I delineate an outline of a foundation for logic, employing the new methodology. The outline is based on an investigation of why logic requires a veridical justification, i.e., a justification which involves the world and not just the mind, and what features or aspects of the world logic is grounded in. Logic, the investigation suggests, is grounded in the formal aspect of reality, and the outline proposes an account of this aspect, the way it both constrains and enables logic, logic's role in our overall system of knowledge, the relation between logic and mathematics, the normativity of logic, the characteristic traits of logic, and error and revision in logic.
The predictive processing (PP) framework has found wide applications in cognitive science and philosophy. It is an attractive candidate for a unified account of the mind in which perception, action, and cognition fit together in a single model. However, PP cannot claim this role if it fails to accommodate an essential part of cognition—conceptual thought. Recently, Williams argued that PP struggles to address at least two of thought’s core properties—generality and rich compositionality. In this paper, I show that neither necessarily presents a problem for PP. In particular, I argue that because we do not have access to cognitive processes but only to their conscious manifestations, compositionality may be a manifest property of thought, rather than a feature of the thinking process, and result from the interplay of thinking and language. Pace Williams, both of these capacities, constituting parts of a complex and multifarious cognitive system, may be fully based on the architectural principles of PP. Under the assumption that language constitutes a subsystem separate from conceptual thought, I sketch out one possible way for PP to accommodate both generality and rich compositionality.
In a recent paper, “The Concept of Logical Consequence,” W. H. Hanson criticizes a formal-structural characterization of logical consequence in Tarski and Sher. Hanson accepts many principles of the formal-structural view. Relating to Sher 1991 and 1996a, he says.
The paper presents an outline of a unified answer to five questions concerning logic: (1) Is logic in the mind or in the world? (2) Does logic need a foundation? What is the main obstacle to a foundation for logic? Can it be overcome? (3) How does logic work? What does logical form represent? Are logical constants referential? (4) Is there a criterion of logicality? (5) What is the relation between logic and mathematics?
The paper distinguishes between two kinds of mathematics: natural mathematics, which is a result of biological evolution, and artificial mathematics, which is a result of cultural evolution. On this basis, it outlines an approach to the philosophy of mathematics which involves a new treatment of the method of mathematics, the notion of demonstration, the questions of discovery and justification, the nature of mathematical objects, the character of mathematical definition, the role of intuition, the role of diagrams in mathematics, and the effectiveness of mathematics in natural science.
Conceptual primitivism is the view that truth is among our most basic and fundamental concepts. It cannot be defined, analyzed, or reduced into concepts that are more fundamental. Primitivism is opposed to both traditional attempts at defining truth (in terms of correspondence, coherence, or utility) and deflationary theories that argue that the notion of truth is exhausted by means of the truth schema. Though primitivism might be thought of as a view of last resort, I believe that the view is independently attractive, and can be argued for directly. In this paper I offer what I take to be the strongest argument in favor of conceptual primitivism, which relies upon the Fregean doctrine of the omnipresence of truth.
When the original Dutch version of this book was presented in 1971 to the University of Leiden as a thesis for the Doctorate in philosophy, I was prevented by the academic mores of that university from expressing my sincere thanks to three members of the Philosophical Faculty for their support of and interest in my pursuits. I take the liberty of doing so now, two and a half years later. First and foremost I want to thank Professor G. Nuchelmans warmly for his expert guidance of my research. A number of my most important sources were brought to my attention by him. During the whole process of composing this book his criticism and encouragement were offered in a truly academic spirit. He thereby provided working conditions that are a sine qua non for every author who is attempting to approach controversial matters in a scientific manner, conditions which, however, were not easily available at that time. In a later phase I also came into contact with Professors L. M. de Rijk and J. B. Ubbink, with both of whom I had highly stimulating discussions and exchanges of ideas. The present edition contains some entirely new sections, viz. I-9, IV-29, V-9, V-20, VII-14 (iii), (iv), VII-17 (i), VIII-22, IX-17, IX-19, X-9 and XI-8. Section X-9 was inspired by a remark made by Professor A.
It is well known that the circumflex notation used by Russell and Whitehead to form complex function names in Principia Mathematica played a role in inspiring Alonzo Church's “lambda calculus” for functional logic developed in the 1920s and 1930s. Interestingly, earlier unpublished manuscripts written by Russell between 1903 and 1905—surely unknown to Church—contain a more extensive anticipation of the essential details of the lambda calculus. Russell also anticipated Schönfinkel's combinatory logic approach of treating multiargument functions as functions having other functions as values. Russell’s work in this regard seems to have been largely inspired by Frege’s theory of functions and “value-ranges”. This system was discarded by Russell due to his abandonment of propositional functions as genuine entities as part of a new tack for solving Russell’s paradox. In this article, I explore the genesis and demise of Russell’s early anticipation of the lambda calculus.
In the past century, the received view of definition in mathematics has been the stipulative conception, according to which a definition merely stipulates the meaning of a term in other terms which are supposed to be already well known. The stipulative conception has been so absolutely dominant and accepted as unproblematic that the nature of definition has not been much discussed; yet it is inadequate. This paper examines its shortcomings and proposes an alternative, the heuristic conception.
This paper brings to light a new puzzle for Frege interpretation, and offers a solution to that puzzle. The puzzle concerns Frege’s judgement-stroke (‘|’), and consists in a tension between three of Frege’s claims. First, Frege vehemently maintains that psychological considerations should have no place in logic. Second, Frege regards the judgement-stroke—and the associated dissociation of assertoric force from content, of the act of judgement from the subject matter about which judgement is made—as a crucial part of his logic. Third, Frege holds that judging is an inner mental process, and that the distinction marked by the judgement-stroke, between entertaining a thought and judging that it is true, is a psychological distinction. I argue that what initially looks like confusion here on Frege’s part appears quite reasonable when we remind ourselves of the differences between Frege’s conception of logic and our own.
The translation of both ‘bedeuten’ and ‘Bedeutung’ in Frege's works remains sufficiently problematic that some contemporary authors prefer to leave these words untranslated. Here a case is made for returning to Russell's initial choice of ‘to indicate’ and ‘indication’ as better alternatives than the more usual ‘meaning’, ‘reference’, or ‘denotation’. It is argued that this choice has the philosophical payoff that Frege's controversial doctrines concerning the semantic values of sentences and predicative expressions are rendered far more comprehensible by it, and that this translational strategy fulfills the desiderata of offering a translation which is acceptable both before and after Frege introduced the distinction between sense and reference or, as this paper would have it, between the sense of an expression and what it indicates.
According to the standard view of particularity, an entity is a particular just in case it necessarily has a unique spatial location at any time of its existence. That the basic entities of the world we speak about in common sense and science are particular entities in this sense is the thesis of “foundational particularism,” a theoretical intuition that has guided Western ontological research from its beginnings to the present day. The main aim of this paper is to review the notion of particularity and its role in ontology. I proceed in four steps. First, I offer a brief reconstruction of the tasks of ontology as “theory of categorial inference in L”. An ontological theory states which (combinations of) entity types or categories make true L-sentences true; the features of the stipulated categories explain why L-speakers are entitled to draw certain material inferences from the classificatory expressions of L. Second, I draw attention to the fact that since Aristotle this theoretical program typically has been implemented with peculiar restrictions prescribing certain combinations of category features, e.g., the combination of particularity, concreteness, individuality, and subjecthood. I briefly sketch how these restrictions of the “substance paradigm” or “myth of substance” are reinforced by the standard readings of predicate-logical constants, viz. the existential quantifier and the identity sign. Third, I argue that in the context of the substance paradigm foundational particularism is incoherent. I discuss the current standard conceptions of particulars as developed in the debate about individuation (bare particulars, nude particulars, tropes) and show that their main difficulties derive from the traditional restriction that particulars are also logical subjects and/or individuals. Fourth, to show that the traditional linkages of category features are not conceptual necessities, I sketch the outlines of an ontology (General Process Theory) based on non-particular individuals. For ontologists in computer science working with description logic, this monocategoreal ontology based on more or less generic ‘dynamics’ may hold special interest. As General Process Theory documents, ontologists may well abandon the notion of particularity: in common sense and science we do reason about items that have a unique spatial location at any time, but the uniqueness of their location can be taken to be a contingent affair.
Written by experts in the field, this volume presents a comprehensive investigation into the relationship between argumentation theory and the philosophy of mathematical practice. Argumentation theory studies reasoning and argument, and especially those aspects not addressed, or not addressed well, by formal deduction. The philosophy of mathematical practice diverges from mainstream philosophy of mathematics in the emphasis it places on what the majority of working mathematicians actually do, rather than on mathematical foundations.

The book begins by challenging the assumption that there is no role for informal logic in mathematics. Next, it details the usefulness of argumentation theory in the understanding of mathematical practice, offering an impressively diverse set of examples, covering the history of mathematics, mathematics education and, perhaps surprisingly, formal proof verification. From there, the book demonstrates that mathematics also offers a valuable testbed for argumentation theory. Coverage concludes by defending attention to mathematical argumentation as the basis for new perspectives on the philosophy of mathematics.
In an unpublished manuscript of 1914 titled ‘Logic in mathematics’, Gottlob Frege offered a rich account of the paradox of analysis. I argue that Frege there claims that the explicandum and explicans of a successful analysis express the same sense and that he furthermore appreciated that this requires that one cannot conclude that two sentences differ in sense simply because it is possible for a (minimally) competent speaker to accept one without accepting the other. I claim that this is shown by Frege’s suggestive remarks about a cloudy grasp of a sense. I then argue that this fact calls into question a key assumption behind Frege’s master argument for the sense/reference distinction.
Attention to the conversational role of alethic terms seems to dominate, and even sometimes exhaust, many contemporary analyses of the nature of truth. Yet, because truth plays a role in judgment and assertion regardless of whether alethic terms are expressly used, such analyses cannot be comprehensive or fully adequate. A more general analysis of the nature of truth is therefore required – one which continues to explain the significance of truth independently of the role alethic terms play in discourse. We undertake such an analysis in this paper; in particular, we start with certain elements from Kant and Frege, and develop a construct of truth as a normative modality of cognitive acts (e.g., thought, judgment, assertion). Using the various biconditional T-schemas to sanction the general passage from assertions to (equivalent) assertions of truth, we then suggest that an illocutionary analysis of truth can contribute to its locutionary analysis as well, including the analysis of diverse constructions involving alethic terms that have been largely overlooked in the philosophical literature. Finally, we briefly indicate the importance of distinguishing between alethic and epistemic modalities.
This paper proposes a formalization of the class of sentences quantified by most, which is also interpreted as proportion of or majority of depending on the domain of discourse. We consider sentences of the form “Most A are B”, where A and B are plural nouns and the interpretations of A and B are infinite subsets of ℕ. There are two widely used semantics for Most A are B: C(A ∩ B) > C(A ∖ B) and C(A ∩ B) > C(A)/2, where C(X) denotes the cardinality of a given finite set X. Although the former is more descriptive than the latter, it also produces a considerable amount of insensitivity for certain sets. Since the quantifier most has a solid cardinal behaviour under the interpretation majority of and has a slightly more statistical behaviour under the interpretation proportion of, we consider an alternative approach in deciding quantity-related statements regarding infinite sets. For this we introduce a new semantics using natural density for sentences in which the interpretations of their nouns are infinite subsets of ℕ, along with a list of axioms for the concept of natural density. In other words, we take the standard definition of the semantics of most but define it as applying to finite approximations of infinite sets computed to the limit.
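The finite-approximation idea can be sketched computationally. The following is a minimal illustration, not the paper's axiomatization: the cutoff n, the function names, and the reading "Most A are B iff the limiting fraction of B-elements among the A-elements exceeds 1/2" are assumptions of this sketch.

```python
def ratio(A_pred, B_pred, n):
    """Finite approximation: among the naturals up to n satisfying A,
    what fraction also satisfy B?"""
    a = [k for k in range(1, n + 1) if A_pred(k)]
    if not a:
        return 0.0
    return sum(1 for k in a if B_pred(k)) / len(a)

def most(A_pred, B_pred, n=100000):
    # "Most A are B" read as: the limiting ratio exceeds 1/2,
    # approximated at cutoff n (a stand-in for the limit).
    return ratio(A_pred, B_pred, n) > 0.5

# "Most naturals are non-squares": the squares have natural density 0.
print(most(lambda k: True, lambda k: int(k**0.5)**2 != k))   # True
# "Most evens are multiples of 4": the limiting ratio is exactly 1/2.
print(most(lambda k: k % 2 == 0, lambda k: k % 4 == 0))      # False
```

The second example shows why the limit matters: both sets are infinite with the same cardinality, so a purely cardinal comparison is insensitive here, while the density-style ratio settles the question.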
This paper presents a new interpretation of Frege's context principle on which it applies primarily to singular terms for abstract objects but not necessarily to singular terms for ordinary objects.
More or less explicitly inspired by the Aristotelian classification of arguments, a wide tradition makes a sharp distinction between argument and proof. Ch. Perelman and R. Johnson, among others, share this view, based on the principle that the conclusion of an argument is uncertain while the conclusion of a proof is certain. Producing proofs is certainly a major part of mathematical activity. Yet, in practice, mathematicians, whether expert or beginner, argue about mathematical proofs. This happens during the search for a proof, then when the proof is presented and discussed by experts, and finally when it is taught or used in didactical contexts.
Formalizing categorical propositions of traditional logic in the language of quantifiers and propositional functions is no straightforward matter, especially when modalities get involved. Starting...
This paper discusses the problem of the ontological value of the variable in Russell’s philosophy. The variable is essential in Russell’s theory of denotation, which, among other things, purports to prove Meinongian being outside of subsistence and existence to be logically unnecessary. I argue that neither Russell’s epistemology nor his ontology can account for the ontological value of the variable without running into qualities of Meinongian being that Russell disputed. The problem is that the variable cannot be logically grounded by Russell’s theory of denotation. As such, in so far as being is concerned, Meinong’s and Russell’s theories are much closer than is typically thought. The arguments are supported with concerns raised by Russell, Frege, and Moore regarding the ontological value of the variable. The problem can be summarised as follows: the variable is the fundamental denoting-position of a formal theory that is meant to explain the structure of the ontological. If such a formal theory is meant to ground the ontological, then the formal must also represent the actual structure of the ontological. Yet the variable, the fundamental symbol of denotation in a theory that defines objects, is ontologically indefinable.
As usually understood, ‘conceptual engineering’ is a form of conceptual inquiry aimed at diagnosing problems with extant concepts and finding better concepts to replace them. This can seem like an appropriate response to a skeptical concern that our concepts are cognitively deficient: unsuitable for use in serious inquiry. We argue, however, that conceptual engineering, so understood, cannot reasonably be motivated in this way. The basic problem is that, on the one hand, since conceptual engineering is itself a form of inquiry, it cannot succeed by using the problematic concept itself in inquiry (since it is unsuitable for use in inquiry); but, on the other hand, methods for carrying out inquiry directed at concepts without using those concepts are constrained in such a way as to make conceptual engineering very unlikely to succeed. The upshot is that conceptual engineering has no reasonable chance of addressing the skeptical concern about cognitive deficiency. This is an important and previously unarticulated result about what conceptual engineering can and cannot reasonably be expected to do.
This volume aims to establish the starting point for the development, evaluation and appraisal of the phenomenology of mathematics.
According to a view going back to Plato, the aim of philosophy is to acquire knowledge and there is a method to acquire knowledge, namely a method of discovery. In the last century, however, this view has been completely abandoned, the attempt to give a rational account of discovery has been given up, and logic has been disconnected from discovery. This paper outlines a way of reconnecting logic with discovery. |