Stanford Encyclopedia of Philosophy

Logical Form

First published Tue Oct 19, 1999; substantive revision Wed Sep 1, 2021

Some inferences are impeccable. Examples like (1–3) illustrate reasoning that cannot lead from true premises to false conclusions.

(1)
John danced if Mary sang, and Mary sang; so John danced.
(2)
Every politician is deceitful, and every senator is a politician; so every senator is deceitful.
(3)
The detective is in the garden; so someone is in the garden.

In such cases, a thinker takes no epistemic risk by endorsing the conditional claim that the conclusion is true if the premises are true. The conclusion follows from the premises, without any further assumptions that might turn out to be false. Any risk of error lies with the premises, as opposed to the reasoning. By contrast, examples like (4–6) illustrate reasoning that involves at least some risk of going wrong—from correct premises to a mistaken conclusion.

(4)
John danced if Mary sang, and John danced; so Mary sang.
(5)
Every feathered biped is a bird, and Tweety is a feathered biped; so Tweety can fly.
(6)
Every human born before 1879 died; so every human will die.

Inference (4) is not secure. John might dance whenever Mary sings, but also sometimes when Mary doesn’t sing. Similarly, with regard to (5), Tweety might be a bird that cannot fly. Even (6) falls short of the demonstrative character exhibited by (1–3). While laws of nature may preclude immortality, the conclusion of (6) goes beyond its premise, even if it is foolish to resist the inference.

Appeals to logical form arose in the context of attempts to say more about this intuitive distinction between impeccable inferences, which invite metaphors of security, and inferences that involve some risk of slipping from truth to falsity. The idea is that some inferences, like (1–3), are structured in a way that confines any risk of error to the premises. The motivations for developing this idea were both practical and theoretical. Experience teaches us that an inference can initially seem more secure than it is; and if we knew which forms of inference are risk-free, that might help us avoid errors. As we’ll see, claims about inference are also intimately connected with claims about the nature of thought and its relation to language.

Many philosophers have been especially interested in the possibility that grammar masks the underlying structure of thought, perhaps in ways that invite mistaken views about how ordinary language is related to cognition and the world we talk about. For example, similarities across sentences like ‘Homer talked’, ‘Nobody talked’, and ‘The nymph talked’ initially suggest that the corresponding thoughts exhibit a common subject-predicate form. But even if ‘Homer’ indicates an entity that can be the subject of a thought that is true if and only if the entity in question talked, ‘Nobody’ does not; and as we’ll see, ‘The’ is complicated. Philosophers and linguists have also asked general questions about how logic is related to grammar. Do thoughts and sentences exhibit different kinds of structure? Do sentences exhibit grammatical structures that are not obvious? And if the logical structure of a thought can diverge from the grammatical structure of a sentence that is used to express the thought, how should we construe proposals about the logical forms of inferences like (1)–(6)? Are such proposals normative claims about how we ought to think/talk, or empirical hypotheses about aspects of psychological/linguistic reality?

Proposed answers to these questions are usually interwoven with claims about why various inferences seem compelling. So it would be nice to know which inferences really are secure, and in virtue of what these inferences are special. The most common suggestion has been that certain inferences are secure by virtue of their logical form. Though unsurprisingly, conceptions of form have evolved along with conceptions of logic and language.

1. Patterns of Reason

One ancient idea is that impeccable inferences exhibit patterns that can be characterized schematically by abstracting away from the specific contents of particular premises and conclusions, thereby revealing a general form common to many other impeccable inferences. Such forms, along with the inferences that exemplify them, are said to be valid.

Given a valid inference, there is a sense in which the premises contain the conclusion, which is correspondingly extractable from the premises. With regard to (1) and (7), it seems especially clear that the conclusion is part of the first premise, and that the second premise is another part of the first.

(1)
John danced if Mary sang, and Mary sang; so John danced.
(7)
Chris swam if Pat was asleep, and Pat was asleep; so Chris swam.

We can express this point by saying that these inferences are instances of the following form: B if A, and A; so B. The Stoics discussed several patterns of this kind, using ordinal numbers (instead of letters) to capture abstract forms like the ones shown below.

If the first then the second, and the first; so the second.

If the first then the second, but not the second; so not the first.

Either the first or the second, but not the second; so the first.

Not both the first and the second, but the first; so not the second.

These schematic formulations employ variables, indicated in bold. Following a long tradition, let’s use the word ‘proposition’ as a term of art for whatever these variables range over. Propositions are potential premises/conclusions, which can be endorsed or rejected. So they are, presumably, things that can be evaluated for truth or falsity. This leaves room for various proposals about what propositions are: sentences, statements, states of affairs, etc. But let’s assume that declarative sentences can be used to express propositions; see, e.g., Cartwright (1962) and the essay on structured propositions.
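Because the variables in these Stoic schemata range over whole propositions, validity here is a truth-functional matter, and it can be checked mechanically. As a rough illustration (not part of the historical story), the following Python sketch surveys every assignment of truth values to the two propositions:

```python
from itertools import product

def valid(premises, conclusion):
    """A two-proposition schema is valid iff no assignment of truth
    values makes every premise true and the conclusion false."""
    return all(
        conclusion(p, q)
        for p, q in product([True, False], repeat=2)
        if all(prem(p, q) for prem in premises)
    )

# "If the first then the second, and the first; so the second."
modus_ponens = valid([lambda p, q: (not p) or q, lambda p, q: p],
                     lambda p, q: q)

# "If the first then the second, but not the second; so not the first."
modus_tollens = valid([lambda p, q: (not p) or q, lambda p, q: not q],
                      lambda p, q: not p)

# The risky pattern of (4): "B if A, and B; so A."
affirming_consequent = valid([lambda p, q: (not p) or q, lambda p, q: q],
                             lambda p, q: p)

print(modus_ponens, modus_tollens, affirming_consequent)  # → True True False
```

The brute-force survey confirms the first two patterns and exposes the counterexample to the third: the assignment on which A is false and B is true.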

A significant complication is that in ordinary conversation, the context matters with regard to which proposition is expressed with a given sentence. For example, ‘Pat is asleep’ can be used at one time to express a true premise, and at another time to express a false premise. A certain speaker might use ‘I am tired’ to express a false proposition, while another speaker uses the same sentence at the same time to express a true proposition. What counts as being tired can also vary across conversations. Context sensitivity, of various kinds, is ubiquitous in typical discourse. Moreover, even given a context, a sentence like ‘He is bald’ may not express a unique proposition. (There may be no referent for the pronoun; and even if there is, the vagueness of ‘bald’ may yield a range of candidate propositions, with no fact of the matter as to which one is the proposition expressed.) Still, we can and often do use sentences like ‘Every circle is an ellipse’ and ‘Thirteen is a prime number’ to express premises of valid arguments. To be sure, mathematical examples are special cases. But the distinction between impeccable and risky inferences is not limited to atypical contexts in which we try to think especially clearly about especially abstract matters. So when focusing on the phenomenon of valid inference, we can try to simplify the initial discussion by abstracting away from the context sensitivity of language use.

Another complication is that in speaking of an inference, one might be talking about (i) a process in which a thinker draws a conclusion from some premises, or (ii) some propositions, one of which is designated as an alleged consequence of the others; see, e.g., Harman (1973). But we can describe a risky thought process as one in which a thinker who accepts certain propositions—perhaps tentatively or hypothetically—comes to accept, on that basis, a proposition that does not follow from the initial premises. And it will be simpler to focus on premises/conclusions, as opposed to episodes of reasoning.

With regard to (1), the inference seems secure in part because its first premise has the form ‘B if A’.

(1)
John danced if Mary sang, and Mary sang; so John danced.

If the first premise didn’t have this form, the inference wouldn’t be an instance of ‘B if A, and A; so B’. It isn’t obvious that all impeccable inferences are instances of a more general valid form, much less inferences whose impeccability is due to the forms of the relevant propositions. But this thought has served as an ideal for the study of valid inference, at least since Aristotle’s treatment of examples like (2).

(2)
Every senator is a politician, and every politician is deceitful; so every senator is deceitful.

Again, the first premise seems to have several parts, each of which is a part of the second premise or the conclusion. (In English, the indefinite article in ‘Every senator is a politician’ cannot be omitted; likewise for ‘Every politician is a liar’. But at least for now, let’s assume that in examples like these, ‘a’ does not itself indicate a propositional constituent.) Aristotle, predating the Stoics, noted that conditional claims like the following are sure to be true: if (the property of) being a politician belongs to every senator, and being deceitful belongs to every politician, then being deceitful belongs to every senator. Correspondingly, the inference pattern below is valid.

Every S is P, and every P is D; so every S is D.

And inference (2) seems to be valid because its parts exhibit this pattern. Aristotle discussed many such forms of inference, called syllogisms, involving propositions that can be expressed with quantificational words like ‘every’ and ‘some’. For example, the syllogistic patterns below are also valid.

Every S is P, and some S is D; so some P is D.

Some S is P, and every P is D; so some S is D.

Some S is not P, and every D is P; so some S is not D.

We can rewrite the last two, so that each of the valid syllogisms above is represented as having a first premise of the form ‘Every S is P’.

Every S is P, and some D is S; so some D is P.

Every S is P, and some D is not P; so some D is not S.

But however the inferences are represented, the important point is that the variables—represented here in italics—range over certain parts of propositions. Intuitively, common nouns like ‘politician’ and adjectives like ‘deceitful’ are general terms, since they can apply to more than one individual. And many propositions apparently contain correspondingly general elements. For example, the proposition that every senator is wealthy contains two such elements, both relevant to the validity of inferences involving this proposition.

Propositions thus seem to have structure that bears on the validity of inferences, even ignoring premises/conclusions with propositional parts. In this sense, even atomic propositions have logical form. And as Aristotle noted, pairs of such propositions can be related in interesting ways. If every S is P, then some S is P. (For these purposes, assume there is at least one S.) If no S is P, then some S is not P. It is certain that either every S is P or some S is not P; and whichever of these propositions is true, the other is false. Similarly, the following propositions cannot both be true: every S is P; and no S is P. But it isn’t certain that either every S is P, or no S is P. Perhaps some S is P, and some S is not P. This network of logical relations strongly suggests that the propositions in question contain a quantificational element and two general elements—and in some cases, an element of negation; see the entry on logic: classical. This raises the question of whether other propositions have a similar structure.
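This network of relations can be illustrated with a toy model, reading ‘every S is P’ as set inclusion and ‘some S is P’ as nonempty overlap. The individuals and sets below are invented for illustration:

```python
# Toy model of the Aristotelian quantifiers: general terms are finite
# sets; 'every S is P' is S a subset of P, 'some S is P' is a nonempty
# intersection, and 'no S is P' is an empty intersection.
def every(S, P): return S <= P
def some(S, P): return bool(S & P)
def no(S, P): return not (S & P)

senators = {"Ann", "Bo"}
politicians = {"Ann", "Bo", "Cy"}
deceitful = {"Ann", "Bo", "Cy", "Di"}

# The syllogism of (2): every S is P, and every P is D; so every S is D.
if every(senators, politicians) and every(politicians, deceitful):
    assert every(senators, deceitful)

# Given a nonempty S: every S is P implies some S is P.
if senators and every(senators, politicians):
    assert some(senators, politicians)

# 'Every S is P' and 'no S is P' cannot both be true (S nonempty).
assert not (every(senators, politicians) and no(senators, politicians))
```

The model is only a sketch of the logical relations; it builds in the assumption, flagged in the text, that there is at least one S.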

2. Propositions and Traditional Grammar

Consider the proposition that Vega is a star, which can figure in inferences like (8).

(8)
Every star is purple, and Vega is a star; so Vega is purple.

Aristotle’s logic focused on quantificational propositions; and as we shall see, this was prescient. But on his view, propositions like the conclusion of (8) still exemplify a subject-predicate structure that is shared by at least many of the sentences we use to express propositions. And one can easily formulate the schema ‘every S is P, and n is S; so n is P’, where the new lower-case variable is intended to range over proposition-parts of the sort indicated by names. (On some views, discussed below, a name like ‘Vega’ is a complex quantificational expression; though unsurprisingly, such views are tendentious.)

Typically, a declarative sentence can be divided into a subject and a predicate: ‘Every star / is purple’, ‘Vega / is a star’, ‘Some politician / lied’, ‘The brightest planet / is visible tonight’, etc. Until quite recently, it was widely held that this grammatical division reflects a corresponding kind of logical structure: the subject of a proposition (i.e., what the proposition is about) is a target for predication. On this view, both ‘Every star’ and ‘Vega’ indicate subjects of propositions in (8), while ‘is’ introduces predicates. Aristotle would have said that in the premises of (8), being purple is predicated of every star, and being a star is predicated of Vega. Later theorists emphasized the contrast between general terms like ‘star’ and singular terms like ‘Vega’, while also distinguishing terms from syncategorematic expressions (e.g., ‘every’ and ‘is’) that can combine with terms to form complex subjects and predicates, including ‘will lie’, ‘can lie’, and ‘may have lied’. But despite the complications, it seemed clear that many propositions have the following canonical form: Subject-copula-Predicate; where a copula links a subject, which may consist of a quantifier and a general term, to a general term. Sentences like ‘Every star twinkles’ can be paraphrased with sentences like ‘Every star is a twinkling thing’. This invites the suggestion that ‘twinkles’ is somehow an abbreviation for ‘is a twinkling thing’.

The proposition that not only Vega twinkles, which seems to contain the proposition that Vega twinkles, presumably includes elements that are indicated with ‘only’ and ‘not’. Such examples invite the hypothesis that all propositions are composed of terms along with a relatively small number of syncategorematic elements, and that complex propositions can be reduced to canonical propositions that are governed by Aristotelian logic. This is not to say that all propositions were, or could be, successfully analyzed in this manner. But via this strategy, medieval logicians were able to describe many impeccable inferences as instances of valid forms. And this informed their discussions of how logic is related to grammar.

Many viewed their project as an attempt to uncover principles of a mental language common to all thinkers. Aristotle had said, similarly, that spoken sounds symbolize “affections of the soul.” From this perspective, one expects to find some differences between propositions and overt sentences. If ‘Every star twinkles’ expresses a proposition that contains a copula, then spoken languages mask certain aspects of logical structure. William of Ockham held that a mental language would have no need for Latin’s declensions, and that logicians could ignore such aspects of spoken language. The ancient Greeks were aware of sophisms like the following: that dog is a father, and that dog is yours; so that dog is your father. This bad inference cannot share its form with the superficially parallel but impeccable variant: that dog is a mutt, and that mutt is yours; so that dog is your mutt. (See Plato, Euthydemus 298d–e.) So the superficial features of sentences are not infallible guides to the logical forms of propositions. Still, the divergence was held to be relatively minor. Spoken sentences have structure; they are composed, in systematic ways, of words. And the assumption was that spoken sentences reflect the major aspects of propositional form, including a subject-predicate division. So while there is a distinction between the study of valid inference and the study of sentences used in spoken language, the connection between logic and grammar was thought to run deep. This suggested that the logical form of a proposition just is the grammatical form of some (perhaps mental) sentence.

3. Motivations for Revision

Towards the end of the eighteenth century, Kant could say (without much exaggeration) that logic had followed a single path since its inception, and that “since Aristotle it has not had to retrace a single step.” He also said that syllogistic logic was “to all appearance complete and perfect.” But this was exuberance. The successes also highlighted problems that had been recognized.

Some valid schemata are reducible to others, in that any inference of the reducible form can be revealed as valid (with a little work) given other schemata. Consider (9).

(9)
If Al ran then either Al did not run or Bob did not swim, and Al ran; so Bob did not swim.

Assume that ‘Al did not run’ negates ‘Al ran’, while ‘Bob did not swim’ negates ‘Bob swam’. Then (9) is an instance of the following valid form: if A then either not-A or not-B, and A; so not-B. But we can treat this as a derived form, by showing that any instance of this form is valid given two (intuitively more basic) Stoic inference forms: if the first then the second, and the first; so the second; either not the first or not the second, and the first; so not the second. For suppose we are given the following premises: A; and if A, then either not-A or not-B. We can safely infer that either not-A or not-B; and since we were given A, we can safely infer not-B. Similarly, the syllogistic schema (10) can be treated as a derived form.

(10)
Some S is not P, and every D is P; so not every S is D.

If some S is not P, and every D is P, then it isn’t true that every S is D. For if every S is D, and every D is P, then every S is P. But if some S is not P, then as we saw above, not every S is P. So given the premises of (10), adding ‘every S is D’ would lead to contradiction: every S is P, and not every S is P. So the premises imply the negation of ‘every S is D’. This reasoning shows how (10) can be reduced to inferential patterns that seem more basic—raising the question of how much reduction is possible. Euclid’s geometry had provided a model for how to present a body of knowledge as a network of propositions that follow from a few basic axioms. Aristotle himself indicated how to reduce all the valid syllogistic schemata to four basic patterns, given a few principles that govern how the basic patterns can be used to derive others; see Parsons (2014) for discussion. And further reduction is possible given insights from the medieval period.

Consider the following pair of valid inferences: Fido is a brown dog, so Fido is a dog; Fido is not a dog, so Fido is not a brown dog. As illustrated with the first example, replacing a predicate (or general term) like ‘brown dog’ with a less restrictive predicate like ‘dog’ is often valid. But sometimes—paradigmatically, in cases involving negation—replacing a predicate like ‘dog’ with a more restrictive predicate like ‘brown dog’ is valid. Plausibly, the first pattern reflects the default direction of valid replacement: removing a restriction preserves truth, except in special cases like those involving negation. Suppose we take it as given that poodles are dogs of a particular sort, and hence that every poodle is a dog. Then replacing ‘poodle’ with ‘dog’ in ‘Fido is P’ is valid, regardless of what ‘Fido’ names. This can be viewed as a special case of ‘n is P, and every P is D; so n is D’. But the validity of this inference form can also be viewed as a symptom of a basic principle that came to be called dictum de omni: whatever is true of every P is true of any P. Or as Aristotle might have put it, if the property of being a dog belongs to every poodle, then it belongs to any poodle. In which case, Fido is a dog if Fido is a poodle. And since the property of being a dog surely belongs to every brown dog, any brown dog is a dog. The flip side of this point is that negation inverts the default direction of inference. Anything that isn’t a dog isn’t a brown dog; and similarly, if Fido isn’t a dog, then Fido isn’t a poodle. So in special cases, adding a restriction to a general term like ‘dog’ can preserve truth.

From this perspective, the Aristotelian quantifier ‘Some’ is a default-style quantifier that validates removing restrictions. If some brown dog is a clever mutt, it follows that some dog is a clever mutt, and hence that some dog is a mutt. By contrast, ‘No’ is an inverted-style quantifier that validates adding restrictions. If no dog is a mutt, it follows that no dog is a clever mutt, and hence that no brown dog is a clever mutt. The corresponding principle, dictum de nullo, encodes this pattern: whatever is true of no P is not true of any P; so if the property of being a mutt belongs to no dog, it belongs to no poodle. (And as Aristotle noted, instances of ‘No S is P’ can be analyzed as the propositional negations of corresponding instances of ‘Some S is P’.)

Interestingly, ‘Every’ is like ‘No’ in one respect, and like ‘Some’ in another respect. If every dog is clever, it follows that every brown dog is clever; but if every dog is a clever mutt, it follows that every dog is a mutt. So when the universal quantifier combines with a general term S to form a subject, S is governed by the inverted rule of replacement. But when a universally quantified subject combines with a second general term to form a proposition, this second term is governed by the default rule of replacement. Given that ‘Every’ has this mixed logical character, the valid syllogisms can be derived from two basic patterns (noted above), both of which reflect dictum de omni: whatever is true of every P is true of any P.

Every S is P, and every P is D; so every S is D.

Every S is P, and some D is S; so some D is P.

The first principle reflects the sense in which universal quantification is transitive. The second principle captures the idea that the universal premise can license replacement of ‘S’ with ‘P’ in a proposition about some individual. In this sense, classical logic exhibits a striking unity and simplicity, at least with regard to inferences involving the Aristotelian quantifiers and predication. For further discussion, see Sommers (1984), van Benthem (1986), Sanchez (1991, 1994), and Ludlow (2005).
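The default and inverted replacement directions discussed above amount to monotonicity properties of the determiners, which a small set-based sketch can make vivid. The sketch assumes the set-inclusion reading of the quantifiers, and the sets are invented for illustration:

```python
# 'Every S is P' as subset, 'some' as overlap, 'no' as disjointness.
def every(S, P): return S <= P
def some(S, P): return bool(S & P)
def no(S, P): return not (S & P)

dogs = {"Fido", "Rex", "Spot"}
brown_dogs = {"Fido", "Rex"}            # a restriction of 'dogs'
mutts = {"Fido", "Rex", "Spot", "Lassie"}
clever_mutts = {"Fido", "Rex", "Spot"}  # a restriction of 'mutts'

# 'Some' validates removing a restriction in either term:
if some(brown_dogs, clever_mutts):
    assert some(dogs, clever_mutts) and some(brown_dogs, mutts)

# 'No' validates adding one in either term:
if no(dogs, mutts):
    assert no(brown_dogs, clever_mutts)

# 'Every' is mixed: inverted in its first term, default in its second.
if every(dogs, clever_mutts):
    assert every(brown_dogs, clever_mutts)  # restrict the subject term
    assert every(dogs, mutts)               # relax the predicate term
```

The assertions mirror the text’s examples: removing a restriction is the default direction, ‘No’ inverts it, and ‘Every’ inverts it only in its first term.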

Alas, matters become more complicated once we consider relations.

Sentences like ‘Juliet kissed Romeo’ do not seem to have Subject-copula-Predicate form. One might suggest ‘Juliet was a kisser of Romeo’ as a paraphrase. But ‘kisser of Romeo’ differs, in ways that matter to inference, from general terms like ‘politician’. If Juliet (or anyone) was a kisser of Romeo, it follows that someone was kissed; whereas if Juliet was a politician, there is no corresponding logical consequence to the effect that someone was __-ed. Put another way, the proposition that Juliet kissed someone exhibits interesting logical structure, even if we can express this proposition via the sentence ‘Juliet was a kisser of someone’. A quantifier can be part of a complex predicate. But classical logic did not capture the validity of inferences involving predicates that have quantificational constituents. Consider (11).

(11)
Some patient respects every doctor, and some doctor is a liar; so some patient respects some liar.

If ‘respects every doctor’ and ‘respects some liar’ indicate nonrelational proposition-parts, much like ‘is sick’ or ‘is happy’, then inference (11) has the following form: ‘Some P is S, and some D is L; so some P is H’. But this schema, which fails to reflect the quantificational structure within the predicates, is not valid. Its instances include bad inferences like the following: some patient is sick, and some doctor is a liar; so some patient is happy. This dramatizes the point that ‘respects every doctor’ and ‘respects some liar’ are—unlike ‘is sick’ and ‘is tall’—logically related in a way that matters given the second premise of (11).
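The counterexample just given can be made concrete in a one-individual-per-term model; the names below are placeholders:

```python
# A countermodel to the coarse schema "Some P is S, and some D is L;
# so some P is H": treating 'respects every doctor' as an unanalyzed
# predicate loses the structure that makes (11) valid.
patients = {"p1"}
doctors = {"d1"}
sick = {"p1"}     # 'is sick' plays the role of the first predicate
liars = {"d1"}
happy = set()     # no patient is happy

premise1 = bool(patients & sick)      # some patient is sick
premise2 = bool(doctors & liars)      # some doctor is a liar
conclusion = bool(patients & happy)   # some patient is happy

print(premise1, premise2, conclusion)  # → True True False
```

True premises, false conclusion: the coarse schema has an invalid instance, as the text says.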

One can adopt the view that many propositions have relational parts, introducing a variable ‘R’ intended to range over relations; see the entries on medieval relations and medieval terms. One can also formulate the following schema: some P R every D, and some D is L; so some P R some L. But the problem remains. Quantifiers can appear in complex predicates that figure in valid inferences like (12).

(12)
Every patient who respects every doctor is sick, and
some patient who saw every lawyer respects every doctor; so
some patient who saw every lawyer is sick.

But if ‘patient who respects every doctor’ and ‘patient who saw every lawyer’ are nonrelational, much like ‘old patient’ or ‘young patient’, then (12) has the following form: every O is S, and some Y R every D; so some Y is S. And many inferences of this form are invalid. For example: every otter is sick, and some yak respects every doctor; so some yak is sick. Again, one can abstract a valid schema that covers (12), letting parentheses indicate a relative clause that restricts the adjacent predicate.

Every P (R1 every D) is S, and some P (R2 every L) R1 every D; so some P (R2 every L) is S.

But no matter how complex the schema, the relevant predicates can exhibit further quantificational structure. (Consider the proposition that every patient who met some doctor who saw no lawyer respects some lawyer who saw no patient who met every doctor.) Moreover, schemata like the one above are poor candidates for basic inference patterns.

As medieval logicians knew, propositions expressed with relative clauses also pose other difficulties; see the entry on medieval syllogism. If every doctor is healthy, it follows that every young doctor is healthy. By itself, this is expected, since a universally quantified subject licenses replacement of ‘doctor’ with the more restrictive predicate ‘young doctor’. But consider (13) and (14).

(13)
No patient who saw every young doctor is healthy.
(14)
No patient who saw every doctor is healthy.

Here, the direction of valid inference is from ‘young doctor’ to ‘doctor’, as if the inference is governed by the (default) inference rule that licenses replacement of ‘young doctor’ with the less restrictive predicate ‘doctor’. One can say that the default direction of implication, from more restrictive to less restrictive predicates, has been inverted twice—once by ‘No’, and once by ‘every’. But one wants a systematic account of propositional structure that explains the net effect; see Ludlow (2002) for further discussion. Sommers (1982) offers a strategy for recoding and extending classical logic, in part by exploiting an idea suggested by Leibniz (and arguably Pāṇini): a relational sentence like ‘Juliet loved Romeo’ somehow combines an active-voice sentence with a passive-voice sentence, perhaps along the lines of ‘Juliet loved, and thereby Romeo was loved’; cp. section 9. But one way or another, quantifiers need to be characterized in a way that captures their general logical role—and not just their role as potential subjects of Aristotelian propositions—if impeccability is to be revealed as a matter of form. Quantifiers are not simply devices for creating schemata like ‘Every S is P’, into which general terms like ‘politician’ and ‘deceitful’ can be inserted. Instances of ‘S’ and ‘P’ can themselves have quantificational structure and relational constituents.

4. Frege and Formal Language

Gottlob Frege showed how to resolve these difficulties for classical logic in one fell swoop. His system of logic, published in 1879 and still in use (with notational modifications), was arguably the single greatest contribution to the subject. So it is significant that on Frege’s view, propositions do not have subject-predicate form. Indeed, his leading idea was that propositions have “function-argument” structure. Frege thereby drew a substantial distinction between logical form and grammatical form as traditionally conceived. This had a major impact on subsequent discussions of thought and its relation to language. Though before turning to details, it is worth taking a slight detour to note that Frege did not think of functions as abstract objects like numbers.

Every function maps each entity in some domain onto exactly one entity in some range. But while every function thus determines a set of ordered pairs, Frege (1891) did not identify functions with such sets. He said that a function “by itself must be called incomplete, in need of supplementation, or unsaturated. And in this respect functions differ fundamentally from numbers” (p. 133). For example, we can represent the successor function as follows, with the natural numbers as the relevant domain for the variable ‘\(x\)’: \(S(x) = x + 1\). This function maps zero onto one, one onto two, and so on. So we can specify the set \(\{\langle x, y \rangle : y = x + 1\}\) as the “value-range” of the successor function. But according to Frege, any particular argument (e.g., the number one) “goes together with the function to make up a complete whole” (e.g., the number two); and a number does not go together with a set to form a unit in this fashion. Frege granted that the word ‘function’ is often used to talk about the sets he would call value-ranges. But he maintained that the notion of an “unsaturated” function, which may be applied to endlessly many arguments, is logically prior to any notion of a set with endlessly many elements that are specified functionally; see p. 135, note E. While the second positive integer is the successor of the first, the number two is still a single thing, distinct from any combination of a number with a set. Frege was influenced by Kant’s discussion of judgment, the kind of unity that a (structured) proposition exhibits, and the ancient observation that merely combining two things (e.g., Socrates and the property of being mortal) does not make the combination true or false.
If it helps, think about ‘\(S(x)\)’—or better, ‘\(S(\ )\)’—as the unsaturated result of abstracting away from the numerical argument in a complex denoting expression like ‘\(S(1)\)’ or ‘\(S(62)\)’; and think about saturating ‘\(S(\ )\)’ with a numerical argument, like ‘1’ or ‘62’, as a process of “de-abstraction.” So in saying that propositions have “function-argument” structure, Frege was not only rejecting the traditional idea that logical form reflects the subject-predicate structure of ordinary sentences, he was suggesting that propositions (and any of their complex constituents) exhibit a kind of unity that is like the unity of ‘\(S(1)\)’, which can appear in invented arithmetic sentences like ‘\(S(1) = 2\)’. Church (1941) echoed Frege by distinguishing functions-in-intension, which Church identified with computational procedures, from their extensions. Perhaps Frege would have said that even procedures, as abstractions of a special kind, are too object-like to be functions in his special sense. But distinct procedures can determine the same extension. (Compare adding one to a natural number \(n\) with the following procedure: take the positive square root of the result of adding one to \(n\) squared plus \(n\) doubled.) So at least in this sense, functions-in-intension can be distinguished from the extensions they determine; cp. Chomsky’s (1986) contrast between I-languages and E-languages.

For purposes of capturing valid arguments concerning relations, the more important point is that functions need not be unary. For example, arithmetic division can be represented as a function from ordered pairs of numbers onto quotients: \(Q(x, y) = \frac{x}{y}\). Mappings can also be conditional. Consider the function that maps every even integer onto itself, and every odd integer onto its successor: \(C(x) = x\) if \(x\) is even, and \(x + 1\) otherwise; \(C(1) = 2\), \(C(2) = 2\), \(C(3) = 4\), etc. Frege held that propositions have parts that correspond to functions, and in particular, conditional functions that map arguments onto special values that reflect the truth or falsity of propositions/sentences. (As discussed below, Frege [1892] also distinguished these “truth values” from what he called Thoughts [Gedanken] or the “senses” [Sinne] of propositions; where each of these sentential senses “presents” a truth value in a certain way—i.e., as the value of a certain indicated function given a certain indicated argument.)
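For concreteness, the two functions \(Q\) and \(C\) can be transcribed directly (here as Python functions, which Frege would of course distinguish from his “unsaturated” functions, since a procedure is itself a kind of object):

```python
# The binary division function Q and the conditional mapping C from
# the text: C sends every even integer to itself and every odd
# integer to its successor.
def Q(x, y):
    return x / y

def C(x):
    return x if x % 2 == 0 else x + 1

print(Q(6, 3), C(1), C(2), C(3))  # → 2.0 2 2 4
```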

Variable letters, such as ‘\(x\)’ and ‘\(y\)’ in ‘\(Q(x, y) = \frac{x}{y}\)’, are typographically convenient for representing functions that take more than one argument. But we could also index argument places, as shown below.

\[Q[(\ )_i, (\ )_j] = \frac{(\ )_i}{(\ )_j}\]

Or we could replace the subscripts above with lines that connect each pair of round brackets on the left of ‘\(=\)’ to a corresponding pair of brackets on the right. But the idea, however we encode it, is that a proposition has at least one constituent that is saturated by the requisite number of arguments.

On Frege’s view, the proposition that Mary sang has a functional component corresponding to ‘sang’ and an argument corresponding to ‘Mary’, even if the English sentence ‘Mary sang’ has ‘Mary’ as its subject and ‘sang’ as its predicate. The proposition can be represented as follows: \(\textrm{Sang}(\textrm{Mary})\). Frege thought of the relevant function as a conditional mapping from individuals to truth values: \(\textrm{Sang}(x) = \textbf{T}\) if \(x\) sang, and \(\textbf{F}\) otherwise; where ‘\(\textbf{T}\)’ and ‘\(\textbf{F}\)’ stand for special entities such that for each individual \(x\), \(\textrm{Sang}(x) = \textbf{T}\) if and only if \(x\) sang, and \(\textrm{Sang}(x) = \textbf{F}\) if and only if \(x\) did not sing. According to Frege, the proposition that John admires Mary combines an ordered pair of arguments with a functional component that corresponds to the transitive verb: \(\textrm{Admires}(\textrm{John}, \textrm{Mary})\); where for any individual \(x\), and any individual \(y\), \(\textrm{Admires}(x, y) = \textbf{T}\) if \(x\) admires \(y\), and \(\textbf{F}\) otherwise. From this perspective, the structure and constituents are the same in the proposition that Mary is admired by John, even though ‘Mary’ is the grammatical subject of the passive sentence. Likewise, Frege did not distinguish the proposition that three precedes four from the proposition that four is preceded by three. More importantly, Frege’s treatment of quantified propositions departs radically from the traditional idea that the grammatical structure of a sentence reflects the logical structure of the indicated proposition.

If \(S\) is the function corresponding to ‘sang’, then Mary sang iff—i.e., if and only if—\(S(\textrm{Mary}) = \textbf{T}\). Likewise, someone sang iff: \(S\) maps some individual onto \(\textbf{T}\); that is, for some individual \(x\), \(S(x) = \textbf{T}\). Or using a modern variant of Frege’s original notation, someone sang iff \(\exists x [S(x)]\). The quantifier ‘\(\exists x\)’ is said to bind the variable ‘\(x\)’, which ranges over individual things in a domain of discourse. (For now, assume that the domain contains only people.) If every individual in the domain sang, then \(S\) maps every individual onto the truth value \(\textbf{T}\); or using formal notation, \(\forall x [S(x)]\). A quantifier binds each occurrence of its variable, as in ‘\(\exists x [P(x) \land D(x)]\)’, which reflects the logical form of ‘Someone is both a politician and deceitful’. In this last example, the quantifier combines with a complex functional component that is formed by conjoining two simpler ones.
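On this treatment, quantified claims can be evaluated over a finite domain by checking the function’s values; a sketch in Python (the domain and its members are invented for illustration):

```python
# Hypothetical domain of discourse containing only people.
domain = {"Mary", "John", "Chris"}
sang = {"Mary", "John"}            # the individuals S maps onto T

S = lambda x: x in sang            # S(x) = T iff x sang

someone_sang = any(S(x) for x in domain)    # ∃x[S(x)]
everyone_sang = all(S(x) for x in domain)   # ∀x[S(x)]

assert someone_sang                # S maps some individual onto T
assert not everyone_sang           # but not every individual
```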

With regard to the proposition that some politician is deceitful, traditional grammar suggests the division ‘Some politician / is deceitful’, with the noun ‘politician’ combining with the quantificational word to form a complex subject. But on a Fregean view, grammar masks the logical division between the existential quantifier and the rest: \(\exists x [P(x) \land D(x)]\). With regard to the proposition that every politician is deceitful, Frege also stresses the logical division between the quantifier and its scope: \(\forall x [P(x) \rightarrow D(x)]\); every individual is deceitful if a politician. Here too, the quantifier combines with a complex functional component, albeit one that is conditional rather than conjunctive. (The formal sentence ‘\(\forall x [P(x) \land D(x)]\)’ implies, unconditionally, that every individual is a politician.) As Frege (1879) defined his analogs of the modern symbols used here, ‘\(P(x) \rightarrow D(x)\)’ is equivalent to ‘\(\lnot P(x) \lor D(x)\)’, and ‘\(\forall x\)’ is equivalent to ‘\(\lnot \exists x \lnot\)’. So ‘\(\forall x [P(x) \rightarrow D(x)]\)’ is equivalent to ‘\(\lnot \exists x \lnot[\lnot P(x) \lor D(x)]\)’; and given de Morgan’s Laws (concerning the relations between negation, disjunction, and conjunction), \(\lnot \exists x \lnot [\lnot P(x) \lor D(x)]\) iff \(\lnot \exists x [P(x) \land \lnot D(x)]\). Hence, \(\forall x [P(x) \rightarrow D(x)]\) iff \(\lnot \exists x [P(x) \land \lnot D(x)]\). This captures the idea that every politician is deceitful iff no individual is both a politician and not deceitful.
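The equivalence of ‘\(\forall x [P(x) \rightarrow D(x)]\)’ and ‘\(\lnot \exists x [P(x) \land \lnot D(x)]\)’ can be spot-checked exhaustively over every assignment of truth values to \(P\) and \(D\) on a small finite domain (a finite check, not a proof):

```python
from itertools import product

# Over a three-element domain, enumerate every extension of P and D
# and confirm that the two formulations always agree in truth value.
domain = range(3)
extensions = list(product([False, True], repeat=len(domain)))

for P in extensions:
    for D in extensions:
        universal = all((not P[x]) or D[x] for x in domain)        # ∀x[P(x) → D(x)]
        no_exception = not any(P[x] and not D[x] for x in domain)  # ¬∃x[P(x) ∧ ¬D(x)]
        assert universal == no_exception
```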

If this conception of logical form is correct, then grammar is misleading in several respects. First, grammar leads us to think that ‘some politician’ indicates a constituent of the proposition that some politician is deceitful. Second, grammar masks a difference between existential and universally quantified propositions; predicates are related conjunctively in the former, and conditionally in the latter. (Though as discussed in section seven, one can—and Frege [1884] did—adopt a different view that allows for relational/restricted quantifiers, as in ‘\(\forall x{:}P(x) [D(x)]\)’.)

More importantly, Frege’s account was designed to apply equally well to propositions involving relations and multiple quantifiers. And with regard to these propositions, there seems to be a big difference between logical structure and grammatical structure.

On Frege’s view, a single quantifier can bind an unsaturated position that is associated with a function that takes a single argument. But it is equally true that two quantifiers can bind two unsaturated positions associated with a function that takes a pair of arguments. For example, the proposition that everyone likes everyone can be represented with the formal sentence ‘\(\forall x \forall y [L(x, y)]\)’. Assuming that ‘Romeo’ and ‘Juliet’ indicate arguments, it follows that Romeo likes everyone, and that everyone likes Juliet—\(\forall y [L(r, y)]\) and \(\forall x [L(x, j)]\). And it follows from all three propositions that Romeo likes Juliet: \(L(r, j)\). The rules of inference for Frege’s logic capture this general feature of the universal quantifier. A variable bound by a universal quantifier can be replaced with a name for some individual in the domain. Correlatively, a name can be replaced with a variable bound by an existential quantifier. Given that Romeo likes Juliet, it follows that someone likes Juliet, and Romeo likes someone. Frege’s formalism can capture this as well: \(L(r, j)\); so \(\exists x [L(x, j)] \land \exists x [L(r, x)]\). And given either conjunct in the conclusion, it follows that someone likes someone: \(\exists x \exists y [L(x, y)]\). A single quantifier can also bind multiple argument positions, as in ‘\(\exists x [L(x, x)]\)’, which is true iff someone likes herself. Putting these points schematically: \(\forall x (\dots x \dots)\), so \(\dots n \dots\); and \(\dots n \dots\), so \(\exists x (\dots x \dots)\).

Mixed quantification introduces an interesting wrinkle. The propositions expressed with ‘\(\exists x \forall y [L(x, y)]\)’ and ‘\(\forall y \exists x [L(x, y)]\)’ differ. We can paraphrase the first as ‘there is someone who likes everyone’ and the second as ‘everyone is liked by someone or other’. The second follows from the first, but not vice versa. This suggests that ‘someone likes everyone’ is ambiguous, in that this string of English words can be used to express two different propositions. This in turn raises difficult questions about what natural language expressions are, and how they can be used to express propositions; see section eight. But for Frege, the important point concerned the distinction between the propositions (Gedanken). Similar remarks apply to ‘\(\forall x \exists y [L(x, y)]\)’ and ‘\(\exists y \forall x [L(x, y)]\)’.
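The asymmetry between the two scope orders can be exhibited in toy models (invented two-individual domains):

```python
domain = {0, 1}

def exists_forall(L):              # ∃x∀y[L(x, y)]
    return any(all(L(x, y) for y in domain) for x in domain)

def forall_exists(L):              # ∀y∃x[L(x, y)]
    return all(any(L(x, y) for x in domain) for y in domain)

# Model 1: individual 0 likes everyone -- both readings come out true.
likes_all = lambda x, y: x == 0
assert exists_forall(likes_all) and forall_exists(likes_all)

# Model 2: everyone likes only herself -- everyone is liked by someone,
# but no one likes everyone, so the converse inference fails.
self_likes = lambda x, y: x == y
assert forall_exists(self_likes) and not exists_forall(self_likes)
```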

A related phenomenon is exhibited by ‘John danced if Mary sang and Chris slept’. Is the intended proposition of the form ‘(A if B) and C’ or ‘A if (B and C)’? Indeed, it seems that the relation between word-strings and propositions expressed is often one-to-many. Is someone who says ‘The artist drew a club’ talking about a sketch or a card game? One can use ‘is’ to express identity, as in ‘Hesperus is the planet Venus’; but in ‘Hesperus is bright’, ‘is’ indicates predication. In ‘Hesperus is a planet’, ‘a’ seems to be logically inert; yet in ‘John saw a planet’, ‘a’ seems to indicate existential quantification: \(\exists x [P(x) \land S(j, x)]\). (One can render ‘Hesperus is a planet’ as ‘\(\exists x [P(x) \land h = x]\)’. But this treats ‘is a planet’ as importantly different from ‘is bright’; and this leads to other difficulties.) According to Frege, such ambiguities provide further evidence that natural language is not suited to the task of representing propositions and inferential relations perspicuously. (Leibniz and others had envisioned a “Characteristica Universalis”, but without detailed proposals for how to proceed beyond syllogistic logic in creating one.) This is not to deny that natural language is well suited for other purposes, perhaps including efficient human communication. And Frege held that we often do use natural language to express propositions. But he suggested that natural language is like the eye, whereas a good formal language is like a microscope that reveals structure not otherwise observable. On this view, the logical form of a proposition is made manifest by the structure of a sentence in an ideal formal language—what Frege called a Begriffsschrift (concept-script); where the sentences of such a language exhibit function-argument structures that differ in kind from the grammatical structures exhibited by the sentences we use in ordinary communication.

The real power of Frege’s strategy for representing propositional structure is most evident in his discussions of proofs by induction, the Dedekind-Peano axioms for arithmetic, and how the proposition that every number has a successor is logically related to more basic truths of arithmetic; see the entry on Frege’s theorem and foundations for arithmetic. But without getting into these details, one can get a sense of Frege’s improvement on previous logic by considering (15–16) and Fregean analyses of the corresponding propositions.

(15)
Every patient respects some doctor
\(\forall x \{P(x) \rightarrow \exists y [D(y) \land R(x,y)]\}\)
(16)
Every old patient respects some doctor
\(\forall x \{[O(x) \land P(x)] \rightarrow \exists y [D(y) \land R(x,y)]\}\)

Suppose that every individual\(_x\) has the following conditional property: if he\(_x\) is a patient, then some individual\(_y\) is such that she\(_y\) is both a doctor and respected by him\(_x\). Then it follows—intuitively and given the rules of Frege’s logic—that every individual\(_x\) has the following conditional property: if he\(_x\) is both old and a patient, then some individual\(_y\) is such that she\(_y\) is both a doctor and respected by him\(_x\). So the proposition expressed with (16) follows from the one expressed with (15). More interestingly, we can also account for why the proposition expressed with (14) follows from the one expressed with (13).

(13)
No patient who saw every young doctor is healthy
\(\neg \exists x \{P(x) \land \forall y \{[Y(y) \land D(y)] \rightarrow S(x,y)\} \land H(x)\}\)
(14)
No patient who saw every doctor is healthy
\(\neg \exists x \{P(x) \land \forall y [D(y) \rightarrow S(x,y)] \land H(x)\}\)

For suppose it is false that some individual has the following conjunctive property: he\(_x\) is a patient; and he\(_x\) saw every young doctor (i.e., every individual\(_y\) is such that if she\(_y\) is a young doctor, then he\(_x\) saw her\(_y\)); and he\(_x\) is healthy. Then intuitively, and also given the rules of Frege’s logic, it is false that some individual has the following conjunctive property: he\(_x\) is a patient; and he\(_x\) saw every doctor; and he\(_x\) is healthy. This captures the fact that the direction of valid inference is from ‘every young doctor’ in (13) to ‘every doctor’ in (14), despite the fact that in simpler cases, replacing ‘every doctor’ with ‘every young doctor’ is valid. More generally, Frege’s logic handles a wide range of inferences that had puzzled medieval logicians. But the Fregean logical forms seem to differ dramatically from the grammatical forms of sentences like (13–16). Frege concluded that we need a Begriffsschrift, distinct from the languages we naturally speak, to depict (and help us discern) the structures of the propositions we can somehow express by using ordinary sentences in contexts.
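The entailment from (13) to (14) can be spot-checked by brute force over every model on a two-element domain; a sketch (a finite check of the entailment, not a proof of validity):

```python
from itertools import product

def check_13_entails_14(dom=(0, 1)):
    """Exhaustively verify, over every model on a small domain, that
    (13) not-Ex{P(x) & Ay[(Y(y) & D(y)) -> S(x,y)] & H(x)} entails
    (14) not-Ex{P(x) & Ay[D(y) -> S(x,y)] & H(x)}."""
    unary = list(product([False, True], repeat=len(dom)))
    binary = list(product([False, True], repeat=len(dom) ** 2))
    for P, H, Y, D in product(unary, repeat=4):
        for S_flat in binary:
            S = lambda x, y: S_flat[len(dom) * x + y]
            saw_all_young_docs = lambda x: all(not (Y[y] and D[y]) or S(x, y) for y in dom)
            saw_all_docs = lambda x: all(not D[y] or S(x, y) for y in dom)
            prop13 = not any(P[x] and saw_all_young_docs(x) and H[x] for x in dom)
            prop14 = not any(P[x] and saw_all_docs(x) and H[x] for x in dom)
            if prop13 and not prop14:
                return False       # a countermodel would refute the entailment
    return True

assert check_13_entails_14()
```

The check succeeds because anyone who saw every doctor thereby saw every young doctor; so the patients quantified over in (14) form a subset of those quantified over in (13).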

Frege also made a different kind of contribution, which would prove important, to the study of propositions. In early work, he spoke as though propositional constituents were the relevant functions and (ordered n-tuples of) entities that such functions map to truth-values. But he later refined this view in light of his distinction between Sinn and Bedeutung; see the entry on Frege. The Sinn of an expression was said to be a “way of presenting” the corresponding Bedeutung, which might be an entity (with truth-values as special cases of entities) or a function from (ordered n-tuples of) entities to truth-values. The basic idea is that two names, like ‘Hesperus’ and ‘Phosphorus’, can present the same Bedeutung in different ways; in which case, the Sinn of the first name differs from the Sinn of the second. Given this distinction, we can think of ‘Hesperus’ as an expression that presents Venus as the evening star, while ‘Phosphorus’ presents Venus as the morning star. Likewise, we can think of ‘is bright’ as an expression that presents a certain function in a certain way, and ‘Hesperus is bright’ as a sentence that presents its truth-value in a certain way—i.e., as the value of the function in question given the argument in question. From this perspective, propositions are sentential ways of presenting truth-values, and proposition-parts are subsentential ways of presenting functions and arguments. Frege could thus distinguish the proposition that Hesperus is bright from the proposition that Phosphorus is bright, even though the two propositions are alike with regard to the relevant function and argument. Likewise, he could distinguish the trivial proposition that Hesperus is Hesperus from the (apparently nontrivial) proposition that Hesperus is Phosphorus. This is an attractive view. For intuitively, the inference ‘Hesperus is Hesperus, so Hesperus is Phosphorus’ is not an instance of the following obviously valid schema: A, so A. But this raised questions about what the Sinn of an expression really is, what “presentation” could amount to, and what to say about a name with no Bedeutung.

5. Descriptions and Analysis

Frege did not distinguish (or at least did not emphasize any distinction between) names like ‘John’ and descriptions like ‘the boy’ or ‘the tall boy from Canada’. Initially, both kinds of expression seem to indicate arguments, as opposed to functions. So one might think that the logical form of ‘The boy sang’ is simply ‘\(S(b)\)’, where ‘\(b\)’ is an unstructured symbol that stands for the boy in question (and presents him in a certain way). But this makes the elements of a description logically irrelevant. And this seems wrong. If the tall boy from Canada sang, then some boy from Canada sang. Moreover, ‘the’ implies uniqueness in a way that ‘some’ does not. Of course, one can say ‘The boy sang’ without denying that the universe contains more than one boy. But likewise, in ordinary conversation, one can say ‘Everything is in the trunk’ without denying that the universe contains some things not in the trunk. And intuitively, a speaker who uses ‘the’ does imply that the adjacent predicate is satisfied by exactly one contextually relevant thing.

Bertrand Russell held that these implications reflect the logical form of a proposition expressed (in a given context) with a definite description. On his view, ‘The boy sang’ has the following logical form:

\[\exists x \{\textrm{Boy}(x) \land \forall y [\textrm{Boy}(y) \rightarrow y = x] \land S(x)\}\]

some individual\(_x\) is such that he\(_x\) is a boy, and every (relevant) individual\(_y\) is such that if he\(_y\) is a boy, then he\(_y\) is identical with him\(_x\), and he\(_x\) sang. The awkward middle conjunct was Russell’s way of expressing uniqueness with Fregean tools; cf. section seven. But rewriting the middle conjunct would not affect Russell’s technical point, which is that ‘the boy’ does not correspond to any constituent of the formalism. This in turn reflects Russell’s central claim—viz., that while a speaker may refer to a certain boy in saying ‘The boy sang’, the boy in question is not a constituent of the proposition indicated. According to Russell, the proposition has the form of an existential quantification with a bound variable. It does not have the form of a function saturated by (an argument that is) the boy referred to. The proposition is general rather than singular. In this respect, ‘the boy’ is like ‘some boy’ and ‘every boy’; though on Russell’s view, not even ‘the’ indicates a constituent of the proposition expressed.
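Russell’s quantificational analysis can be rendered as a sketch over an invented finite domain; the helper `the_F_is_G` (a name coined here for illustration) encodes \(\exists x \{F(x) \land \forall y [F(y) \rightarrow y = x] \land G(x)\}\):

```python
def the_F_is_G(domain, F, G):
    """Russell's analysis: some x is F, nothing else is F, and x is G."""
    return any(F(x) and all((not F(y)) or y == x for y in domain) and G(x)
               for x in domain)

domain = {"a", "b", "c"}
boy = lambda x: x == "a"                 # exactly one boy in this model
sang = lambda x: x in {"a", "b"}

assert the_F_is_G(domain, boy, sang)     # 'The boy sang' comes out true

two_boys = lambda x: x in {"a", "c"}     # with two boys, uniqueness fails
assert not the_F_is_G(domain, two_boys, sang)
```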

This extended Frege’s idea that natural language misleads us about the structure of the propositions we assert. Russell went on to apply this hypothesis to what became a famous puzzle. Even though France is currently kingless, ‘The present king of France is bald’ can be used to express a proposition. The sentence is not meaningless; it has implications. So if the proposition consists of a function indicated with ‘\(\textrm{Bald}(\ )\)’ and an argument indicated with ‘The present king of France’, there must be an argument that is indicated. But appeal to nonexistent kings is, to say the least, dubious. Russell concluded that ‘The present king of France is bald’ expresses a quantificational proposition:

\[\exists x \{K(x) \land \forall y [K(y) \rightarrow y = x] \land B(x)\};\]

where \(K(x) = \textbf{T}\) iff \(x\) is a present king of France, and \(B(x) = \textbf{T}\) iff \(x\) is bald. (For present purposes, set aside worries about the vagueness of ‘bald’.) And as Russell noted, the following contrary reasoning is spurious: every proposition is true or false; so the present king of France is bald or not; so there is a king of France, and he is either bald or not. For let \(P\) be the proposition that the king of France is bald. Russell held that \(P\) is indeed true or false. On his view, it is false. Given that \(\neg \exists x [K(x)]\), it follows that

\[\neg \exists x \{K(x) \land \forall y [K(y) \rightarrow y = x] \land B(x)\}.\]

But it does not follow that there is a present king of France who is either bald or not. Given that \(\neg \exists x [K(x)]\), it hardly follows that

\[\exists x \{K(x) \land [B(x) \lor \neg B(x)]\}.\]

So we must not confuse the negation of \(P\) with the following false proposition:

\[\exists x \{K(x) \land \forall y [K(y) \rightarrow y = x] \land \neg B(x)\}.\]

The ambiguity of natural language may foster such confusion, given examples like ‘The present king of France is bald or not’. But according to Russell, puzzles about “nonexistence” can be resolved without special metaphysical theses, given the right views about logical form and natural language.
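The scope point can be illustrated in a kingless toy model (invented individuals, for illustration): the wide-scope negation of \(P\) comes out true, while the narrow-scope variant, that there is a unique present king of France who is not bald, comes out false.

```python
domain = {"a", "b"}
K = lambda x: False                 # no one is a present king of France
B = lambda x: x == "b"              # 'bald', arbitrarily assigned

def unique_K_satisfies(pred):
    """Ex{K(x) & Ay[K(y) -> y = x] & pred(x)}"""
    return any(K(x) and all((not K(y)) or y == x for y in domain) and pred(x)
               for x in domain)

P = unique_K_satisfies(B)                          # 'The king of France is bald'
narrow = unique_K_satisfies(lambda x: not B(x))    # '...is not bald' (narrow scope)

assert not P         # P is false, so its wide-scope negation is true
assert not narrow    # yet the narrow-scope negation is false as well
```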

This invited the thought that other philosophical puzzles might dissolve if we properly understood the logical forms of our claims. Ludwig Wittgenstein argued, in his influential Tractatus Logico-Philosophicus, that: (i) the very possibility of meaningful sentences, which can be true or false depending on how the world is, requires propositions with structures of the sort that Frege and Russell were getting at; (ii) all propositions are logical compounds of—and thus analyzable into—atomic propositions that are inferentially independent of one another; though (iii) even simple natural language sentences may correspond to very complex propositions; and (iv) the right analyses would, given a little reflection, reveal all philosophical puzzles as confusions about how language is related to the world. Wittgenstein later noted that examples like ‘This is red’ and ‘This is yellow’ present difficulties for his earlier view. (If the expressed propositions are unanalyzable, and thus logically independent, each should be compatible with the other; but at least so far, no one has provided a plausible analysis that accounts for the apparent impeccability of ‘This is red, so this is not yellow’. This raises questions about whether all inferential security is due to logical form.) And in any case, Russell did not endorse (iv). But he did say, for reasons related to certain epistemological puzzles, that (a) we are directly acquainted with the constituents of those propositions into which every proposition (that we can grasp) can be analyzed; (b) at least typically, we are not directly acquainted with the mind-independent bearers of proper names; and so (c) the things we typically refer to with names are not constituents of basic propositions.

This led Russell to say that natural language names are disguised descriptions. On this view, ‘Hesperus’ is semantically associated with a complex predicate—say, for illustration, a predicate of the form ‘\(E(x) \land S(x)\)’, suggesting ‘evening star’. In which case, ‘Hesperus is bright’ expresses a proposition of the form

\[\exists x \{[E(x) \land S(x)] \land \forall y \{[E(y) \land S(y)] \rightarrow y = x\} \land B(x)\}.\]

It also follows that Hesperus exists iff \(\exists x [E(x) \land S(x)]\); and this would be challenged by Kripke (1980) and others; see the entries on rigid designators and names. But by analyzing names as descriptions—quantificational expressions, as opposed to logical constants (like ‘\(b\)’) that indicate individuals—Russell offered an attractive account of why the proposition that Hesperus is bright differs from the proposition that Phosphorus is bright. Instead of saying that propositional constituents are Fregean senses, Russell could say that ‘Phosphorus is bright’ expresses a proposition of the form

\[\exists x \{[M(x) \land S(x)] \land \forall y \{[M(y) \land S(y)] \rightarrow y = x\} \land B(x)\};\]

where ‘\(E(x)\)’ and ‘\(M(x)\)’ indicate different functions, specified (respectively) in terms of evenings and mornings. This leaves room for the discovery that the complex predicates ‘\(E(x) \land S(x)\)’ and ‘\(M(x) \land S(x)\)’ both indicate functions that map Venus and nothing else to the truth-value \(\textbf{T}\). The hypothesis was that the propositions expressed with ‘Hesperus is bright’ and ‘Phosphorus is bright’ have different (fundamental) constituents, even though Hesperus is Phosphorus, but not because propositional constituents are “ways of presenting” Bedeutungen. Similarly, the idea was that the propositions expressed with ‘Hesperus is Hesperus’ and ‘Hesperus is Phosphorus’ differ, because only the latter has predicational/unsaturated constituents corresponding to ‘Phosphorus’. Positing unexpected logical forms seemed to have explanatory payoffs.

Questions about names and descriptions are also related to psychological reports, like ‘Mary thinks Venus is bright’, which present puzzles of their own; see the entry on propositional attitude reports. Such reports seem to indicate propositions that are neither atomic nor logical compounds of simpler propositions. For as Frege noted, replacing one name with another name for the same object can apparently affect the truth of a psychological report. If Mary fails to know that Hesperus is Venus, she might think Venus is a planet without thinking Hesperus is a planet; though cp. Soames (1987, 1995, 2002) and see the entry on singular propositions. Any function that has the value \(\textbf{T}\) given Venus as argument has the value \(\textbf{T}\) given Hesperus as argument. So Frege, Russell, and Wittgenstein all held—in varying ways—that psychological reports are also misleading with respect to the logical forms of the indicated propositions.

6. Regimentation and Communicative Slack

Within the analytic tradition inspired by these philosophers, it became a commonplace that logical form and grammatical form typically diverge, often in dramatic ways. This invited attempts to provide both analyses of propositions and claims about natural language, with the aim of saying how relatively simple sentences (with subject-predicate structures) could be used to express propositions (with function-argument structures).

The logical positivists explored the idea that the meaning of a sentence is a procedure for determining the truth or falsity of that sentence. From this perspective, studies of linguistic meaning and propositional structure still dovetail, even if natural language employs “conventions” that make it possible to indicate complex propositions with grammatically simple sentences; see the entry on analysis. But to cut short a long and interesting story, there was little success in formulating “semantic rules” that were plausible both as (i) descriptions of how ordinary speakers understand sentences of natural language, and (ii) analyses that revealed logical structure of the sort envisioned. (And until Montague [1970], discussed briefly in the next section, there was no real progress in showing how to systematically associate quantificational constructions of natural language with Fregean logical forms.)

Rudolf Carnap, one of the leading positivists, responded to difficulties facing his earlier views by developing a sophisticated position according to which philosophers could (and should) articulate alternative sets of conventions for associating sentences of a language with propositions. Within each such language, the conventions would determine what follows from what. But one would have to decide, on broadly pragmatic grounds, which interpreted language was best for certain purposes (like conducting scientific inquiry). On this view, questions about “the” logical form of an ordinary sentence are in part questions about which conventions one should adopt. The idea was that “internal” to any logically perspicuous linguistic scheme, there would be an answer to the question of how two sentences are inferentially related. But “external” questions, about which conventions we should adopt, would not be settled by descriptive facts about how we understand languages that we already use.

This was, in many ways, an attractive development of Frege’s vision. But it also raised a skeptical worry. Perhaps the structural mismatches between sentences of a natural language and sentences of a Fregean Begriffsschrift are so severe that one cannot formulate general rules for associating the sentences we ordinarily use with propositions. Later theorists would combine this view with the idea that propositions are sentences of a mental language that is relevantly like Frege’s invented language and relevantly unlike the spoken languages humans use to communicate; see Fodor (1975, 1978). But given the rise of behaviorism, both in philosophy and psychology, this variant on a medieval idea was initially ignored or ridiculed. (And it does face difficulties; see section eight.)

Willard Van Orman Quine combined behaviorist psychology with a normative conception of logical form similar to Carnap’s. The result was an influential view according to which there is no fact of the matter about which proposition a speaker/thinker expresses with a sentence of natural language, because talk of propositions is (at best) a way of talking about how we should regiment our verbal behavior for certain purposes—and in particular, for purposes of scientific inquiry. On this view, claims about logical form are evaluative, and such claims are underdetermined by the totality of facts concerning speakers’ dispositions to use language. From this perspective, mismatches between logical and grammatical form are to be expected, and we should not conclude that ordinary speakers have mental representations that are isomorphic with sentences of a Fregean Begriffsschrift.

According to Quine, speakers’ behavioral dispositions constrain what can be plausibly said about how to best regiment their language. He also allowed for some general constraints on interpretability that an idealized “field linguist” might impose in coming up with a regimented interpretation scheme. (Donald Davidson developed a similar line of thought in a less behavioristic idiom, speaking in terms of constraints on a “Radical Interpreter,” who seeks “charitable” construals of alien speech.) But unsurprisingly, this left ample room for “slack” with respect to which logical forms should be associated with a given sentential utterance.

Quine also held that decisions about how to make such associations should be made holistically. As he sometimes put it, the “unit of translation” is an entire language, not a particular sentence. On this view, one can translate a sentence S of a natural language NL with a structurally mismatching sentence µ of a formal language FL, even if it seems (locally) implausible that S is used to express the proposition associated with µ, so long as the following condition is met: the association between S and µ is part of a general account of NL and FL that figures in an overall theory—which includes an account of language, logic, and the language-independent world—that is among the best overall theories available. This holistic conception of how to evaluate proposed regimentations of natural language was part and parcel of Quine’s criticism of the early positivists’ analytic-synthetic distinction, and his more radical suggestion that there is no such distinction.

The suggestion was that even apparently tautologous sentences, like ‘Bachelors are unmarried’ and ‘Caesar died if Brutus killed him’, have empirical content. These may be among the last sentences we would dissent from, faced with recalcitrant experience; we may prefer to say that Caesar didn’t really die, or that Brutus didn’t really kill him, if the next best alternative is to deny the conditional claim. But for Quine, every meaningful claim is a claim that could turn out to be false—and so a claim we must be prepared, at least in principle, to reject. Correlatively, no sentences are known to be true simply by knowing what they mean and knowing a priori that sentences with such meanings must be true.

For present purposes, we can abstract away from the details of debates about whether Quine’s overall view was plausible. Here, the important point is that claims about logical form were said to be (at least partly) claims about the kind of regimented language we should use, not claims about the propositions actually expressed with sentences of natural language. And one aspect of Quine’s view, about the kind of regimented language we should use, turned out to be especially important for subsequent discussions of logical form. For even among those who rejected the behavioristic assumptions that animated Quine’s conception of language, it was often held that logical forms are expressions of a first-order predicate calculus.

Frege’s Begriffsschrift, recall, was designed to capture the Dedekind-Peano axioms for arithmetic, including the axiom of induction; see the entry on Frege’s theorem and foundations for arithmetic. This required quantification into positions occupiable by predicates, as well as positions occupiable by names. Using modern notation, Frege allowed for formulae like ‘\((Fa \land Fb) \rightarrow \exists X (Xa \land Xb)\)’ and ‘\(\forall x \forall y [x = y \leftrightarrow \forall X (Xx \leftrightarrow Xy)]\)’. And he took second-order quantification to be quantification over functions. This is to say, for example, that ‘\(\exists X (Xa \land Xb)\)’ is true iff: there is a function, \(X\), that maps both the individual called ‘\(a\)’ and the individual called ‘\(b\)’ onto the truth-value \(\textbf{T}\). Frege also took it to be a truth of logic that for any predicate \(P\), there is a function such that for each individual \(x\), that function maps \(x\) to \(\textbf{T}\) iff \(x\) satisfies (or “falls under”) \(P\). In which case, for each predicate, there is the set of all and only the things that satisfy the predicate. The axioms for Frege’s logic thus generated Russell’s paradox, given predicates like ‘is not a member of itself’. This invited attempts to weaken the axioms, while preserving second-order quantification. But for various reasons, Quine and others advocated a restriction to a first-order fragment of Frege’s logic, disallowing quantification into positions occupied by predicates. (Kurt Gödel had proved the completeness of first-order predicate calculus, thus providing a purely formal criterion for what followed from what in that language. Quine also held that second-order quantification illicitly treated predicates as names for sets, thereby spoiling Frege’s conception of propositions as unified by virtue of having unsaturated predicational constituents that are satisfied by things denoted by names.) On Quine’s view, we should replace

\[\lsquo (Fa \land Fb) \rightarrow \exists X(Xa \land Xb)\rsquo\]

with explicit first-order quantification over sets, as in

\[\lsquo (Fa \land Fb) \rightarrow\exists s (a \in s \land b \in s)\rsquo ;\]

where ‘\(\in\)’ stands for ‘is an element of’, and this second conditional is not a logical truth, but rather a hypothesis (to be evaluated holistically) concerning sets.

The preference for first-order regimentations has come to seem unwarranted, or at least highly tendentious; see Boolos (1998). But it fueled the idea that logical form can diverge wildly from grammatical form. For as students quickly learn, first-order regimentations of natural sentences often turn out to be highly artificial. (And in some cases, such regimentations seem to be unavailable.) This was, however, taken to show that natural languages are far from ideal for purposes of indicating logical structure.

A different strand of thought in analytic philosophy—pressed by Wittgenstein in Philosophical Investigations and developed by others, including Peter Strawson and John Austin—also suggested that a single sentence could be used (on different occasions) to express different kinds of propositions. Strawson (1950) argued that, pace Russell, a speaker could use an instance of ‘The F is G’ to express a singular proposition about a specific individual: namely, the F in the context at hand. According to Strawson, sentences themselves do not have truth conditions, since sentences (as opposed to speakers) do not express propositions; and speakers can use ‘The boy is tall’ to express a proposition with the contextually relevant boy as a constituent. Donnellan (1966) went on to argue that a speaker could even use an instance of ‘The F is G’ to express a singular proposition about an individual that isn’t an F; see the entry on reference. Such considerations, which have received a great deal of attention in recent discussions of context dependence, suggested that relations between natural language sentences and propositions are (at best) very complex and mediated by speakers’ intentions. All of which made it seem that such relations are far more tenuous than the pre-Fregean tradition suggested. This bolstered the Quine/Carnap idea that questions about the structure of premises and conclusions are really questions about how we should talk (when trying to describe the world), much as logic itself seems to be more concerned with how we should infer than with how we do infer. From this perspective, the connections between logic and grammar seemed rather shallow; see Iacona (2018) for extended discussion.

7. Notation and Restricted Quantification

On the other hand, more recent work on quantifiers suggests that the divergence had been exaggerated, in part because of how Frege’s idea of variable-binding was originally implemented. Consider again the proposition that some boy sang, and the proposed logical division into the quantifier and the rest: \(\exists x [\textrm{Boy}(x) \land \textrm{Sang}(x)]\); something is both a boy and an individual that sang. This is one way to regiment the English sentence. But one can also offer a logical paraphrase that more closely parallels the grammatical division between ‘some boy’ and ‘sang’: for some individual \(x\) such that \(x\) is a boy, \(x\) sang. One can formalize this paraphrase with restricted quantifiers, which incorporate a restriction on the domain over which the variable in question ranges. For example, ‘\(\exists x{:}B(x)\)’ can be an existential quantifier that binds a variable ranging over the boys in the relevant domain, with ‘\(\exists x{:}B(x) [S(x)]\)’ being true iff some boy sang. Since ‘\(\exists x{:}B(x) [S(x)]\)’ and ‘\(\exists x [B(x) \land S(x)]\)’ are logically equivalent, logic provides no reason for preferring the latter regimentation of the English sentence. And choosing the latter does not show that the proposition expressed with ‘Some boy sang’ has a structure that differs from the grammatical structure of the sentence.

Universal quantifiers can also be restricted, as in ‘\(\forall x{:}B(x) [S(x)]\)’, interpreted as follows: for every individual \(x\) such that \(x\) is a boy, \(x\) sang. Restrictors can also be logically complex, as in ‘Some boy from Canada sang’ or ‘Some boy who respects Mary sang’, rendered as ‘\(\exists x{:}B(x) \land F(x, c)[S(x)]\)’ and ‘\(\exists x{:}B(x) \land R(x, m)[S(x)]\)’. Given these representations, the inferential difference between ‘some boy sang’ and ‘every boy sang’ lies with the propositional contributions of ‘some’ and ‘every’ after all, and not partly with the contribution of connectives like ‘\(\land\)’ and ‘\(\rightarrow\)’.
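Over any finite domain, the logical equivalence of the restricted and unrestricted regimentations can be checked directly. The following Python sketch is illustrative only; the domain, the extensions, and the function names are assumptions introduced here, not part of the text.

```python
# Toy finite model: a domain of individuals plus predicate extensions (assumed for illustration).
domain = {"alice", "bob", "carl"}
boy = {"bob", "carl"}       # extension of B
sang = {"carl", "alice"}    # extension of S

def exists_restricted(restrictor, scope):
    """'∃x:B(x)[S(x)]' — the variable ranges only over the restrictor's extension."""
    return any(x in scope for x in restrictor)

def forall_restricted(restrictor, scope):
    """'∀x:B(x)[S(x)]' — every individual in the restrictor satisfies the scope."""
    return all(x in scope for x in restrictor)

def exists_unrestricted(domain, restrictor, scope):
    """'∃x[B(x) ∧ S(x)]' — the variable ranges over the whole domain."""
    return any(x in restrictor and x in scope for x in domain)

def forall_unrestricted(domain, restrictor, scope):
    """'∀x[B(x) → S(x)]' — the material-conditional regimentation."""
    return all((x not in restrictor) or (x in scope) for x in domain)

# The two regimentations agree (here, spot-checked on one model).
assert exists_restricted(boy, sang) == exists_unrestricted(domain, boy, sang)
assert forall_restricted(boy, sang) == forall_unrestricted(domain, boy, sang)
```

The point of the sketch is only that the choice between the two notations is not forced by the truth conditions, which coincide on every model.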

Words like ‘someone’, and the grammatical requirement that ‘every’ must be followed by a noun (or noun phrase), suggest that natural language employs restricted quantifiers. Phrases like ‘every boy’ are composed of a determiner and a noun. Correspondingly, one can think of determiners as expressions that can combine with an ordered pair of predicates to form a sentence, much as one can think of transitive verbs as expressions that can combine with an ordered pair of names to form a sentence. And this grammatical analogy, between determiners and transitive verbs, invites a semantic correlate.

Since ‘\(x\)’ and ‘\(y\)’ are variables ranging over individuals, one can say that the function indicated by the transitive verb ‘likes’ yields the value T given the ordered pair \(\langle x, y \rangle\) as argument if and only if \(x\) likes \(y\). In this notational scheme, ‘\(y\)’ corresponds to the direct object (or internal argument), which combines with the verb to form a phrase; ‘\(x\)’ corresponds to the grammatical subject (or external argument) of the verb. If we think about ‘every boy sang’ analogously, ‘boy’ is the internal argument of ‘every’, since ‘every boy’ is a phrase. By contrast, ‘boy’ and ‘sang’ do not form a phrase in ‘every boy sang’. So let us introduce ‘\(X\)’ and ‘\(Y\)’ as second-order variables ranging over functions, from individuals to truth values, stipulating that the extension of such a function is the set of things that the function maps onto the truth value T. Then one can say that the function indicated by ‘every’ yields the value T given the ordered pair \(\langle X, Y \rangle\) as argument iff the extension of \(X\) includes the extension of \(Y\). Similarly, one can say that the function indicated by ‘some’ maps the ordered pair \(\langle X, Y \rangle\) onto T iff the extension of \(X\) intersects with the extension of \(Y\).

Just as we can describe ‘likes’ as a predicate satisfied by ordered pairs \(\langle x, y \rangle\) such that \(x\) likes \(y\), so we can think about ‘every’ as a predicate satisfied by ordered pairs \(\langle X, Y \rangle\) such that the extension of \(X\) includes the extension of \(Y\). (This is compatible with thinking about ‘every boy’ as a restricted quantifier that combines with a predicate to form a sentence that is true iff every boy satisfies that predicate.) One virtue of this notational scheme is that it lets us represent relations between predicates that cannot be captured with ‘\(\forall\)’, ‘\(\exists\)’, and the sentential connectives; see Rescher (1962), Wiggins (1980). For example, most boys sang iff the boys who sang outnumber the boys who did not sing. So we can say that ‘most’ indicates a function that maps \(\langle X, Y \rangle\) to T iff the number of things that both \(Y\) and \(X\) map to T exceeds the number of things that \(Y\) but not \(X\) maps to T.
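The treatment of determiners as relations between extensions can be put directly in set terms. A minimal Python sketch, with \(Y\) the internal (noun) argument and \(X\) the external (predicate) argument as above; the particular extensions are illustrative assumptions:

```python
# Determiners as relations between two extensions:
# Y = extension of the internal (noun) argument, X = extension of the external (verb) argument.
def every(Y, X):
    """'Every Y X-ed' is true iff the extension of X includes the extension of Y."""
    return Y <= X   # subset test

def some(Y, X):
    """'Some Y X-ed' is true iff the extensions of X and Y intersect."""
    return bool(Y & X)

def most(Y, X):
    """'Most Y X-ed' is true iff the Ys that X-ed outnumber the Ys that did not."""
    return len(Y & X) > len(Y - X)

# Illustrative extensions (assumed): three boys, two of whom sang.
boy = {"al", "bo", "cy"}
sang = {"al", "bo"}

assert some(boy, sang)        # some boy sang
assert not every(boy, sang)   # not every boy sang
assert most(boy, sang)        # most boys sang (2 vs. 1)
```

As the text notes, ‘most’ is the interesting case: its counting clause cannot be paraphrased away using only ‘\(\forall\)’, ‘\(\exists\)’, and the sentential connectives.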

Using restricted quantifiers, and thinking about determiners as devices for indicating relations between functions, also suggests an alternative to Russell’s treatment of ‘the’. The formula

\[\lsquo \exists x \{B(x) \land \forall y[B(y) \rightarrow x = y] \land S(x)\}\rsquo\]

can be rewritten as ‘\(\exists x{:}B(x)[S(x)] \land |B| = 1\)’, interpreted as follows: for some individual \(x\) such that \(x\) is a boy, \(x\) sang; and the number of (relevant) boys is exactly one. On this view, ‘the boy’ still does not correspond to a constituent of the formalism; nor does ‘the’. But one can depart farther from Russell’s notation, while emphasizing his idea that ‘the’ is relevantly like ‘some’ and ‘every’. For one can analyze ‘the boy sang’ as ‘\(!x:\textrm{Boy}(x)[\textrm{Sang}(x)]\)’, specifying the propositional contribution of ‘\(!\)’—on a par with ‘\(\exists\)’ and ‘\(\forall\)’—as follows:

\[!x{:}Y(x)[X(x)] = \textbf{T} \text{ iff the extensions of } X \text{ and } Y \text{ intersect and } |Y| = 1.\]

This way of encoding Russell’s theory preserves his central claim. While there may be a certain boy that a speaker refers to in saying ‘The boy sang’, that boy is not a constituent of the quantificational proposition expressed with ‘\(!x:\textrm{Boy}(x)[\textrm{Sang}(x)]\)’; see Neale (1990) for discussion. But far from showing that the logical form of ‘The boy sang’ diverges dramatically from its grammatical form, the restricted quantifier notation suggests that the logical form closely parallels the grammatical form. For ‘the boy’ and ‘the’ do correspond to constituents of ‘\(!x:B(x)[S(x)]\)’, at least if we allow for logical forms that represent quantificational propositions in terms of second-order relations; see Montague (1970), and for discussion of relevant constraints on how such relations can be expressed with quantificational determiners, see Barwise and Cooper (1981), Higginbotham and May (1981), Keenan (1996), and the article on generalized quantifiers.

It is worth noting, briefly, a potential implication for inferences like ‘The boy sang, so some boy sang’. If the logical form of ‘The boy sang’ is

\[\lsquo\exists x{:}B(x) [S(x)] \land |B| = 1\rsquo ,\]

then the inference is an instance of the schema ‘\(\textbf{A} \land \textbf{B}\), so \(\textbf{A}\)’. But if the logical form of ‘The boy sang’ is simply ‘\(!x{:}B(x)[S(x)]\)’, the premise and conclusion have the same form, differing only by substitution of ‘\(!\)’ for ‘\(\exists\)’. In which case, the impeccability of the inference depends on the specific contributions of ‘the/\(!\)’ and ‘some/\(\exists\)’. Only when these contributions are “spelled out,” perhaps in terms of set-intersection, would the validity of the inference be manifest; see, e.g., King (2002). So even if grammar and logic do not diverge in this case, one might say that grammatical structure does not reveal the logical structure. From this perspective, further analysis of ‘the’ is required. Those who are skeptical of an analytic/synthetic distinction can say that it remains more a decision than a discovery to say that ‘Some boy sang’ follows from ‘The boy sang’. In general, and especially with regard to aspects of propositional form indicated with individual words, issues about logical form are connected with issues about the analytic-synthetic distinction.
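When the contributions of ‘the/\(!\)’ and ‘some/\(\exists\)’ are spelled out in set terms, the impeccability of the inference can be verified exhaustively over a small universe. The clauses and the three-element universe below are illustrative assumptions:

```python
from itertools import chain, combinations

def some(Y, X):
    """'Some Y X-ed': the extensions of X and Y intersect."""
    return bool(Y & X)

def the(Y, X):
    """Russellian '!': the extensions of X and Y intersect and |Y| = 1."""
    return bool(Y & X) and len(Y) == 1

# Check every pair of extensions over a tiny universe: wherever 'The Y X-ed'
# is true, 'Some Y X-ed' is true, since the uniqueness conjunct only strengthens
# the intersection conjunct.
individuals = ["a", "b", "c"]
subsets = [set(s) for s in chain.from_iterable(
    combinations(individuals, n) for n in range(len(individuals) + 1))]
for Y in subsets:
    for X in subsets:
        if the(Y, X):
            assert some(Y, X)   # the inference never fails in this universe
```

This is, of course, the sense in which validating ‘The boy sang, so some boy sang’ requires spelling out the determiners’ contributions rather than reading the result off the two formulae’s shared shape.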

8. Transformational Grammar

Even given restricted quantifiers (and acceptance of second-order logical forms), the subject/predicate structure of ‘Juliet / likes every doctor’ diverges from the corresponding formula below.

\[\forall y{:}\textrm{Doctor}(y) [\textrm{Likes}(\textrm{Juliet}, y)]\]

We can rewrite ‘\(\textrm{Likes}(\textrm{Juliet}, y)\)’ as ‘\([\textrm{Likes}(y)](\textrm{Juliet})\)’, to reflect the fact that ‘likes’ combines with a direct object to form a phrase, which in turn combines with a subject. But this does not affect the main point: ‘every’ seems to be a grammatical constituent of the verb phrase ‘likes every doctor’; yet it also seems to indicate the main quantifier of the expressed proposition. In natural language, ‘likes’ and ‘every doctor’ form a phrase. But with respect to logical form, it is as if ‘likes’ combines with ‘Juliet’ and a variable to form a complex predicate that is in turn an external argument of the higher-order predicate ‘every’. Similar remarks apply to ‘Some boy likes every doctor’ and

\[\lsquo[\exists x{:}\textrm{Boy}(x)][\forall y{:}\textrm{Doctor}(y)]\{\textrm{Likes}(x,y)\}\rsquo.\]

So it seems that mismatches remain in the very places that troubled medieval logicians—viz., quantificational direct objects and other examples of complex predicates with quantificational constituents.

Montague (1970, 1974) showed that these mismatches do not preclude systematic connections of natural language sentences with the corresponding propositional structures. Abstracting from the technical details, one can specify an algorithm that pairs each natural language sentence that contains one or more quantificational expressions like ‘every doctor’ with one or more Fregean logical forms. This was a significant advance. Together with subsequent developments, Montague’s work showed that Frege’s logic was compatible with the idea that quantificational constructions in natural language have a systematic semantics. Indeed, one can use Frege’s formal apparatus to study such constructions. Montague maintained that the syntax of natural language was, nonetheless, misleading for purposes of (what he took to be) real semantics. On this view, the study of valid inference still suggests that grammar disguises the structure of propositional thought. But in thinking about the relation of logic to grammar, one should not assume a naive conception of the latter.

For example, the grammatical form of a sentence need not be determined by the linear order of its words. Using brackets to disambiguate, we can distinguish the sentence ‘Mary [saw [the [boy [with binoculars]]]]’—whose direct object is ‘the boy with binoculars’—from the homophonous sentence ‘Mary [[saw [the boy]] [with binoculars]]’, in which ‘saw the boy’ is modified by an adverbial phrase. The first implies that the boy had binoculars, while the second implies that Mary used binoculars to see the boy. Even if this distinction is not audibly marked, there is a significant difference between modifying a noun (like ‘boy’) with a prepositional phrase and modifying a verb phrase (‘saw the boy’). More generally, grammatical structure need not be obvious. Just as it may take work to discover the kind(s) of structure that propositions exhibit, so it may take work to discover the kind(s) of structure that sentences exhibit. And many studies of natural language suggest a rich conception of grammatical form that diverges from traditional views; see especially Chomsky (1957, 1964, 1965, 1981, 1986, 1995). So we need to ask how logical forms are related to actual grammatical forms, which linguists try to discover, since these may differ importantly from any hypothesized grammatical forms that may be suggested by casual reflection on spoken language. Appearances may be misleading with respect to both grammatical and logical form, leaving room for the possibility that these notions of structure are not so different after all.

A leading idea of modern linguistics is that at least some grammatical structures are transformations of others. Put another way, linguistic expressions often appear to be displaced from positions canonically associated with certain grammatical relations that the expressions exhibit. For example, the word ‘who’ in (17) is apparently associated with the internal (direct object) argument position of the verb ‘saw’.

(17)
Mary wondered who John saw

Correspondingly, (17) can be glossed as ‘Mary wondered which person is such that John saw that person’. This invites the hypothesis that (17) reflects a transformation of the “Deep Structure” (17D) into the “Surface Structure” (17S); where the subscripts indicate that ‘who’ bears a certain grammatical relation, often called “movement,” to the coindexed position.

(17D)
{Mary [wondered {John [saw who]}]}
(17S)
{Mary [wondered [whoi {John [saw ( _ )i ]}]]}

The idea is that the embedded clause in (17D) has the same form as ‘John saw Bill’, but in (17S), ‘who’ has been displaced from its original argument position. Similar remarks apply to the question ‘Who did John see’ and other question-words like ‘why’, ‘what’, ‘when’, and ‘how’.

One might also try to explain the synonymy of (18) and (19) by positing a common deep structure, (18D).

(18)
John seems to like Mary
(19)
It seems John likes Mary
(18D)
[Seems {John [likes Mary]}]
(18S)
{Johni [seems { ( _ )i [to like Mary]}]}

If every English sentence needs a subject of some kind, (18D) must be modified: either by displacing ‘John’, as in (18S); or by inserting a pleonastic subject, as in (19). Note that in (19), ‘It’ does not indicate an argument; compare ‘There’ in ‘There is something in the garden’. Appeal to displacement also lets one distinguish the superficially parallel sentences (20) and (21).

(20)
John is easy to please
(21)
John is eager to please

If (20) is true, John is easily pleased. In which case, it is easy (for someone) to please John; and here, ‘it’ is pleonastic. But if (21) is true, John is eager that he please someone or other. This asymmetry is effaced by representations like ‘Easy-to-please(John)’ and ‘Eager-to-please(John)’. The contrast is made manifest, however, with (20S) and (21S); where ‘e’ indicates an unpronounced argument position.

(20S)
{Johni [is easy { e [to please (_ )i ]}]}
(21S)
{Johni [is eager { ( _)i [to please e ]}]}

It may be that in (21S), which does not mean that it is eager for John to please someone, ‘John’ is grammatically linked to the coindexed position without being displaced from that position. But whatever the details, the “surface subject” of a sentence can be the object of a verb embedded within the main predicate, as in (20S). Of course, such hypotheses about grammatical structure require defense. But Chomsky and others have long argued that such hypotheses are needed to account for various facts concerning human linguistic capacities; see, e.g., Berwick et al. (2011). As an illustration of the kind of data that is relevant, note that while (22–24) are perfectly fine as expressions of English, (25) is not.

(22)
The boy who sang was happy
(23)
Was the boy who sang happy
(24)
The boy who was happy sang
(25)
*Was the boy who happy sang

This suggests that the auxiliary verb ‘was’ can be displaced from some positions but not others. That is, while (23S) is a permissible transformation of (22D), (25S) is not a permissible transformation of (24D).

(22D)
{[The [boy [who sang]]] [was happy]}
(23S)
Wasi {[the [boy [who sang]]] [( _ )i happy]}
(24D)
{[The [boy [who [was happy]]]] sang}
(25S)
*Wasi {[the [boy [who [ ( _)i happy]]]] sang}

In (25), the asterisk indicates intuitive deviance; in (25S), it indicates the hypothesized source of this deviance—viz., that the auxiliary verb cannot be displaced from the embedded relative clause. The ill-formedness of (25S) is striking, since one can sensibly ask whether or not the boy who was happy sang. One can also ask whether or not (26) is true. But (27) is not the yes/no question corresponding to (26).

(26)
The boy who was lost kept crying
(27)
Was the boy who lost kept crying

Rather, (27) is the yes/no question corresponding to ‘The boy who lost was kept crying’, which has an unexpected meaning. So we want some account of why (27) cannot have the interpretation corresponding to (26). But this “negative fact” concerning (27) is precisely what one would expect if ‘was’ cannot be displaced from its position in (26), as in the following logically possible but grammatically illicit structure: *Wasi {[the [boy [who [( _ )i lost]]]] [kept crying]}.

By contrast, if we merely specify an algorithm that associates (27) with its actual meaning—or if we merely hypothesize that (27) is the English translation of a certain mental sentence—we have not yet explained why (27) cannot also be used to ask whether or not (26) is true. Explanations of such facts appeal to nonobvious grammatical structure, and constraints on natural language transformations. (For example, an auxiliary verb in a relative clause cannot be “fronted,” though of course, theorists try to find deeper explanations for such constraints.)

The idea was that a sentence has both a deep structure (DS), which reflects semantically relevant relations between verbs and their arguments, and a surface structure (SS) that may include displaced (or pleonastic) elements. In some cases, pronunciation might depend on further transformations of SS, resulting in a distinct “phonological form” (PF). Linguists posited various constraints on these levels of grammatical structure, and the transformations that relate them. But as the theory was elaborated and refined under empirical pressure, various facts that apparently called for explanation in these terms still went unexplained. This suggested another level of grammatical structure, obtained by a different kind of transformation on SS. The hypothesized level was called ‘LF’, intimating ‘Logical Form’; and the hypothesized transformation—called ‘Quantifier Raising’ because it targeted the kinds of expressions that indicate (restricted) quantifiers—mapped structures like (28S) onto structures like (28L).

(28S)
{Juliet [likes [every doctor]]}
(28L)
{[every doctor]i {Juliet [likes ( _ )i ]}}

Clearly, (28L) does not reflect the pronounced word order in English. But the idea was that PF determines pronunciation, while LF was said to be the level at which the scope of a natural language quantifier is determined; see May (1985). If we think about ‘every’ as a kind of second-order transitive predicate, which can combine with two predicates like ‘doctor’ and ‘Juliet likes ( _ )i’ to form a complete sentence, we should expect that at some level of analysis, the sentence ‘Juliet likes every doctor’ has the structure indicated in (28L). And mapping (28L) to the Fregean logical form

\[\lsquo [\forall x{:}\textrm{Doctor}(x)]\{\textrm{Likes}(\textrm{Juliet},x)\}\rsquo\]

is trivial. Similarly, consider the following:

(29S)
{[some boy] [likes [every doctor]]}
(29L)
{[some boy]i {[every doctor]j {( _ )i [likes ( _ )j ]}}}
(29L′)
{[every doctor]j {[some boy]i {( _ )i [likes ( _ )j ]}}}

If the surface structure (29S) can be mapped onto either (29L) or (29L′), then (29S) can be easily mapped onto the Fregean logical forms

\[\lsquo[\exists x{:}\textrm{Boy}(x)][\forall y{:}\textrm{Doctor}(y)]\{\textrm{Likes}(x,y)\}\rsquo\]

and

\[\lsquo [\forall y{:}\textrm{Doctor}(y)][\exists x{:} \textrm{Boy}(x)]\{\textrm{Likes}(x,y)\}\rsquo .\]

This assimilates quantifier scope ambiguity to the structural ambiguity of examples like ‘Juliet saw the boy with binoculars’. More generally, many apparent examples of grammar/logic mismatches were rediagnosed as mismatches between different aspects of grammatical structure—between those aspects that determine pronunciation, and those that determine interpretation. In one sense, this is fully in keeping with the idea that in natural language, “surface appearances” are often misleading with regard to propositional structure. But it also makes room for the idea that grammatical structure and logical structure converge, in ways that can be discovered through investigation, once we move beyond traditional subject-predicate conceptions of structure with regard to both logic and grammar.

There is independent evidence for “covert” transformations—displacement of expressions from their audible positions, as in (28L); see Huang (1995), Hornstein (1995). Consider ‘Jean a vu qui’, which is the French translation of ‘Who did John see’. If we assume that ‘qui’ (‘who’) is displaced at LF, then we can explain why the question-word is understood in both French and English like a quantifier binding a variable: which person \(x\) is such that John saw \(x\)? Similarly, example (30) from Chinese is transliterated as in (31).

(30)
Zhangsan zhidao Lisi mai-te sheme
(31)
Zhangsan know Lisi bought what

But (30) is ambiguous, between the interrogative (31a) and the complex declarative (31b).

(31a)
Which thing is such that Zhangsan knows Lisi bought it
(31b)
Zhangsan knows which thing (is such that) Lisi bought (it)

This suggests covert displacement of the quantificational question-word in Chinese; see Huang (1982, 1995). Chomsky (1981) also argued that the constraints on such displacement can help explain contrasts like the one illustrated with (32) and (33).

(32)
Who said he has the best smile
(33)
Who did he say has the best smile

In (32), the pronoun ‘he’ can have a bound-variable reading: which person \(x\) is such that \(x\) said that \(x\) has the best smile. This suggests that the following grammatical structure is possible: Whoi {[( _ )i said [hei has the best smile]]}. But (33) cannot be used to ask this question, suggesting that some linguistic constraint rules out the following logically possible structure: *Whoi {did [hei say [( _ )i has the best smile]]}. And there cannot be constraints on transformations without transformations. So if English overtly displaces question-words that are covertly displaced in other languages, we should not be too surprised if English covertly displaces other quantificational expressions like ‘every doctor’. Likewise, (34) has the reading indicated in (34a) but not the reading indicated in (34b).

(34)
It is false that Juliet likes every doctor
(34a)
\(\neg \forall x{:}\textrm{Doctor}(x) [\textrm{Likes}(\textrm{Juliet}, x)]\)
(34b)
\(\forall x{:}\textrm{Doctor}(x) \neg [\textrm{Likes}(\textrm{Juliet}, x)]\)

This suggests that ‘every doctor’ gets displaced, but only so far. Similarly, (13) cannot mean that every doctor is such that no patient who saw that doctor is healthy.

(13)
No patient who saw every doctor is healthy

As we have already seen, English seems to abhor fronting certain elements from within an embedded relative clause. This invites the hypothesis that Quantifier Raising is subject to a similar constraint, and hence, that many quantificational expressions get displaced in English. This hypothesis is not uncontroversial; see, e.g., Jacobson (1999). But many linguists (following Chomsky [1995, 2000]) would now posit only two levels of grammatical structure, corresponding to PF and LF—the thought being that constraints on DS and SS can be eschewed in favor of a simpler theory that only posits constraints on how expressions can be combined in the course of constructing complex expressions that can be pronounced and interpreted. If this development of earlier theories proves correct, then the only semantically relevant level of grammatical structure often reflects covert displacement of audible expressions; see, e.g., Hornstein (1995). In any case, there is a large body of work suggesting that many logical properties of quantifiers, names, and pronouns are reflected in properties of LF.

For example, if (35) is true, it follows that some doctor treated some doctor; whereas (36) does not have this consequence:

(35)
Every boy saw the doctor who treated himself
(36)
Every boy saw the doctor who treated him

The meanings of (35) and (36) seem to be roughly as indicated in (35a) and (36a); where ‘\(!\)’ indicates the contribution of ‘the’.

(35a)
\([\forall x{:}\textrm{Boy}(x)][!y{:}\textrm{Doctor}(y) \land \textrm{Treated}(y,y)] \{\textrm{Saw}(x,y)\}\)
(36a)
\([\forall x{:}\textrm{Boy}(x)][!y{:}\textrm{Doctor}(y) \land \textrm{Treated}(y,x)] \{\textrm{Saw}(x,y)\}\)

This suggests that ‘himself’ is behaving like a variable bound by ‘the doctor’, while ‘every boy’ can bind ‘him’. And there are independent grammatical reasons for saying that ‘himself’ must be linked to ‘the doctor’, while ‘him’ must not be so linked. Note that in ‘Pat thinks Chris treated himself/him’, the antecedent of ‘himself’ must be the subject of ‘treated’, while the antecedent of ‘him’ must not be; see Chomsky (1981).

We still need to enforce the conceptual distinction between LF and the traditional notion of logical form. There is no guarantee that structural features of natural language sentences will mirror the logical features of propositions; cp. Stanley (2000), King (2007). But this leaves room for the empirical hypothesis that LF reflects at least a great deal of propositional structure; see Harman (1972), Higginbotham (1986), Segal (1989), Larson and Ludlow (1993), and the essay on structured propositions. Moreover, even if the LF of a sentence S underdetermines the logical form of the proposition a speaker expresses with S (on a given occasion of use), the LF may provide a “scaffolding” that can be elaborated in particular contexts, with little or no mismatch between grammatical and propositional architecture. If some such view is correct, it might avoid certain (unpleasant) questions prompted by earlier Fregean views: how can a sentence be used to express a proposition with a radically different structure; and if grammar is deeply misleading, why think that our intuitions concerning impeccability provide reliable evidence about which propositions follow from which? These are, however, issues that remain unsettled.

9. Semantic Structure and Events

If propositions are the “things” that really have logical form, and sentences of English are not themselves propositions, then sentences of English “have” logical forms only by association with propositions. But if the meaning of a sentence is some proposition—or perhaps a function from contexts to propositions—then one might say that the logical form “of” a sentence is its semantic structure (i.e., the structure of that sentence’s meaning). Alternatively, one might suspect that in the end, talk of propositions is just convenient shorthand for talking about the semantic properties of certain sentences: perhaps sentences of a Begriffsschrift, or sentences of mentalese, or sentences of natural languages (abstracting away from their logically/semantically irrelevant properties). In any case, the notion of logical form has played a significant role in recent work on theories of meaning for natural languages. So an introductory discussion of logical form would not be complete without some hint of why such work is relevant, especially since attending to details of natural languages (as opposed to languages invented to study the foundations of arithmetic) led to renewed discussion of how to represent propositions that involve relations.

Prima facie, ‘Every old patient respects some doctor’ and ‘Some young politician likes every liar’ exhibit common modes of linguistic combination. So a natural hypothesis is that the meaning of each sentence is fixed by these modes of combination, given the relevant word meanings. It may be hard to see how this hypothesis could be true if there are widespread mismatches between logical and grammatical form. But it is also hard to see how the hypothesis could be false. Children, who have finite cognitive resources, typically acquire the capacity to understand the endlessly many expressions of the languages spoken around them. A great deal of recent work has focussed on these issues, concerning the connections between logical form and the senses in which natural languages are semantically compositional.

It was implicit in Frege that each of the endlessly many sentences of an ideal language would have a compositionally determined truth-condition. Frege did not actually specify an algorithm that would associate each sentence of his Begriffsschrift with its truth-condition. But Tarski (1933) showed how to do this for the first-order predicate calculus, focussing on interesting cases of multiple quantification like the one shown below:

\[\begin{align}\forall x &[\textrm{Number}(x) \:\rightarrow \\ & \exists y [\textrm{SuccessorOf}(y, x) \land \forall z [\textrm{SuccessorOf}(z, x) \rightarrow z = y]]]\end{align}\]

This made it possible to capture, with precision, the idea that an inference is valid in the predicate calculus iff: every interpretation that makes the premises true also makes the conclusion true, holding fixed the interpretations of logical elements like ‘if’ and ‘every’. Davidson (1967a) conjectured that one could do for English what Tarski did for the predicate calculus; and Montague, similarly inspired by Tarski, showed how one could start dealing with predicates that have quantificational constituents. Still, many apparent objections to the conjecture remained. As noted at the end of section four, sentences like ‘Pat thinks that Hesperus is Phosphorus’ present difficulties; though Davidson (1968) offered an influential suggestion. Davidson’s (1967b) proposal concerning examples like (37–40) also proved enormously fruitful.
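Tarski’s strategy—defining truth compositionally, clause by clause, via satisfaction relative to a variable assignment—can be sketched for a small first-order fragment. The tuple encoding of formulae and the three-element model below are illustrative assumptions, not Tarski’s own notation:

```python
# A toy Tarski-style satisfaction relation for a first-order fragment.
# Formulae are nested tuples: ("pred", name, *vars), ("and", p, q),
# ("if", p, q), ("exists", var, p), ("forall", var, p).
def satisfies(model, assignment, formula):
    op = formula[0]
    if op == "pred":
        _, name, *vars_ = formula
        return tuple(assignment[v] for v in vars_) in model["preds"][name]
    if op == "and":
        return satisfies(model, assignment, formula[1]) and satisfies(model, assignment, formula[2])
    if op == "if":
        return (not satisfies(model, assignment, formula[1])) or satisfies(model, assignment, formula[2])
    if op in ("exists", "forall"):
        _, var, body = formula
        results = (satisfies(model, {**assignment, var: d}, body) for d in model["domain"])
        return any(results) if op == "exists" else all(results)
    raise ValueError(op)

# A small model in which 'every number has a successor' comes out true
# (successorhood wraps around, so the universal claim holds on this domain).
model = {
    "domain": {0, 1, 2},
    "preds": {
        "Number": {(0,), (1,), (2,)},
        "SuccessorOf": {(1, 0), (2, 1), (0, 2)},
    },
}
# ∀x[Number(x) → ∃y SuccessorOf(y, x)]
phi = ("forall", "x", ("if", ("pred", "Number", "x"),
                       ("exists", "y", ("pred", "SuccessorOf", "y", "x"))))
assert satisfies(model, {}, phi)
```

Validity is then definable over such evaluators: an inference is valid iff no model (and assignment) satisfies the premises without satisfying the conclusion.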

(37)
Juliet kissed Romeo quickly at midnight.
(38)
Juliet kissed Romeo quickly.
(39)
Juliet kissed Romeo at midnight.
(40)
Juliet kissed Romeo.

If (37) is true, so are (38–40); and if (38) or (39) is true, so is (40). The inferences seem impeccable. But the function-argument structures are not obvious. If we represent ‘kissed quickly at midnight’ as an unstructured predicate that takes two arguments, like ‘kissed’ or ‘kicked’, we will represent the inference from (37) to (40) as having the form: \(K^*(x, y)\); so \(K(x, y)\). But this form is exemplified by the bad inference ‘Juliet kicked Romeo; so Juliet kissed Romeo’. Put another way, if ‘kissed quickly at midnight’ is a logically unstructured binary predicate, then the following conditional is a nonlogical assumption: if Juliet kissed Romeo in a certain manner at a certain time, then Juliet kissed Romeo. But this conditional seems like a tautology, not an assumption that introduces any epistemic risk. Davidson concluded that the surface appearances of sentences like (37–40) mask relevant semantic structure. In particular, he proposed that such sentences are understood in terms of quantification over events.

According to Davidson, who echoed Ramsey (1927), the meaning of (40) is reflected in the paraphrase ‘There was a kissing of Romeo by Juliet’. One can formalize this proposal in various ways, with different implications for how verbs like ‘kiss’ are related to propositional constituents: \(\exists e [\textrm{Past}(e) \land \textrm{KissingOf}(e, \textrm{Romeo}) \land \textrm{KissingBy}(e, \textrm{Juliet})]\); or \(\exists e [\textrm{Past}(e) \land \textrm{KissingByOf}(e, \textrm{Juliet}, \textrm{Romeo})]\); or as in (40a), with Juliet and Romeo explicitly represented as players of certain roles in an event.

(40a)
\(\exists e [\textrm{Agent}(e, \textrm{Juliet}) \land \textrm{Kissing}(e) \land \textrm{Patient}(e, \textrm{Romeo})]\)

But given any such representation, adverbs like ‘quickly’ and ‘at midnight’ can be analyzed as additional predicates of events, as shown in (37a–39a).

(37a)
\(\exists e [\textrm{Agent}(e, \textrm{Juliet}) \land \textrm{Kissing}(e) \land \textrm{Patient}(e, \textrm{Romeo})\:\land\) \(\textrm{Quick}(e) \land \textrm{At-midnight}(e)]\)
(38a)
\(\exists e [\textrm{Agent}(e, \textrm{Juliet}) \land \textrm{Kissing}(e) \land \textrm{Patient}(e, \textrm{Romeo})\:\land\) \(\textrm{Quick}(e)]\)
(39a)
\(\exists e [\textrm{Agent}(e, \textrm{Juliet}) \land \textrm{Kissing}(e) \land \textrm{Patient}(e, \textrm{Romeo})\:\land\) \(\textrm{At-midnight}(e)]\)

If this is correct, then the inference from (37) to (40) is an instance of the following valid form: \(\exists e [\ldots e \ldots \land Q(e) \land A(e)]\); hence, \(\exists e [\ldots e \ldots]\). The other impeccable inferences involving (37–40) can likewise be viewed as instances of conjunction reduction in the scope of an existential quantifier; see Pietroski (2018) for discussion that connects this point to medieval insights noted in section three. If the grammatical form of (40) is simply ‘{Juliet [kissed Romeo]}’, then the mapping from grammatical to logical form is not transparent; and natural language is misleading, in that no word corresponds to the event quantifier. But this does not posit a significant structural mismatch between grammatical and logical form. On the contrary, each word in (40) corresponds to a conjunct in (40a). This suggests a strategy for thinking about how the meaning of a sentence like (40) might be composed from the meanings of the constituent words. A growing body of literature, in philosophy and linguistics, suggests that Davidson’s proposal captures an important feature of natural language semantics, and that “event analyses” provide a useful framework for discussions of logical form; see, e.g., Schein (2017) for extended discussion and many references.
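The logic of conjunction reduction under an existential quantifier can be made vivid with a small sketch. Here each event analysis is represented as the set of conjuncts under ‘∃e’, as in (37a)–(40a); the predicate strings and the function name are informal stand-ins invented for this illustration. Since any event witnessing the premise also witnesses the conclusion whenever the conclusion’s conjuncts are a subset of the premise’s, entailment reduces to a subset check.

```python
# Davidson-style event analyses as sets of conjuncts, with conjunction
# reduction modeled as a subset relation. (Predicate strings are informal
# renderings of the formulas (37a)-(40a) above.)

S37 = {'Agent(e, Juliet)', 'Kissing(e)', 'Patient(e, Romeo)',
       'Quick(e)', 'At-midnight(e)'}                       # (37a)
S38 = {'Agent(e, Juliet)', 'Kissing(e)', 'Patient(e, Romeo)',
       'Quick(e)'}                                         # (38a)
S39 = {'Agent(e, Juliet)', 'Kissing(e)', 'Patient(e, Romeo)',
       'At-midnight(e)'}                                   # (39a)
S40 = {'Agent(e, Juliet)', 'Kissing(e)', 'Patient(e, Romeo)'}  # (40a)

def entails_by_conjunction_reduction(premise, conclusion):
    """∃e[C1 ∧ ... ∧ Cn] entails ∃e[any subset of the Ci]:
    an event witnessing the premise witnesses the conclusion."""
    return conclusion <= premise

print(entails_by_conjunction_reduction(S37, S40))  # True: (37) entails (40)
print(entails_by_conjunction_reduction(S38, S40))  # True: (38) entails (40)
print(entails_by_conjunction_reduction(S40, S37))  # False: (40) doesn't entail (37)
```

The failed direction mirrors the point in the text: dropping conjuncts is risk-free, but adding them (inferring ‘quickly at midnight’ from the bare report) is not.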

In one sense, it is an ancient idea that action reports like (40) represent individuals as participating in events; see Gillon’s (2007) discussion of Pāṇini’s grammar of Sanskrit. But if (40) can be glossed as ‘Juliet did some kissing, and Romeo was thereby kissed’, perhaps the ancient idea can be deployed in developing Leibniz’ suggestion that relational sentences like (40) somehow contain simpler active-voice and passive-voice sentences; cp. Kratzer (1996). And perhaps appeals to quantifier raising can help in defending the idea that ‘Juliet kissed some/the/every boy’ is, after all, a sentence that exhibits Subject-copula-Predicate form: ‘[some/the/every boy]_i is P’, with ‘P’ as a complex predicate akin to ‘[some event]_e was both a kissing done by Juliet and one in which he_i was kissed’.

With this in mind, let’s return to the idea that each complex expression of natural language has semantic properties that are determined by (i) the semantic properties of its constituents, and (ii) the ways in which these constituents are grammatically arranged. If this is correct, then following Davidson, one might say that the logical forms of expressions (of some natural language) just are the structures that determine the corresponding meanings, given the relevant word meanings; see Lepore and Ludwig (2002). In that case, the phenomenon of valid inference may be largely a by-product of semantic compositionality. If principles governing the meanings of (37–40) have the consequence that (40) is true iff an existential claim like (40a) is true, perhaps this is illustrative of the general case. Given a sentence of some natural language NL, the task of specifying its logical form may be inseparable from the task of providing a compositional specification of what the sentences of NL mean.
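The compositional picture just described can be sketched in miniature. In the toy semantics below, word meanings are predicates of events and grammatical combination is interpreted as conjunction under an existential event quantifier, so that each word contributes a conjunct to the logical form. The mini-lexicon and the rendering function are invented for illustration; this is a sketch of the Davidsonian strategy, not a serious fragment of English.

```python
# A toy compositional semantics in the Davidsonian spirit: each lexical
# item contributes one or more event predicates, and syntactic combination
# is interpreted as conjunction in the scope of '∃e'. (The lexicon entries
# and the function 'interpret' are hypothetical names for this sketch.)

lexicon = {
    'kissed':      lambda subj, obj: [f'Agent(e, {subj})', 'Kissing(e)',
                                      f'Patient(e, {obj})'],
    'quickly':     lambda: ['Quick(e)'],
    'at midnight': lambda: ['At-midnight(e)'],
}

def interpret(subject, verb, obj, *adverbials):
    """Map a simple clause to its event-style logical form."""
    conjuncts = lexicon[verb](subject, obj)
    for adv in adverbials:
        conjuncts += lexicon[adv]()          # adverbs add conjuncts
    return '∃e[' + ' ∧ '.join(conjuncts) + ']'

print(interpret('Juliet', 'kissed', 'Romeo'))
# ∃e[Agent(e, Juliet) ∧ Kissing(e) ∧ Patient(e, Romeo)]
print(interpret('Juliet', 'kissed', 'Romeo', 'quickly', 'at midnight'))
# ∃e[Agent(e, Juliet) ∧ Kissing(e) ∧ Patient(e, Romeo) ∧ Quick(e) ∧ At-midnight(e)]
```

Adding an adverbial never removes a conjunct, which is why, on this picture, the modified report entails the unmodified one.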

10. Further Questions

At this point, many issues become relevant to further discussions of logical form. Most obviously, there are questions concerning particular examples. Given just about any sentence of natural language, one can ask interesting questions (that remain unsettled) about its logical form. There are also very abstract questions about the relation of semantics to logic. Should we follow Davidson and Montague, among others, in characterizing theories of meaning for natural languages as theories of truth (that perhaps satisfy certain conditions on learnability)? Is an algorithm that correctly associates sentences with truth-conditions (relative to contexts) necessary and/or sufficient for being an adequate theory of meaning? What should we say about the paradoxes apparently engendered by sentences like ‘This sentence is false’? If we allow for second-order logical forms, how should we understand second-order quantification, given Russell’s Paradox? Are claims about the “semantic structure” of a sentence fundamentally descriptive claims about speakers (or their communities, or their languages)? Or is there an important sense in which claims about semantic structure are normative claims about how we should use language? Are facts about the acquisition of language germane to hypotheses about logical form? And of course, the history of the subject reveals that the answers to the central questions are by no means obvious: what is logical structure, what is grammatical structure, and how are they related? Or put another way, what kinds of structures do propositions and sentences exhibit, and how do thinkers/speakers relate them?

Bibliography

Cited Works

  • Barwise, J. & Cooper, R., 1981, “Generalized Quantifiers and Natural Language”, Linguistics and Philosophy, 4: 159–219.
  • Beaney, M., ed., 1997, The Frege Reader, Oxford: Blackwell.
  • Berwick, R. et al., 2011, “Poverty of the Stimulus Revisited”, Cognitive Science, 35: 1207–42.
  • Boolos, G., 1998, Logic, Logic, and Logic, Cambridge, MA: Harvard University Press.
  • Carnap, R., 1950, “Empiricism, Semantics, and Ontology”, reprinted in R. Carnap, Meaning and Necessity, second edition, Chicago: University of Chicago Press, 1956.
  • Cartwright, R., 1962, “Propositions”, in R. J. Butler, Analytical Philosophy, 1st series, Oxford: Basil Blackwell, 1962; reprinted with addenda in Richard Cartwright, Philosophical Essays, Cambridge, MA: MIT Press, 1987.
  • Chomsky, N., 1957, Syntactic Structures, The Hague: Mouton.
  • –––, 1964, Current Issues in Linguistic Theory, The Hague: Mouton.
  • –––, 1965, Aspects of the Theory of Syntax, Cambridge, MA: MIT Press.
  • –––, 1981, Lectures on Government and Binding, Dordrecht: Foris.
  • –––, 1986, Knowledge of Language, New York: Praeger.
  • –––, 1995, The Minimalist Program, Cambridge, MA: MIT Press.
  • Davidson, D., 1967a, “Truth and Meaning”, Synthese, 17: 304–23.
  • –––, 1967b, “The Logical Form of Action Sentences”, in N. Rescher (ed.), The Logic of Decision and Action, Pittsburgh: University of Pittsburgh Press.
  • –––, 1968, “On Saying That”, Synthese, 19: 130–46.
  • –––, 1980, Essays on Actions and Events, Oxford: Oxford University Press.
  • –––, 1984, Inquiries into Truth and Interpretation, Oxford: Oxford University Press.
  • Donnellan, K., 1966, “Reference and Definite Descriptions”, Philosophical Review, 75: 281–304.
  • Fodor, J., 1978, “Propositional Attitudes”, The Monist, 61: 501–23.
  • Frege, G., 1879, Begriffsschrift, reprinted in Beaney 1997.
  • –––, 1884, Die Grundlagen der Arithmetik, Breslau: Wilhelm Koebner. English translation, The Foundations of Arithmetic, J. L. Austin (trans.), Oxford: Basil Blackwell, 1974.
  • –––, 1891, “Function and Concept”, reprinted in Beaney 1997.
  • –––, 1892, “On Sinn and Bedeutung”, reprinted in Beaney 1997.
  • Gillon, B., 2007, “Pāṇini’s Aṣṭādhyāyī and Linguistic Theory”, Journal of Indian Philosophy, 35: 445–468.
  • Harman, G., 1972, “Logical Form”, Foundations of Language, 9: 38–65.
  • –––, 1973, Thought, Princeton: Princeton University Press.
  • Higginbotham, J., 1986, “Linguistic Theory and Davidson’s Program in Semantics”, in E. Lepore (ed.), Truth and Interpretation, Oxford: Blackwell, pp. 29–48.
  • Higginbotham, J. & May, R., 1981, “Questions, Quantifiers, and Crossing”, Linguistic Review, 1: 47–79.
  • Hornstein, N., 1995, Logical Form: From GB to Minimalism, Oxford: Blackwell.
  • Huang, J., 1995, “Logical Form”, in G. Webelhuth (ed.), Government and Binding Theory and the Minimalist Program: Principles and Parameters in Syntactic Theory, Oxford: Blackwell, pp. 127–175.
  • Iacona, A., 2018, Logical Form: Between Logic and Natural Language, Berlin: Springer.
  • Jacobson, P., 1999, “Variable Free Semantics”, Linguistics and Philosophy, 22: 117–84.
  • King, J., 2002, “Two Sorts of Claims about Logical Form”, in Preyer and Peter 2002.
  • –––, 2007, The Nature and Structure of Content, Oxford: Oxford University Press.
  • Keenan, E., 1996, “The Semantics of Determiners”, in S. Lappin (ed.), The Handbook of Contemporary Semantic Theory, Oxford: Blackwell, pp. 41–63.
  • Kratzer, A., 1996, “Severing the External Argument from its Verb”, in J. Rooryck and L. Zaring (eds.), Phrase Structure and the Lexicon, Dordrecht: Kluwer, pp. 109–137.
  • Larson, R. and Ludlow, P., 1993, “Interpreted Logical Forms”, Synthese, 95: 305–55.
  • Lepore, E. and Ludwig, K., 2002, “What is Logical Form?”, in Preyer and Peter 2002, pp. 54–90.
  • Ludlow, P., 2002, “LF and Natural Logic”, in Preyer and Peter 2002, pp. 132–168.
  • May, R., 1985, Logical Form: Its Structure and Derivation, Cambridge, MA: MIT Press.
  • Montague, R., 1970, “English as a Formal Language”, in R. Thomason (ed.), Formal Philosophy, New Haven, CT: Yale University Press, 1974, pp. 7–27.
  • Parsons, T., 2014, Articulating Medieval Logic, Oxford: Oxford University Press.
  • Pietroski, P., 2018, Conjoining Meanings, Oxford: Oxford University Press.
  • Preyer, G. and Peter, G. (eds.), 2002, Logical Form and Language, Oxford: Oxford University Press.
  • Quine, W.V.O., 1950, Methods of Logic, New York: Henry Holt.
  • –––, 1951, “Two Dogmas of Empiricism”, Philosophical Review, 60: 20–43.
  • –––, 1953, “On What There Is”, in From a Logical Point of View, Cambridge, MA: Harvard University Press, pp. 1–19.
  • –––, 1960, Word and Object, Cambridge, MA: MIT Press.
  • –––, 1970, Philosophy of Logic, Englewood Cliffs, NJ: Prentice Hall.
  • Ramsey, F., 1927, “Facts and Propositions”, Proceedings of the Aristotelian Society (Supplementary Volume), 7: 153–170.
  • Sánchez, V., 1991, Studies on Natural Logic and Categorial Grammar, Ph.D. Thesis, University of Amsterdam.
  • –––, 1994, “Monotonicity in Medieval Logic”, Language and Cognition, 4: 161–74.
  • Schein, B., 1993, Events and Plurals, Cambridge, MA: MIT Press.
  • –––, 2017, And: Conjunction Reduction Redux, Cambridge, MA: MIT Press.
  • Segal, G., 1989, “A Preference for Sense and Reference”, The Journal of Philosophy, 86: 73–89.
  • Soames, S., 1987, “Direct Reference, Propositional Attitudes, and Semantic Content”, Philosophical Topics, 15: 47–87.
  • –––, 1995, “Beyond Singular Propositions”, Canadian Journal of Philosophy, 25: 515–50.
  • –––, 2002, Beyond Rigidity, Oxford: Oxford University Press.
  • Sommers, F., 1984, The Logic of Natural Language, Oxford: Oxford University Press.
  • Stanley, J., 2000, “Context and Logical Form”, Linguistics and Philosophy, 23: 391–434.
  • Strawson, P., 1950, “On Referring”, Mind, 59: 320–44.
  • Tarski, A., 1933, “The Concept of Truth in Formalized Languages”, reprinted in Tarski 1983.
  • –––, 1944, “The Semantic Conception of Truth”, Philosophy and Phenomenological Research, 4: 341–75.
  • –––, 1983, Logic, Semantics, Metamathematics, J. Corcoran (ed.), J.H. Woodger (trans.), 2nd edition, Indianapolis: Hackett.
  • van Benthem, J., 1986, Essays in Logical Semantics, Dordrecht: D. Reidel.
  • Wiggins, D., 1980, “‘Most’ and ‘all’: some comments on a familiar programme, and on the logical form of quantified sentences”, in M. Platts (ed.), Reference, truth and reality: Essays on the philosophy of language, London: Routledge & Kegan Paul, pp. 318–346.
  • Wittgenstein, L., 1921, Tractatus Logico-Philosophicus, D. Pears and B. McGuinness (trans.), London: Routledge & Kegan Paul.
  • –––, 1953, Philosophical Investigations, New York: Macmillan.

Some Other Useful Works

A few helpful overviews of the history and basic subject matter of logic:

  • Kneale, W. & Kneale, M., 1962, The Development of Logic, Oxford: Oxford University Press; reprinted 1984.
  • Sainsbury, M., 1991, Logical Forms, Oxford: Blackwell.
  • Broadie, A., 1987, Introduction to Medieval Logic, Oxford: Oxford University Press.
  • For these purposes, Russell’s most important books are: Introduction to Mathematical Philosophy, London: George Allen and Unwin, 1919; Our Knowledge of the External World, New York: Norton, 1929; and The Philosophy of Logical Atomism, La Salle, IL: Open Court, 1985. Stephen Neale’s book Descriptions (Cambridge, MA: MIT Press, 1990) is a recent development of Russell’s theory.

For introductions to Transformational Grammar and Chomsky’s conception of natural language:

  • Radford, A., 1988, Transformational Grammar, Cambridge: Cambridge University Press.
  • Haegeman, L., 1994, Introduction to Government & Binding Theory, Oxford: Blackwell.
  • Lasnik, H. (with M. Depiante and A. Stepanov), 2000, Syntactic Structures Revisited, Cambridge, MA: MIT Press.

For discussions of work in linguistics bearing directly on issues of logical form:

  • Higginbotham, J., 1985, “On Semantics”, Linguistic Inquiry, 16: 547–93.
  • Hornstein, N., 1995, Logical Form: From GB to Minimalism, Oxford: Blackwell.
  • Larson, R. and Segal, G., 1995, Knowledge of Meaning, Cambridge, MA: MIT Press.
  • May, R., 1985, Logical Form: Its Structure and Derivation, Cambridge, MA: MIT Press.
  • Neale, S., 1993, “Grammatical Form, Logical Form, and Incomplete Symbols”, in A. Irvine & G. Wedeking (eds.), Russell and Analytic Philosophy, Toronto: University of Toronto Press, pp. 97–139.

For discussions of the Davidsonian program (briefly described in section nine) and the appeal to events:

  • Davidson, D., 1984, Inquiries into Truth and Interpretation, Oxford: Oxford University Press.
  • –––, 1985, “Adverbs of Action”, in B. Vermazen and M. Hintikka (eds.), Essays on Davidson: Actions and Events, Oxford: Clarendon Press, pp. 230–241.
  • Evans, G. & McDowell, J. (eds.), 1976, Truth and Meaning, Oxford: Oxford University Press.
  • Higginbotham, J., Pianesi, F. and Varzi, A. (eds.), 2000, Speaking of Events, Oxford: Oxford University Press.
  • Ludwig, K. (ed.), 2003, Contemporary Philosophers in Focus: Donald Davidson, Cambridge: Cambridge University Press.
  • Lycan, W., 1984, Logical Form in Natural Language, Cambridge, MA: MIT Press.
  • Parsons, T., 1990, Events in the Semantics of English, Cambridge, MA: MIT Press.
  • Pietroski, P., 2005, Events and Semantic Architecture, Oxford: Oxford University Press.
  • Taylor, B., 1985, Modes of Occurrence, Oxford: Blackwell.

Acknowledgments

The author would like to thank: Christopher Menzel, for spotting an error in an earlier characterization of the generalized quantifier ‘every’, prompting revision of the surrounding discussion; Karen Carter, Max Heiber, Claus Schlaberg, and David Korfmacher, for catching various typos in previous versions; and, for comments on the initial versions, Susan Dwyer, James Lesher, the editors and referees.

Copyright © 2021 by
Paul Pietroski <paul.pietroski@rutgers.edu>


The Stanford Encyclopedia of Philosophy iscopyright © 2023 byThe Metaphysics Research Lab, Department of Philosophy, Stanford University

Library of Congress Catalog Data: ISSN 1095-5054

