Stanford Encyclopedia of Philosophy Archive
Summer 2016 Edition

Logical Form

First published Tue Oct 19, 1999; substantive revision Mon Nov 30, 2015

Some inferences are impeccable. Examples like (1–3) illustrate reasoning that cannot lead from true premises to false conclusions.

(1) John danced if Mary sang, and Mary sang; so John danced.
(2) Every politician is deceitful, and every senator is a politician; so every senator is deceitful.
(3) The detective is in the garden; so someone is in the garden.

In such cases, a thinker takes no epistemic risk by endorsing the conditional claim that the conclusion is true if the premises are true. The conclusion follows from the premises, without any further assumptions that might turn out to be false. Any risk of error lies entirely with the premises, as opposed to the reasoning. By contrast, examples like (4–6) illustrate reasoning that involves at least some risk of going wrong—from correct premises to a mistaken conclusion.

(4) John danced if Mary sang, and John danced; so Mary sang.
(5) Every feathered biped is a bird, and Tweety is a feathered biped; so Tweety can fly.
(6) Every human born before 1879 died; so every human will die.

Inference (4) is not secure. John might dance whenever Mary sings, but also sometimes when Mary doesn't sing. Similarly, with regard to (5), Tweety might turn out to be a bird that cannot fly. Even (6) falls short of the demonstrative character exhibited by (1–3). While laws of nature may preclude immortality, the conclusion of (6) goes beyond its premise, even if it is foolish to resist the inference.

Appeals to logical form arose in the context of attempts to say more about this intuitive distinction between impeccable inferences, which invite metaphors of security, and inferences that involve some risk of slipping from truth to falsity. The idea is that some inferences, like (1–3), are structured in a way that confines any risk of error to the premises. The motivations for developing this idea were both practical and theoretical. Experience teaches us that an inference can initially seem more secure than it is; and if we knew which forms of inference are risk-free, that might help us avoid errors. As we'll see, claims about inference are also intimately connected with claims about the nature of thought and its relation to language.

Many philosophers have been especially interested in the possibility that grammar masks the underlying structure of thought, perhaps in ways that invite mistaken views about how ordinary language is related to cognition and the world we talk about. For example, similarities across sentences like ‘Odysseus arrived’, ‘Nobody arrived’, and ‘The king arrived’ initially suggest that the corresponding thoughts exhibit a common subject-predicate form. But even if ‘Odysseus’ indicates an entity that can be the subject of a thought that is true if and only if the entity in question arrived, other considerations suggest that ‘Nobody’ and ‘The king’ do not indicate subjects of thoughts in this sense. This raises further questions about inference—e.g., why ‘The king arrived’ implies an arrival, while ‘Nobody arrived’ does not—and more general questions about how logic is related to grammar. Do thoughts and sentences exhibit different kinds of structure? Do sentences exhibit grammatical structures that are not obvious? And if the logical structure of a thought can diverge from the grammatical structure of a sentence that is used to express the thought, how should we construe proposals about the logical forms of inferences like (1–6)? Are such proposals normative claims about how we ought to think/talk, or empirical hypotheses about aspects of psychological/linguistic reality?

Proposed answers to these questions are usually interwoven with claims about why various inferences seem compelling. So it would be nice to know which inferences really are secure, and in virtue of what these inferences are special. The most common suggestion has been that certain inferences are secure by virtue of their logical form. Though unsurprisingly, conceptions of form have evolved along with conceptions of logic and language.

1. Patterns of Reason

One ancient idea is that impeccable inferences exhibit patterns that can be characterized schematically by abstracting away from the specific contents of particular premises and conclusions, thereby revealing a general form common to many other impeccable inferences. Such forms, along with the inferences that exemplify them, are said to be valid.

Given a valid inference, there is a sense in which the premises contain the conclusion, which is correspondingly extractable from the premises. With regard to (1) and (7),

(1) John danced if Mary sang, and Mary sang; so John danced.
(7) Chris swam if Pat was asleep, and Pat was asleep; so Chris swam.

it seems especially clear that the conclusion is part of the first premise, and that the second premise is another part of the first. We can express this point by saying that these inferences are instances of the following form: B if A, and A; so B. The Stoics discussed several patterns of this kind, using ordinal numbers (instead of letters) to capture abstract forms like the ones shown below.

If the first then the second, and the first; so the second.

If the first then the second, but not the second; so not the first.

Either the first or the second, but not the second; so the first.

Not both the first and the second, but the first; so not the second.
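Each of these Stoic patterns is truth-functionally valid, and this can be checked mechanically. The sketch below is my illustration, not part of the original text; the helper name `valid` is mine. It brute-forces every assignment of truth values to "the first" and "the second", confirming that no assignment makes the premises true and the conclusion false.

```python
from itertools import product

def valid(premises, conclusion):
    """An inference form is valid iff every truth-value assignment
    that makes all premises true also makes the conclusion true."""
    return all(conclusion(p, q)
               for p, q in product([True, False], repeat=2)
               if all(prem(p, q) for prem in premises))

# If the first then the second, and the first; so the second.
assert valid([lambda p, q: (not p) or q, lambda p, q: p], lambda p, q: q)
# If the first then the second, but not the second; so not the first.
assert valid([lambda p, q: (not p) or q, lambda p, q: not q], lambda p, q: not p)
# Either the first or the second, but not the second; so the first.
assert valid([lambda p, q: p or q, lambda p, q: not q], lambda p, q: p)
# Not both the first and the second, but the first; so not the second.
assert valid([lambda p, q: not (p and q), lambda p, q: p], lambda p, q: not q)
```

The same helper rejects risky forms: affirming the consequent, for instance, fails the check because one assignment makes both premises true and the conclusion false.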

These schematic formulations require variables. And let us introduce ‘proposition’ as a term of art for whatever the variables above, indicated in bold, range over. Propositions are potential premises/conclusions. They can be endorsed or rejected, and they exhibit containment relations of some kind. So presumably, propositions are abstract things that can be evaluated for truth or falsity. This leaves it open what propositions are: sentences, statements, states of affairs, or whatever. But let's assume that declarative sentences can be used to express propositions. (For discussion, see Cartwright (1962) and the essay on structured propositions.)

A significant complication is that in ordinary conversation, the context matters with regard to which proposition is expressed with a given sentence. For example, ‘Pat is asleep’ can be used at one time to express a true premise, and at another time to express a false premise. A given speaker might use ‘I am tired’ to express a false proposition, while another speaker uses the same sentence at the same time to express a true proposition. What counts as being tired can also vary across conversations. Context sensitivity, of various kinds, is ubiquitous in ordinary discourse. Moreover, even given a context, a sentence like ‘He is bald’ may not express a unique proposition. (There may be no referent for the pronoun; and even if there is, the vagueness of ‘bald’ may yield a range of candidate propositions, with no fact of the matter as to which one is the proposition expressed.) Nonetheless, we can often use sentences like ‘Every circle is an ellipse’ and ‘Thirteen is a prime number’ to express premises of valid arguments. To be sure, ordinary conversation differs from theoretical discourse in mathematics. But the distinction between impeccable and risky inferences is not limited to special contexts in which we try to think especially clearly about especially abstract matters. So when focusing on the phenomenon of valid inference, we can try to simplify the initial discussion by abstracting away from the context sensitivity of language use.

Another complication is that in speaking of an inference, one might be talking about (i) a process in which a thinker draws a conclusion from some premises, or (ii) some propositions, one of which is designated as an alleged consequence of the others; see, e.g., Harman (1973). But we can describe a risky thought process as one in which a thinker who accepts certain propositions—perhaps tentatively or hypothetically—comes to accept, on that basis, a proposition that does not follow from the initial premises. And it will be simpler to focus on premises/conclusions, as opposed to episodes of reasoning.

With regard to (1), the inference seems secure in part because its first premise has the form ‘B if A’.

(1) John danced if Mary sang, and Mary sang; so John danced.

If the first premise didn't have this form, the inference wouldn't be an instance of ‘B if A, and A; so B’. It isn't obvious that all impeccable inferences are instances of a more general valid form, much less inferences whose impeccability is due to the forms of the relevant propositions. But this thought has served as an ideal for the study of valid inference, at least since Aristotle's treatment of examples like (2).

(2) Every senator is a politician, and every politician is deceitful; so every senator is deceitful.

Again, the first premise seems to have several parts, each of which is a part of the second premise or the conclusion. (In English, the indefinite article in ‘Every senator is a politician’ cannot be omitted; likewise for ‘Every politician is a liar’. But at least for now, let's assume that in examples like these, ‘a’ does not itself indicate a propositional constituent.) Aristotle, predating the Stoics, noted that conditional claims like the following are sure to be true: if (the property of) being a politician belongs to every senator, and being deceitful belongs to every politician, then being deceitful belongs to every senator. Correspondingly, the inference pattern below is valid.

Every S is P, and every P is D; so every S is D.

And inference (2) seems to be valid because its parts exhibit this pattern. Aristotle discussed many such forms of inference, called syllogisms, involving propositions that can be expressed with quantificational words like ‘every’ and ‘some’. For example, the syllogistic patterns below are also valid.

Every S is P, and some S is D; so some P is D.

Some S is P, and every P is D; so some S is D.

Some S is not P, and every D is P; so some S is not D.

We can rewrite the last two, so that each of the valid syllogisms above is represented as having a first premise of the form ‘Every S is P’.

Every S is P, and some D is S; so some D is P.

Every S is P, and some D is not P; so some D is not S.

But however the inferences are represented, the important point is that the variables—represented here in italics—range over certain parts of propositions. Intuitively, common nouns like ‘politician’ and adjectives like ‘deceitful’ are general terms, since they can apply to more than one individual. And many propositions apparently contain correspondingly general elements. For example, the proposition that every senator is deceitful contains two such elements, both relevant to the validity of inferences involving this proposition.

Propositions thus seem to have structure that bears on the validity of inferences, even ignoring premises/conclusions with propositional parts. That is, even simple propositions have logical form. And as Aristotle noted, pairs of such propositions can be related in interesting ways. If every S is P, then some S is P. (For these purposes, assume there is at least one S.) If no S is P, then some S is not P. It is certain that either every S is P or some S is not P; and whichever of these propositions is true, the other is false. Similarly, the following propositions cannot both be true: every S is P; and no S is P. But it isn't certain that either every S is P, or no S is P. Perhaps some S is P, and some S is not P. This network of logical relations strongly suggests that the propositions in question contain a quantificational element and two general elements—and in some cases, an element of negation. This raises the question of whether other propositions have a similar structure.
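This network of relations can be spot-checked over small models. The sketch below is my own illustration, not from the text; it assumes, with the text, that there is at least one S. A model assigns each of three individuals membership (or not) in S and in P, and the Aristotelian relations hold in every such model.

```python
from itertools import product

def models(n):
    """Yield every assignment of n individuals to S and P,
    keeping only models with at least one S (as the text assumes)."""
    for bits in product([(False, False), (False, True),
                         (True, False), (True, True)], repeat=n):
        if any(s for s, p in bits):
            yield bits

for m in models(3):
    every = all(p for s, p in m if s)          # every S is P
    some = any(p for s, p in m if s)           # some S is P
    no = not some                              # no S is P
    some_not = any(not p for s, p in m if s)   # some S is not P
    assert not every or some                   # every S is P -> some S is P
    assert not no or some_not                  # no S is P -> some S is not P
    assert every != some_not                   # contradictories: exactly one true
    assert not (every and no)                  # contraries: never both true
```

The final assertion leaves it open that neither ‘every S is P’ nor ‘no S is P’ is true, matching the text's observation that perhaps some S is P and some S is not P.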

2. Propositions and Traditional Grammar

Consider the proposition that Vega is a star, which can figure in inferences like (8).

(8) Every star is purple, and Vega is a star; so Vega is purple.

Aristotle's logic focused on quantificational propositions; and as we shall see, this was prescient. But on his view, propositions like the conclusion of (8) still exemplify a subject-predicate structure that is shared by at least many of the sentences we use to express propositions. And one can easily formulate the schema ‘every S is P, and n is S; so n is P’, where the new lower-case variable is intended to range over proposition-parts of the sort indicated by names. (On some views, discussed below, a name like ‘Vega’ is a complex quantificational expression; though unsurprisingly, such views are tendentious.)

Typically, a declarative sentence can be divided into a subject and a predicate: ‘Every star / is purple’, ‘Vega / is a star’, ‘Some politician / lied’, ‘The brightest planet / is visible tonight’, etc. Until quite recently, it was widely held that this grammatical division reflects a corresponding kind of logical structure: the subject of a proposition (i.e., what the proposition is about) is a target for predication. On this view, both ‘Every star’ and ‘Vega’ indicate subjects of propositions in (8), while ‘is’ introduces predicates. Aristotle would have said that in the premises of (8), being purple is predicated of every star, and being a star is predicated of Vega. Later theorists emphasized the contrast between general terms like ‘star’ and singular terms like ‘Vega’, while also distinguishing terms from syncategorematic expressions (e.g., ‘every’ and ‘is’) that can combine with terms to form complex subjects and predicates, including ‘will lie’, ‘can lie’, and ‘may have lied’. But despite the complications, it seemed clear that many propositions have the following canonical form: Subject-copula-Predicate; where a copula links a subject, which may consist of a quantifier and a general term, to a general term. Sentences like ‘Every star twinkles’ can be paraphrased with sentences like ‘Every star is a thing that does some twinkling’. This invites the suggestion that ‘twinkles’ is somehow an abbreviation for ‘is a thing that does some twinkling’, perhaps in the way that ‘bachelor’ is arguably short for ‘unmarried marriageable man’.

The proposition that not only Vega twinkles, which seems to contain the proposition that Vega twinkles, presumably includes elements that are indicated with ‘only’ and ‘not’. Such examples invite the hypothesis that all propositions are composed of terms along with a relatively small number of syncategorematic elements, and that complex propositions can be reduced to canonical propositions that are governed by Aristotelian logic. This is not to say that all propositions were, or could be, successfully analyzed in this manner. But via this strategy, medieval logicians were able to describe many impeccable inferences as instances of valid forms. And this informed their discussions of how logic is related to grammar.

Many viewed their project as an attempt to uncover principles of a mental language common to all thinkers. Aristotle had said, similarly, that spoken sounds symbolize “affections of the soul.” From this perspective, one expects a few differences between propositions and overt sentences. If ‘Every star twinkles’ expresses a proposition that contains a copula, then spoken languages mask certain aspects of logical structure. Ockham also held that a mental language would have no need for Latin's declensions, and that logicians could ignore such aspects of spoken language. The ancient Greeks were aware of sophisms like the following: that dog is a father, and that dog is yours; so that dog is your father. This bad inference cannot share its form with the superficially parallel but impeccable variant: that dog is a mutt, and that mutt is yours; so that dog is your mutt. (See Plato, Euthydemus 298 d-e.) So the superficial features of sentences are not infallible guides to the logical forms of propositions. Still, the divergence was held to be relatively minor. Spoken sentences have structure; they are composed, in systematic ways, of words. And the assumption was that sentences reflect the major aspects of propositional form, including a subject-predicate division. So while there is a distinction between the study of valid inference and the study of sentences used in spoken language, the connection between logic and grammar was thought to run deep. This suggested that the logical form of a proposition just is the grammatical form of some (perhaps mental) sentence.

3. Motivations for Revision

Towards the end of the eighteenth century, Kant could say (without much exaggeration) that logic had followed a single path since its inception, and that “since Aristotle it has not had to retrace a single step.” He also said that syllogistic logic was “to all appearance complete and perfect.” But this was exuberance. Indeed, some of the real successes highlighted known problems.

Some valid schemata are reducible to others, in that any inference of the reducible form can be revealed as valid (with a little work) given other schemata. Consider (9).

(9) If Al ran then either Al did not run or Bob did not swim, and Al ran; so Bob did not swim.

Assume that ‘Al did not run’ negates ‘Al ran’, while ‘Bob did not swim’ negates ‘Bob swam’. Then (9) is an instance of the following valid form: if A then either not-A or not-B, and A; so not-B. But we can treat this as a derived form, by showing that any instance of this form is valid given two (intuitively more basic) Stoic inference forms: if the first then the second, and the first, so the second; either not the first or not the second, and the first; so not the second. For suppose we are given the following premises: A; and if A, then either not-A or not-B. We can safely infer that either not-A or not-B; and since we were given that A, we can safely infer that not-B. Similarly, the syllogistic schema (10) can be treated as a derived form.

(10) Some S is not P, and every D is P; so not every S is D.

If some S is not P, and every D is P, then it isn't true that every S is D. For if every S is D, and every D is P, then every S is P. But if some S is not P, then as we saw above, not every S is P. So given the premises of (10), adding ‘every S is D’ would lead to contradiction: every S is P, and not every S is P. So the premises imply the negation of ‘every S is D’. This reasoning shows how (10) can be reduced to inferential patterns that seem more basic—raising the question of how much reduction is possible. Euclid's geometry had provided a model for how to present a body of knowledge as a network of propositions that follow from a few basic axioms. Aristotle himself indicated how to reduce all the valid syllogistic schemata to four basic patterns, given a few principles that govern how the basic patterns can be used to derive others; see Parsons (2014) for discussion. And further reduction is possible given insights from the medieval period.
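Both reductions can be checked mechanically. In the sketch below (my illustration; the helper names are mine), the steps of (9)'s derivation are verified truth-functionally, and (10) is then confirmed over every model with three individuals, with S, P, and D ranging over arbitrary subsets.

```python
from itertools import product

def tf_valid(premises, conclusion):
    """Truth-functional validity over two propositional variables."""
    return all(conclusion(a, b)
               for a, b in product([True, False], repeat=2)
               if all(p(a, b) for p in premises))

imp = lambda p, q: (not p) or q

# (9): each intermediate step, and the derived form itself.
cond = lambda a, b: imp(a, (not a) or (not b))
assert tf_valid([cond, lambda a, b: a],
                lambda a, b: (not a) or (not b))   # first Stoic form
assert tf_valid([lambda a, b: (not a) or (not b), lambda a, b: a],
                lambda a, b: not b)                # second Stoic form
assert tf_valid([cond, lambda a, b: a],
                lambda a, b: not b)                # the derived form

# (10): checked in every model with three individuals.
U = [0, 1, 2]

def subsets():
    for bits in product([False, True], repeat=len(U)):
        yield {x for x, b in zip(U, bits) if b}

for S in subsets():
    for P in subsets():
        for D in subsets():
            if (S - P) and D <= P:     # some S is not P; every D is P
                assert not (S <= D)    # so not every S is D
```

A three-element domain is of course only a spot check, not a proof of validity; the point is to make the two patterns of reduction concrete.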

Consider the following pair of valid inferences: Fido is a brown dog, so Fido is a dog; Fido is not a dog, so Fido is not a brown dog. As illustrated with the first example, replacing a predicate (or general term) like ‘brown dog’ with a less restrictive predicate like ‘dog’ is often valid. But sometimes—paradigmatically, in cases involving negation—replacing a predicate like ‘dog’ with a more restrictive predicate like ‘brown dog’ is valid. Plausibly, the first pattern reflects the default direction of valid replacement: removing a restriction preserves truth, except in special cases like those involving negation. Suppose we take it as given that poodles are dogs of a particular sort, and hence that every poodle is a dog. Then replacing ‘poodle’ with ‘dog’ in ‘Fido is P’ is valid, regardless of what ‘Fido’ names. This can be viewed as a special case of ‘n is P, and every P is D; so n is D’. But the validity of this inference form can also be viewed as a symptom of a basic principle that came to be called dictum de omni: whatever is true of every P is true of any P. Or as Aristotle might have put it, if the property of being a dog belongs to every poodle, then it belongs to any poodle. In which case, Fido is a dog if Fido is a poodle. And since the property of being a dog surely belongs to every brown dog, any brown dog is a dog. The flip side of this point is that negation inverts the default direction of inference. Anything that isn't a dog isn't a brown dog; and similarly, if Fido isn't a dog, Fido isn't a poodle. So in special cases, adding a restriction to a general term like ‘dog’ can preserve truth.

From this perspective, the Aristotelian quantifier ‘Some’ is a default-style quantifier that validates removing restrictions. If some brown dog is a clever mutt, it follows that some dog is a clever mutt, and hence that some dog is a mutt. By contrast, ‘No’ is an inverted-style quantifier that validates adding restrictions. If no dog is a mutt, it follows that no dog is a clever mutt, and hence that no brown dog is a clever mutt. The corresponding principle, dictum de nullo, encodes this pattern: whatever is true of no P is not true of any P; so if the property of being a mutt belongs to no dog, it belongs to no poodle. (And as Aristotle noted, instances of ‘No S is P’ can be analyzed as the propositional negations of corresponding instances of ‘Some S is P’.)

Interestingly, ‘Every’ is like ‘No’ in one respect, and like ‘Some’ in another respect. If every dog is clever, it follows that every brown dog is clever; but if every dog is a clever mutt, it follows that every dog is a mutt. So when the universal quantifier combines with a general term S to form a subject, S is governed by the inverted rule of replacement. But when a universally quantified subject combines with a second general term to form a proposition, this second term is governed by the default rule of replacement. Given that ‘Every’ has this mixed logical character, the valid syllogisms can be derived from two basic patterns (noted above), both of which reflect dictum de omni: whatever is true of every P is true of any P.

Every S is P, and every P is D; so every S is D.

Every S is P, and some D is S; so some D is P.

The first principle reflects the sense in which universal quantification is transitive. The second principle captures the idea that a universal premise can license replacement of ‘S’ with ‘P’ in a premise about a specific individual. In this sense, classical logic exhibits a striking unity and simplicity, at least with regard to inferences involving the Aristotelian quantifiers and predication; see Sommers (1984) and Ludlow (2005), drawing on Sanchez (1991), for further discussion.
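The two basic patterns can at least be spot-checked model-theoretically. The sketch below is my own illustration (the helper name `subsets` is mine): it verifies both patterns in every model on a three-element domain, with S, P, and D ranging over arbitrary subsets.

```python
from itertools import product

def subsets(universe):
    """Yield every subset of the given universe."""
    for bits in product([False, True], repeat=len(universe)):
        yield {x for x, b in zip(universe, bits) if b}

U = [0, 1, 2]
for S in subsets(U):
    for P in subsets(U):
        for D in subsets(U):
            if S <= P and P <= D:      # every S is P, and every P is D
                assert S <= D          # so every S is D
            if S <= P and (D & S):     # every S is P, and some D is S
                assert D & P           # so some D is P
```

The first check is the transitivity of subset inclusion; the second takes a witness in both D and S and notes that, since S is included in P, that witness is in both D and P.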

Alas, matters become more complicated once we consider relations.

Sentences like ‘Juliet kissed Romeo’ do not seem to have Subject-copula-Predicate form. One might suggest ‘Juliet was a kisser of Romeo’ as a paraphrase. But ‘kisser of Romeo’ differs, in ways that matter to inference, from general terms like ‘politician’. If Juliet (or anyone) was a kisser of Romeo, it follows that someone was kissed; whereas if Juliet was a politician, there is no corresponding logical consequence to the effect that someone was __-ed. Put another way, the proposition that Juliet kissed someone exhibits interesting logical structure, even if we can express this proposition via the sentence ‘Juliet was a kisser of someone’. A quantifier can be part of a complex predicate. But classical logic did not capture the validity of inferences involving predicates that have quantificational constituents. Consider (11).

(11) Some patient respects every doctor, and some doctor is a liar; so some patient respects some liar.

If ‘respects every doctor’ and ‘respects some liar’ indicate nonrelational proposition-parts, much like ‘is sick’ or ‘is happy’, then inference (11) has the following form: ‘Some P is S, and some D is L; so some P is H’. But this schema, which fails to reflect the quantificational structure within the predicates, is not valid. Its instances include bad inferences like the following: some patient is sick, and some doctor is a liar; so some patient is happy. This dramatizes the point that ‘respects every doctor’ and ‘respects some liar’ are—unlike ‘is sick’ and ‘is tall’—logically related in a way that matters given the middle premise of (11).
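The contrast can be made concrete. In the sketch below (my own illustration; the set and helper names are mine), a single countermodel refutes the flattened schema, while (11), with the ‘respects’ relation made explicit, holds in every model with two individuals.

```python
from itertools import product

U = [0, 1]

def subsets():
    for bits in product([False, True], repeat=len(U)):
        yield {x for x, b in zip(U, bits) if b}

pairs = list(product(U, U))

def relations():
    """Yield every binary relation on U (every subset of U x U)."""
    for bits in product([False, True], repeat=len(pairs)):
        yield {pr for pr, b in zip(pairs, bits) if b}

# Flattened schema 'Some P is S, and some D is L; so some P is H':
# a countermodel with both premises true and the conclusion false.
P, S, D, L, H = {0}, {0}, {1}, {1}, set()
assert (P & S) and (D & L) and not (P & H)

# (11) with the relation explicit holds in every two-individual model.
for Pat in subsets():
    for Doc in subsets():
        for Liar in subsets():
            for R in relations():
                p1 = any(all((x, d) in R for d in Doc) for x in Pat)  # some patient respects every doctor
                p2 = bool(Doc & Liar)                                 # some doctor is a liar
                if p1 and p2:
                    # so some patient respects some liar
                    assert any((x, y) in R for x in Pat for y in Liar)
```

Again, exhausting two-individual models is only a spot check; but the countermodel alone suffices to show the flattened schema invalid.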

One can adopt the view that many propositions have relational parts, introducing a variable ‘R’ intended to range over relations; see the entries on medieval relations, and medieval terms. One can also formulate the following schema: some P R every D, and some D is L; so some P R some L. But the problem remains. Quantifiers can appear in complex predicates that figure in valid inferences like (12).

(12) Every patient who respects every doctor is sick, and some patient who saw every lawyer respects every doctor; so some patient who saw every lawyer is sick.

But if ‘patient who respects every doctor’ and ‘patient who saw every lawyer’ are nonrelational, much like ‘old patient’ or ‘young patient’, then (12) has the following form: every O is S, and some Y R every D; so some Y is S. And many inferences of this form are invalid. For example: every otter is sick, and some yak respects every doctor; so some yak is sick. Again, one can abstract a valid schema that covers (12), letting parentheses indicate a relative clause that restricts the adjacent predicate.

Every P (R1 every D) is S, and some P (R2 every L) R1 every D; so some P (R2 every L) is S.

But no matter how complex the schema, the relevant predicates can exhibit further quantificational structure. (Consider the proposition that every patient who met some doctor who saw no lawyer respects some lawyer who saw no patient who met every doctor.) Moreover, schemata like the one above are poor candidates for basic inference patterns.

As medieval logicians knew, propositions expressed with relative clauses also pose other difficulties; see the entry on medieval syllogism. If every doctor is healthy, it follows that every young doctor is healthy. By itself, this is expected, since a universally quantified subject is governed by the non-default (de nullo) inference rule that licenses replacement of ‘doctor’ with the more restrictive ‘young doctor’. But consider (13) and (14).

(13) No patient who saw every young doctor is healthy.
(14) No patient who saw every doctor is healthy.

Here, the direction of valid inference is from ‘young doctor’ to ‘doctor’, as if the inference is governed by the default (de omni) inferential rule. One can say that the default direction of implication, from more restrictive to less restrictive predicates, has been inverted twice—once by ‘No’, and once by ‘every’. But one wants a systematic account of propositional structure that explains the net effect; see Ludlow (2002) for further discussion. Sommers (1982) offers a strategy for recoding and extending classical logic, in part by exploiting an idea suggested by Leibniz (and arguably Panini): a relational sentence like ‘Juliet loved Romeo’ somehow combines an active-voice sentence with a passive-voice sentence, perhaps along the lines of ‘Juliet loved, and thereby Romeo was loved’; cp. section nine below. But if impeccability is to be revealed as a matter of form, then one way or another, quantifiers need to be characterized in a way that captures their general logical role—and not just their role as potential subjects of Aristotelian propositions. Quantifiers are not simply devices for creating schemata like ‘Every S is P’, into which general terms like ‘politician’ and ‘deceitful’ can be inserted. Instances of ‘S’ and ‘P’ can themselves have quantificational structure and relational constituents.

4. Frege and Formal Language

Frege showed how to resolve these difficulties for classical logic in one fell swoop. His system of logic, published in 1879 and still in use (with notational modifications), was arguably the single greatest contribution to the subject. So it is significant that on Frege's view, propositions do not have subject-predicate form. His account required a substantial distinction between logical form and grammatical form as traditionally conceived. It is hard to overemphasize the impact of this point on subsequent discussions of thought and its relation to language.

Frege's leading idea was that propositions have “function-argument” structure. Though for Frege, functions are not abstract objects. In particular, while a function maps each entity in some domain onto exactly one entity in some range, Frege (1891) does not identify functions with sets of ordered pairs. On the contrary, he says that a function “by itself must be called incomplete, in need of supplementation, or unsaturated. And in this respect functions differ fundamentally from numbers” (p. 133). For example, we can represent the successor function as follows, with the integers as the relevant domain for the variable ‘x’: S(x) = x + 1. This function maps zero onto one, one onto two, and so on. We can specify a corresponding object—e.g., the set {⟨x, y⟩: y = x + 1}—as the “value-range” of the successor function. But according to Frege, any particular argument (e.g., the number one) “goes together with the function to make up a complete whole” (e.g., the number two); and a number does not go together with a set in this fashion. Put another way, while each number is an object, a mapping from numbers to numbers is not an additional object in Frege's sense. As Frege noted, the word ‘function’ is often used to talk about what he would call the value-range of a function. But he maintained that the notion of an unsaturated function, which may be applied to endlessly many arguments, is “logically prior” to any notion of a set with endlessly many elements that are specified functionally as in {⟨x, y⟩: y = x + 1}; see p. 135, note E.

Functions need not be unary. For example, arithmetic division can be represented as a function from ordered pairs of numbers onto quotients: Q(x, y) = x/y. Mappings can also be conditional. Consider the function that maps every even integer onto itself, and every odd integer onto its successor: C(x) = x if x is even, and x + 1 otherwise; C(1) = 2, C(2) = 2, C(3) = 4, etc. Frege held that propositions have parts that indicate functions, and in particular, conditional functions that map arguments onto special values that reflect the truth or falsity of propositions/sentences. (As discussed below, Frege [1892] also distinguished these “truth values” from what he called Thoughts [Gedanken] or the “senses” [Sinne] of propositions; where each of these sentential senses “presents” a truth value in a certain way—i.e., as the value of a certain indicated function given a certain indicated argument.)
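The example functions can be transcribed directly. The definitions below are my rendering of S(x) = x + 1, Q(x, y) = x/y, and the conditional mapping C, with the text's sample values checked at the end.

```python
def S(x):
    """Successor: S(x) = x + 1."""
    return x + 1

def Q(x, y):
    """Division as a binary function: Q(x, y) = x/y."""
    return x / y

def C(x):
    """Conditional mapping: x if x is even, and x + 1 otherwise."""
    return x if x % 2 == 0 else x + 1

assert S(0) == 1 and S(1) == 2
assert Q(4, 2) == 2
assert C(1) == 2 and C(2) == 2 and C(3) == 4
```

On Frege's view, of course, these defined procedures at best denote the relevant value-ranges; the unsaturated functions themselves are not objects of any kind.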

Variable letters, such as ‘x’ and ‘y’ in ‘Q(x, y) = x/y’, are typographically convenient for representing functions that take more than one argument. But we could also index argument places, as shown below.

Q[( )i, ( )j] = ( )i / ( )j

Or we could replace the subscripts above with lines that connect each pair of round brackets on the left of ‘=’ to a corresponding pair of brackets on the right. But the idea, however we encode it, is that a proposition has at least one constituent that is saturated by the requisite number of arguments. (If it helps, think of an unsaturated proposition-part as the result of abstracting away from one or more arguments in a complete proposition.) Frege was here influenced by Kant's discussion of judgment, and the ancient observation that merely combining two things does not make the combination truth-evaluable. So in saying that propositions have “function-argument” structure, Frege was not only rejecting the traditional idea that logical form reflects the “subject-predicate” structure of ordinary sentences, he was suggesting that propositions exhibit a special kind of unity: unlike a mere concatenation of objects, a potential premise/conclusion is formed by saturating an unsaturated mapping with a suitable argument.

On Frege's view, the proposition that Mary sang has a functional component indicated by ‘sang’ and an argument indicated by ‘Mary’, even if the English sentence ‘Mary sang’ has ‘Mary’ as its subject and ‘sang’ as its predicate. The proposition can be represented as follows: Sang(Mary). Frege thought of the relevant function as a conditional mapping from individuals to truth values: Sang(x) = T if x sang, and F otherwise; where ‘T’ and ‘F’ stand for special entities such that for each individual x, Sang(x) = T if and only if x sang, and Sang(x) = F if and only if x did not sing. According to Frege, the proposition that John admires Mary combines an ordered pair of arguments with a functional component indicated by the transitive verb: Admires(John, Mary); where for any individual x, and any individual y, Admires(x, y) = T if x admires y, and F otherwise. From this perspective, the structure and constituents are the same in the proposition that Mary is admired by John, even though ‘Mary’ is the grammatical subject of the passive sentence. Likewise, Frege did not distinguish the proposition that three precedes four from the proposition that four is preceded by three. More importantly, Frege's treatment of quantified propositions departs radically from the traditional idea that the grammatical structure of a sentence reflects the logical structure of the indicated proposition.

If S is the function indicated by ‘sang’, then Mary sang iff—i.e., if and only if—S(Mary) = T. Likewise, someone sang iff: S maps some individual onto T; that is, for some individual x, S(x) = T. Or using a modern variant of Frege's original notation, someone sang iff ∃x[S(x)]. The quantifier ‘∃x’ is said to bind the variable ‘x’, which ranges over individual things in a domain of discourse. (For now, assume that the domain contains only people.) If every individual in the domain sang, then S maps every individual onto the truth value T; or using formal notation, ∀x[S(x)]. A quantifier binds each occurrence of its variable, as in ‘∃x[P(x) & D(x)]’, which reflects the logical form of ‘Someone is both a politician and deceitful’. In this last example, the quantifier combines with a complex predicate formed by conjoining two simpler predicates.

With regard to the proposition that some politician is deceitful, traditional grammar suggests the division ‘Some politician / is deceitful’, with the noun ‘politician’ forming a constituent with the quantificational word. But on a Fregean view, grammar masks the logical division between the existential quantifier and the rest: ∃x[P(x) & D(x)]. With regard to the proposition that every politician is deceitful, Frege also stresses the logical division between the quantifier and its scope: ∀x[P(x) → D(x)]; every individual is deceitful if a politician. Here too, the quantifier combines with a complex predicate, albeit a conditional rather than conjunctive predicate. (The formal sentence ‘∀x[P(x) & D(x)]’ implies, unconditionally, that every individual is a politician.) As Frege (1879) defined his analogs of the relevant modern symbols used here, ‘P(x) → D(x)’ is equivalent to ‘¬P(x) ∨ D(x)’, and ‘∀x’ is equivalent to ‘¬∃x¬’. So ‘∀x[P(x) → D(x)]’ is equivalent to ‘¬∃x¬[¬P(x) ∨ D(x)]’; and given de Morgan's Laws (concerning the relations between negation, disjunction, and conjunction), ¬∃x¬[¬P(x) ∨ D(x)] iff ¬∃x[P(x) & ¬D(x)]. Hence, ∀x[P(x) → D(x)] iff ¬∃x[P(x) & ¬D(x)]. This captures the idea that every politician is deceitful iff no individual is both a politician and not deceitful.
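The equivalence between ‘∀x[P(x) → D(x)]’ and ‘¬∃x[P(x) & ¬D(x)]’ can be verified mechanically on finite domains. The following sketch (in Python; the three-element domain and the use of sets as predicate extensions are illustrative choices, not part of Frege's apparatus) checks both formalizations against every possible interpretation of the two predicates:

```python
from itertools import chain, combinations

# An arbitrary finite domain, chosen only for illustration.
domain = ['a', 'b', 'c']

def subsets(xs):
    """All subsets of xs, serving as candidate predicate extensions."""
    return [set(s) for s in chain.from_iterable(
        combinations(xs, r) for r in range(len(xs) + 1))]

def forall_conditional(P, D):
    # ∀x[P(x) → D(x)]: every individual is deceitful if a politician.
    return all((x not in P) or (x in D) for x in domain)

def no_counterexample(P, D):
    # ¬∃x[P(x) & ¬D(x)]: no individual is a politician and not deceitful.
    return not any(x in P and x not in D for x in domain)

# The two formalizations agree on every interpretation of 'P' and 'D'.
assert all(forall_conditional(P, D) == no_counterexample(P, D)
           for P in subsets(domain) for D in subsets(domain))
```

This exhaustive check is not a proof of the general equivalence, but it illustrates why the conditional rendering of ‘every’ and the conjunctive rendering of ‘some’ fit together as de Morgan duals.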

If this conception of logical form is correct, then grammar is misleading in several respects. First, grammar leads us to think that ‘some politician’ indicates a constituent of the proposition that some politician is deceitful. Second, grammar masks a difference between existential and universally quantified propositions; predicates are related conjunctively in the former, and conditionally in the latter. (Though as discussed in section seven, one can—and Frege [1884] did—adopt a different view that allows for relational/restricted quantifiers, as in ‘∀x:P(x)[D(x)]’.)

More importantly, Frege's account was designed to apply equally well to propositions involving relations and multiple quantifiers. And with regard to these propositions, there seems to be a big difference between logical structure and grammatical structure.

On Frege's view, a single quantifier can bind an unsaturated position that is associated with a function that takes a single argument. But it is equally true that two quantifiers can bind two unsaturated positions associated with a function that takes a pair of arguments. For example, the proposition that everyone likes everyone can be represented with the formal sentence ‘∀x∀y[L(x, y)]’. Assuming that ‘Romeo’ and ‘Juliet’ indicate arguments, it follows that Romeo likes everyone, and that everyone likes Juliet—∀y[L(r, y)] and ∀x[L(x, j)]. And it follows from all three propositions that Romeo likes Juliet: L(r, j). The rules of inference for Frege's logic capture this general feature of the universal quantifier. A variable bound by a universal quantifier can be replaced with a name for some individual in the domain. Correlatively, a name can be replaced with a variable bound by an existential quantifier. Given that Romeo likes Juliet, it follows that someone likes Juliet, and Romeo likes someone. Frege's formalism can capture this as well: L(r, j); so ∃x[L(x, j)] & ∃x[L(r, x)]. And given either conjunct in the conclusion, it follows that someone likes someone: ∃x∃y[L(x, y)]. A single quantifier can also bind multiple argument positions, as in ‘∃x[L(x, x)]’, which is true iff someone likes herself. Putting these points schematically: ∀x(…x…), so …n…; and …n…, so ∃x(…x…).
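These instantiation and generalization patterns can be model-checked directly. Here is a minimal Python sketch (the two-person domain and the extension of ‘likes’ are invented for illustration) in which the universal claim, its instance, and the existential generalization line up as the text describes:

```python
# Invented finite model: a two-person domain where everyone likes everyone.
domain = {'romeo', 'juliet'}
likes = {(x, y) for x in domain for y in domain}

# ∀x∀y[L(x, y)]
everyone_likes_everyone = all((x, y) in likes for x in domain for y in domain)
# L(r, j): the result of universal instantiation
romeo_likes_juliet = ('romeo', 'juliet') in likes
# ∃x∃y[L(x, y)]: the result of existential generalization
someone_likes_someone = any((x, y) in likes for x in domain for y in domain)
# ∃x[L(x, x)]: one quantifier binding both argument positions
someone_likes_herself = any((x, x) in likes for x in domain)

# For booleans, p <= q encodes 'p implies q'; the chain mirrors
# ∀x∀y[L(x, y)], so L(r, j); and L(r, j), so ∃x∃y[L(x, y)].
assert everyone_likes_everyone <= romeo_likes_juliet <= someone_likes_someone
assert someone_likes_herself
```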

Mixed quantification introduces an interesting wrinkle. The propositions expressed with ‘∃x∀y[L(x, y)]’ and ‘∀y∃x[L(x, y)]’ differ. We can paraphrase the first as ‘there is someone who likes everyone’ and the second as ‘everyone is liked by someone or other’. The second follows from the first, but not vice versa. This suggests that ‘someone likes everyone’ is ambiguous, in that this string of English words can be used to express two different propositions. This in turn raises difficult questions about what natural language expressions are, and how they can be used to express propositions; see section eight. But for Frege, the important point concerned the distinction between the propositions (Gedanken). Similar remarks apply to ‘∀x∃y[L(x, y)]’ and ‘∃y∀x[L(x, y)]’.
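The asymmetry between the two scopes can be exhibited with a small countermodel. In the following sketch (the two-element domain and the extension of ‘likes’ are invented for illustration), ‘∀y∃x[L(x, y)]’ comes out true while ‘∃x∀y[L(x, y)]’ comes out false:

```python
# Invented model: each individual likes itself and nothing else.
domain = {1, 2}
likes = {(1, 1), (2, 2)}

# ∃x∀y[L(x, y)]: someone likes everyone -- false in this model.
exists_forall = any(all((x, y) in likes for y in domain) for x in domain)
# ∀y∃x[L(x, y)]: everyone is liked by someone or other -- true here.
forall_exists = all(any((x, y) in likes for x in domain) for y in domain)

assert forall_exists and not exists_forall
```

Since no such countermodel exists in the other direction, ‘∀y∃x[L(x, y)]’ does follow from ‘∃x∀y[L(x, y)]’: a single individual who likes everyone witnesses the existential claim for each choice of y.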

A related phenomenon is exhibited by ‘John danced if Mary sang and Chris slept’. Is the intended proposition of the form ‘(A if B) and C’ or ‘A if (B and C)’? Indeed, it seems that the relation between word-strings and propositions expressed is often one-to-many. Is someone who says ‘The artist drew a club’ talking about a sketch or a card game? One can use ‘is’ to express identity, as in ‘Hesperus is the planet Venus’; but in ‘Hesperus is bright’, ‘is’ indicates predication. In ‘Hesperus is a planet’, ‘a’ seems to be logically inert; yet in ‘John saw a planet’, ‘a’ seems to indicate existential quantification: ∃x[P(x) & S(j, x)]. (One can render ‘Hesperus is a planet’ as ‘∃x[P(x) & h = x]’. But this treats ‘is a planet’ as importantly different from ‘is bright’; and this leads to other difficulties.) According to Frege, such ambiguities provide further evidence that natural language is not suited to the task of representing propositions and inferential relations perspicuously. And he wanted a language that was suited for this task. (Leibniz and others had envisioned a “Characteristica Universalis”, but without detailed proposals for how to proceed beyond syllogistic logic in creating one.) This is not to deny that natural language is well suited for other purposes, perhaps including efficient human communication. And Frege held that we often do use natural language to express propositions. But he suggested that natural language is like the eye, whereas a good formal language is like a microscope that reveals structure not otherwise observable. On this view, the logical form of a proposition is made manifest by the structure of a sentence in an ideal formal language—what Frege called a Begriffsschrift (concept-script); where the sentences of such a language exhibit function-argument structures that differ in kind from the grammatical structures exhibited by the sentences we use in ordinary communication.

The real power of Frege's strategy for representing propositional structure is most evident in his discussions of proofs by induction, the Dedekind-Peano axioms for arithmetic, and how the proposition that every number has a successor is logically related to more basic truths of arithmetic; see the entry on Frege's theorem and foundations for arithmetic. But without getting into these details, one can get a sense of Frege's improvement on previous logic by considering (15–16) and Fregean analyses of the corresponding propositions.

(15) Every patient respects some doctor

∀x{P(x) → ∃y[D(y) & R(x, y)]}

(16) Every old patient respects some doctor

∀x{[O(x) & P(x)] → ∃y[D(y) & R(x, y)]}

Suppose that every individual has the following conditional property: if he (x) is a patient, then some individual is such that she (y) is both a doctor and respected by him (x). Then it follows—intuitively and given the rules of Frege's logic—that every individual (x) has the following conditional property: if he (x) is both old and a patient, then some individual (y) is such that she (y) is both a doctor and respected by him (x). So the proposition expressed with (16) follows from the one expressed with (15). More interestingly, we can also account for why the proposition expressed with (14) follows from the one expressed with (13).
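The entailment from (15) to (16) can be confirmed by brute force over small models. The sketch below (in Python; the two-element domain and the use of sets as predicate extensions are illustrative choices) checks every interpretation of the predicate letters:

```python
from itertools import chain, combinations, product

domain = [0, 1]

def subsets(xs):
    """All subsets of xs, serving as candidate extensions."""
    return [set(s) for s in chain.from_iterable(
        combinations(xs, r) for r in range(len(xs) + 1))]

def f15(P, O, D, R):
    # ∀x{P(x) → ∃y[D(y) & R(x, y)]}  (O plays no role in (15))
    return all(x not in P or any(y in D and (x, y) in R for y in domain)
               for x in domain)

def f16(P, O, D, R):
    # ∀x{[O(x) & P(x)] → ∃y[D(y) & R(x, y)]}
    return all(not (x in O and x in P) or
               any(y in D and (x, y) in R for y in domain)
               for x in domain)

pairs = [(x, y) for x in domain for y in domain]

# On every model, (15) entails (16): being old-and-a-patient is more
# demanding than being a patient, so the universal claim only gets easier.
assert all(f15(P, O, D, R) <= f16(P, O, D, R)
           for P, O, D in product(subsets(domain), repeat=3)
           for R in subsets(pairs))
```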

(13) No patient who saw every young doctor is healthy

¬∃x{P(x) & ∀y[(Y(y) & D(y)) → S(x, y)] & H(x)}

(14) No patient who saw every doctor is healthy

¬∃x{P(x) & ∀y[D(y) → S(x, y)] & H(x)}

For suppose it is false that some individual has the following conjunctive property: he (x) is a patient; and he (x) saw every young doctor (i.e., every individual (y) is such that if she (y) is a young doctor, then he (x) saw her (y)); and he (x) is healthy. Then intuitively, and also given the rules of Frege's logic, it is false that some individual has the following conjunctive property: he (x) is a patient; and he (x) saw every doctor; and he (x) is healthy. This explains why the direction of valid inference is from the more restrictive ‘young doctor’ in (13) to the less restrictive ‘doctor’ in (14), despite the fact that in simpler cases, replacing ‘every doctor’ with ‘every young doctor’ is valid. More generally, Frege's logic handles a wide range of inferences that had puzzled medieval logicians. But the Fregean logical forms seem to differ dramatically from the grammatical forms of sentences like (13–16). Frege concluded that we need a Begriffsschrift, distinct from the languages we naturally speak, in order to depict (and help us discern) the structures of the propositions we express by using natural languages.
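The harder inference from (13) to (14) yields to the same brute-force treatment. In this sketch (again with an illustrative two-element domain and sets as extensions), every interpretation that verifies the Fregean form of (13) also verifies that of (14):

```python
from itertools import chain, combinations, product

domain = [0, 1]
pairs = [(x, y) for x in domain for y in domain]

def subsets(xs):
    """All subsets of xs, serving as candidate extensions."""
    return [set(s) for s in chain.from_iterable(
        combinations(xs, r) for r in range(len(xs) + 1))]

def f13(P, Y, D, H, S):
    # ¬∃x{P(x) & ∀y[(Y(y) & D(y)) → S(x, y)] & H(x)}
    return not any(x in P and x in H and
                   all(not (y in Y and y in D) or (x, y) in S for y in domain)
                   for x in domain)

def f14(P, Y, D, H, S):
    # ¬∃x{P(x) & ∀y[D(y) → S(x, y)] & H(x)}
    return not any(x in P and x in H and
                   all(y not in D or (x, y) in S for y in domain)
                   for x in domain)

# Seeing every doctor entails seeing every young doctor, so any healthy
# patient who witnesses against (14) also witnesses against (13).
assert all(f13(P, Y, D, H, S) <= f14(P, Y, D, H, S)
           for P, Y, D, H in product(subsets(domain), repeat=4)
           for S in subsets(pairs))
```

The check makes the monotonicity pattern concrete: strengthening the restrictor inside the scope of the negated existential reverses the usual direction of inference.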

Frege also made a different kind of contribution, which would prove important, to the study of propositions. In early work, he spoke as though propositional constituents were the relevant functions and (ordered n-tuples of) entities that such functions map to truth-values. But he later refined this view in light of his distinction between Sinn and Bedeutung (see the entry on Gottlob Frege). The Sinn of an expression was said to be a “way of presenting” the corresponding Bedeutung, which might be an entity, a truth-value, or a function from (ordered n-tuples of) entities to truth-values. The basic idea is that two names, like ‘Hesperus’ and ‘Phosphorus’, can present the same Bedeutung in different ways; in which case, the Sinn of the first name differs from the Sinn of the second. Given this distinction, we can think of ‘Hesperus’ as an expression that presents the evening star (a.k.a. Venus) as such, while ‘Phosphorus’ presents the morning star (also a.k.a. Venus) in a different way. Likewise, we can think of ‘is bright’ as an expression that presents a certain function in a certain way, and ‘Hesperus is bright’ as a sentence that presents its truth-value in a certain way—i.e., as the value of the function in question given the argument in question. From this perspective, propositions are sentential ways of presenting truth-values, and proposition-parts are subsentential ways of presenting functions and arguments. Frege could thus distinguish the proposition that Hesperus is bright from the proposition that Phosphorus is bright, even though the two propositions are alike with regard to the relevant function and argument. Likewise, he could distinguish the trivial proposition that Hesperus is Hesperus from the (apparently nontrivial) proposition that Hesperus is Phosphorus. This is an attractive view. For intuitively, ancient astronomers were correct not to regard the inference ‘Hesperus is Hesperus, so Hesperus is Phosphorus’ as an instance of the following valid schema: A, so A. But this raised questions about what the Sinn of an expression really is, what “presentation” could amount to, and what to say about a name with no Bedeutung.

5. Descriptions and Analysis

Frege did not distinguish (or at least did not emphasize any distinction between) names like ‘John’ and descriptions like ‘the boy’ or ‘the tall boy from Canada’. Initially, both kinds of expression seem to indicate arguments, as opposed to functions. So one might think that the logical form of ‘The boy sang’ is simply ‘S(b)’, where ‘b’ is an unstructured symbol that stands for the boy in question (and presents him in a certain way). But this makes the elements of a description logically irrelevant. And this seems wrong. If the tall boy from Canada sang, then some boy from Canada sang. Moreover, ‘the’ implies uniqueness in a way that ‘some’ does not. Of course, one can say ‘The boy sang’ without denying that the universe contains more than one boy. But likewise, in ordinary conversation, one can say ‘Everything is in the trunk’ without denying that the universe contains some things not in the trunk. And intuitively, a speaker who uses ‘the’ does imply that the adjacent predicate is satisfied by exactly one contextually relevant thing.

Bertrand Russell held that these implications reflect the logical form of a proposition expressed (in a given context) with a definite description. On his view, ‘The boy sang’ has the following logical form: ∃x{Boy(x) & ∀y[Boy(y) → y = x] & S(x)}; some individual (x) is such that he (x) is a boy, and every (relevant) individual (y) is such that if he (y) is a boy, then he (y) is identical with him (x), and he (x) sang. The awkward middle conjunct was Russell's way of expressing uniqueness with Fregean tools; cf. section seven. But rewriting the middle conjunct would not affect Russell's technical point, which is that ‘the boy’ does not correspond to any constituent of the formalism. This in turn reflects Russell's central claim—viz., that while a speaker may refer to a certain boy in saying ‘The boy sang’, the boy in question is not a constituent of the proposition indicated. According to Russell, the proposition has the form of an existential quantification with a bound variable. It does not have the form of a function saturated by (an argument that is) the boy referred to. The proposition is general rather than singular. In this respect, ‘the boy’ is like ‘some boy’ and ‘every boy’; though on Russell's view, not even ‘the’ indicates a constituent of the proposition expressed.

This extended Frege's idea that natural language misleads us about the structure of the propositions we assert. Russell went on to apply this hypothesis to what became a famous puzzle. Even though France is currently kingless, ‘The present king of France is bald’ can be used to express a proposition. The sentence is not meaningless; it has implications. So if the proposition consists of the function indicated with ‘Bald( )’ and an argument indicated with ‘The present king of France’, there must be an argument so indicated. But appeal to nonexistent kings is, to say the least, dubious. Russell concluded that ‘The present king of France is bald’ expresses a quantificational proposition: ∃x{K(x) & ∀y[K(y) → y = x] & B(x)}; where K(x) = T iff x is a present king of France, and B(x) = T iff x is bald. (For present purposes, set aside worries about the vagueness of ‘bald’.) And as Russell noted, the following contrary reasoning is spurious: every proposition is true or false; so the present king of France is bald or not; so there is a king of France, and he is either bald or not. For let P be the proposition that the king of France is bald. Russell held that P is indeed true or false. On his view, it is false. Given that ¬∃x[K(x)], it follows that ¬∃x{K(x) & ∀y[K(y) → y = x] & B(x)}. But it does not follow that there is a present king of France who is either bald or not. Given that ¬∃x[K(x)], it hardly follows that ∃x{K(x) & [B(x) ∨ ¬B(x)]}. So we must not confuse the negation of P with the following false proposition: ∃x{K(x) & ∀y[K(y) → y = x] & ¬B(x)}. The ambiguity of natural language may foster such confusion, given examples like ‘The present king of France is bald or not’. But according to Russell, puzzles about “nonexistence” can be resolved without special metaphysical theses, given the right views about logical form and natural language.
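Russell's resolution can be made vivid with a toy model. In this sketch (the two-element domain and the made-up extension of ‘bald’ are purely illustrative), a kingless domain makes P false, makes the internally negated proposition false as well, and leaves only the wide-scope negation ¬P true:

```python
# Toy model: nothing in the domain satisfies 'present king of France'.
domain = {'thinker_1', 'thinker_2'}   # made-up individuals
K = set()                             # extension of K: empty (no king)
B = {'thinker_2'}                     # made-up extension of 'bald'

def the_K_is(pred):
    # Russell's analysis: ∃x{K(x) & ∀y[K(y) → y = x] & pred(x)}
    return any(x in K and
               all(y not in K or y == x for y in domain) and
               pred(x)
               for x in domain)

P       = the_K_is(lambda x: x in B)       # 'The king of France is bald'
inner_P = the_K_is(lambda x: x not in B)   # 'The king of France is not bald'

# Both quantificational propositions are false; only the wide-scope
# negation ¬P is true. No nonexistent king is needed anywhere.
assert not P and not inner_P
```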

This invited the thought that other philosophical puzzles might dissolve if we properly understood the logical forms of our claims. Wittgenstein argued, in his influential Tractatus Logico-Philosophicus, that: (i) the very possibility of meaningful sentences, which can be true or false depending on how the world is, requires propositions with structures of the sort Frege and Russell were getting at; (ii) all propositions are logical compounds of—and thus analyzable into—atomic propositions that are inferentially independent of one another; though (iii) even simple natural language sentences may indicate very complex propositions; and (iv) the right analyses would, given a little reflection, reveal all philosophical puzzles as confusions about how language is related to the world. Russell never endorsed (iv). And Wittgenstein later noted that claims like ‘This is red’ and ‘This is yellow’ presented difficulties for his earlier view. If the expressed propositions are unanalyzable, and thus logically independent, each should be compatible with the other. But at least so far, no one has provided a plausible analysis that accounts for the apparent impeccability of ‘This is red, so this is not yellow’. (This raises questions about whether all inferential security is due to logical form.) Though for reasons related to epistemological puzzles, Russell did say that (a) we are directly acquainted with the constituents of those propositions into which every proposition (that we can grasp) can be analyzed; (b) at least typically, we are not directly acquainted with the mind-independent bearers of proper names; and so (c) the things we typically refer to with names are not constituents of basic propositions.

This led Russell to say that natural language names are disguised descriptions. On this view, ‘Hesperus’ is semantically associated with a complex predicate—say, for illustration, a predicate of the form ‘E(x) & S(x)’, suggesting ‘evening star’. In which case, ‘Hesperus is bright’ expresses a proposition of the form ‘∃x{[E(x) & S(x)] & ∀y{[E(y) & S(y)] → y = x} & B(x)}’. It also follows that Hesperus exists iff ∃x[E(x) & S(x)]; and this would be challenged by Kripke (1980); see the entries on rigid designators and names. But by analyzing names as descriptions—quantificational expressions, as opposed to logical constants (like ‘b’) that indicate individuals—Russell offered an attractive account of why the proposition that Hesperus is bright differs from the proposition that Phosphorus is bright. Instead of saying that propositional constituents are Fregean senses, Russell could say that ‘Phosphorus is bright’ expresses a proposition of the form ‘∃x{[M(x) & S(x)] & ∀y{[M(y) & S(y)] → y = x} & B(x)}’; where ‘E(x)’ and ‘M(x)’ indicate different functions, specified (respectively) in terms of evenings and mornings. This leaves room for the discovery that the complex predicates ‘E(x) & S(x)’ and ‘M(x) & S(x)’ both indicate functions that map Venus and nothing else to the truth-value T. The hypothesis was that the propositions expressed with ‘Hesperus is bright’ and ‘Phosphorus is bright’ have different (fundamental) constituents, even though Hesperus is Phosphorus, but not because propositional constituents are “ways of presenting” Bedeutungen. Similarly, the idea was that the propositions expressed with ‘Hesperus is Hesperus’ and ‘Hesperus is Phosphorus’ differ, because only the latter has predicational/unsaturated constituents corresponding to ‘Phosphorus’. Positing unexpected logical forms seemed to have explanatory payoffs.

Questions about names and descriptions are also related to psychological reports, like ‘Mary thinks Venus is bright’, which present puzzles of their own; see the entry on propositional attitude reports. Such reports seem to indicate propositions that are neither atomic nor logical compounds of simpler propositions. For as Frege noted, replacing one name with another name for the same object can apparently affect the truth of a psychological report. If Mary fails to know that Hesperus is Venus, she might think Venus is a planet without thinking Hesperus is a planet; though cp. Soames (1987, 1995, 2002) and see the entry on singular propositions. Any function that has the value T given Venus as argument has the value T given Hesperus as argument. So Frege, Russell, and Wittgenstein all held—in varying ways—that psychological reports are also misleading with respect to the logical forms of the indicated propositions.

6. Regimentation and Communicative Slack

Within the analytic tradition inspired by these philosophers, it became a commonplace that logical form and grammatical form typically diverge, often in dramatic ways. This invited attempts to provide analyses of propositions, and accounts of natural language, with the aim of saying how relatively simple sentences (with subject-predicate structures) could be used to express propositions (with function-argument structures).

The logical positivists explored the idea that the meaning of a sentence is a procedure for determining the truth or falsity of that sentence. From this perspective, studies of linguistic meaning and propositional structure still dovetail, even if natural language employs “conventions” that make it possible to indicate complex propositions with grammatically simple sentences; see the entry on analysis. But to cut short a long and interesting story, there was little success in formulating “semantic rules” that were plausible both as (i) descriptions of how ordinary speakers understand sentences of natural language, and (ii) analyses that revealed logical structure of the sort envisioned. (And until Montague [1970], discussed briefly in the next section, there was no real progress in showing how to systematically associate quantificational constructions of natural language with Fregean logical forms.)

Rudolf Carnap, one of the leading positivists, responded to difficulties facing his earlier views by developing a sophisticated position according to which philosophers could (and should) articulate alternative sets of conventions for associating sentences of a language with propositions. Within each such language, the conventions would determine what follows from what. But one would have to decide, on broadly pragmatic grounds, which interpreted language was best for certain purposes (like conducting scientific inquiry). On this view, questions about “the” logical form of an ordinary sentence are in part questions about which conventions one should adopt. The idea was that “internal” to any logically perspicuous linguistic scheme, there would be an answer to the question of how two sentences are inferentially related. But “external” questions, about which conventions we should adopt, would not be settled by descriptive facts about how we understand languages that we already use.

This was, in many ways, an attractive development of Frege's vision. But it also raised a skeptical worry. Perhaps the structural mismatches between sentences of a natural language and sentences of a Fregean Begriffsschrift are so severe that one cannot formulate general rules for associating the sentences we ordinarily use with propositions. Later theorists would combine this view with the idea that propositions are sentences of a mental language that is relevantly like Frege's invented language and relevantly unlike the spoken languages humans use to communicate; see Fodor (1975, 1978). But given the rise of behaviorism, both in philosophy and psychology, this variant on a medieval idea was initially ignored or ridiculed. (And it does face difficulties; see section 8.)

Willard Van Orman Quine combined behaviorist psychology with a normative conception of logical form similar to Carnap's. The result was an influential view according to which there is no fact of the matter about which proposition a speaker/thinker expresses with a sentence of natural language, because talk of propositions is (at best) a way of talking about how we should regiment our verbal behavior for certain purposes—and in particular, for purposes of scientific inquiry. On this view, claims about logical form are evaluative, and such claims are underdetermined by the totality of facts concerning speakers' dispositions to use language. From this perspective, mismatches between logical and grammatical form are to be expected, and we should not conclude that ordinary speakers have mental representations that are isomorphic with sentences of a Fregean Begriffsschrift.

According to Quine, speakers' behavioral dispositions constrain what can be plausibly said about how to best regiment their language. He also allowed for some general constraints on interpretability that an idealized “field linguist” might impose in coming up with a regimented interpretation scheme. (Donald Davidson developed a similar line of thought in a less behavioristic idiom, speaking in terms of constraints on a “Radical Interpreter,” who seeks “charitable” construals of alien speech.) But unsurprisingly, this left ample room for “slack” with respect to which logical forms should be associated with a given sentential utterance.

Quine also held that decisions about how to make such associations should be made holistically. As he sometimes put it, the “unit of translation” is an entire language, not a particular sentence. On this view, one can translate a sentence S of a natural language NL with a structurally mismatching sentence µ of a formal language FL, even if it seems (locally) implausible that S is used to express the proposition associated with µ, so long as the following condition is met: the association between S and µ is part of a general account of NL and FL that figures in an overall theory—which includes an account of language, logic, and the language-independent world—that is among the best overall theories available. This holistic conception of how to evaluate proposed regimentations of natural language was part and parcel of Quine's criticism of the early positivists' analytic-synthetic distinction, and his more radical suggestion that there is no such distinction.

The suggestion was that even apparently tautologous sentences, like ‘Bachelors are unmarried’ and ‘Caesar died if Brutus killed him’, have empirical content. These may be among the last sentences we would dissent from, faced with recalcitrant experience; we may prefer to say that Caesar didn't really die, or that Brutus didn't really kill him, if the next best alternative is to deny the conditional claim. But for Quine, every meaningful claim is a claim that could turn out to be false—and so a claim we must be prepared, at least in principle, to reject. Correlatively, no sentences are known to be true simply by knowing what they mean (and knowing a priori that sentences with such meanings must be true).

For present purposes, we can abstract away from the details of debates about whether Quine's overall view was plausible. Here, the important point is that claims about logical form were said to be (at least partly) claims about the kind of regimented language we should use, not claims about the propositions actually expressed with sentences of natural language. And one aspect of Quine's view, about the kind of regimented language we should use, turned out to be especially important for subsequent discussions of logical form. For even among those who rejected the behavioristic assumptions that animated Quine's conception of language, it was often held that logical forms are expressions of a first-order predicate calculus.

Frege's Begriffsschrift, recall, was designed to capture the Dedekind-Peano axioms for arithmetic, including the axiom of induction; see the entry on Frege's theorem and foundations for arithmetic. This required quantification into positions occupiable by predicates, as well as positions occupiable by names. Using modern notation, Frege allowed for formulae like ‘(Fa & Fb) → ∃X(Xa & Xb)’ and ‘∀x∀y[x = y ↔ ∀X(Xx ↔ Xy)]’. And he took second-order quantification to be quantification over functions. This is to say, for example, that ‘∃X(Xa & Xb)’ is true iff: there is a function, X, that maps both the individual called ‘a’ and the individual called ‘b’ onto the truth-value T. Frege also took it to be a truth of logic that for any predicate P, there is a function such that for each individual x, that function maps x to T iff x satisfies (or “falls under”) P. In which case, for each predicate, there is the set of all and only the things that satisfy the predicate. The axioms for Frege's logic thus generated Russell's paradox, given predicates like ‘is not a member of itself’. This invited attempts to weaken the axioms, while preserving second-order quantification. But for various reasons, Quine and others advocated a restriction to a first-order fragment of Frege's logic, disallowing quantification into positions occupied by predicates. (Gödel had proved the completeness of first-order predicate calculus, thus providing a purely formal criterion for what followed from what in that language. Quine also held that second-order quantification illicitly treated predicates as names for sets, thereby spoiling Frege's conception of propositions as unified by virtue of having unsaturated predicational constituents that are satisfied by things denoted by names.)
On Quine's view, we should replace ‘(Fa & Fb) → ∃X(Xa & Xb)’ with explicit first-order quantification over sets, as in ‘(Fa & Fb) → ∃s(a∈s & b∈s)’; where ‘∈’ stands for ‘is an element of’, and this second conditional is not a logical truth, but rather a hypothesis (to be evaluated holistically) concerning sets.

The preference for first-order regimentations has come to seem unwarranted, or at least highly tendentious; see Boolos (1998). But it fueled the idea that logical form can diverge wildly from grammatical form. For as students quickly learn, first-order regimentations of natural sentences often turn out to be highly artificial. (And in some cases, such regimentations seem to be unavailable.) This was, however, taken to show that natural languages are far from ideal for purposes of indicating logical structure.

A different strand of thought in analytic philosophy—pressed by Wittgenstein in Philosophical Investigations and developed by others, including Strawson and Austin—also suggested that a single sentence could be used (on different occasions) to express different kinds of propositions. Strawson (1950) argued that pace Russell, a speaker could use an instance of ‘The F is G’ to express a singular proposition about a specific individual: namely, the F in the context at hand. According to Strawson, sentences themselves do not have truth conditions, since sentences (as opposed to speakers) do not express propositions; and speakers can use ‘The boy is tall’ to express a proposition with the contextually relevant boy as a constituent. Donnellan (1966) went on to argue that a speaker could even use an instance of ‘The F is G’ to express a singular proposition about an individual that isn't an F; see the entry on reference. Such considerations, which have received a great deal of attention in recent discussions of context dependence, suggested that relations between natural language sentences and propositions are (at best) very complex and mediated by speakers' intentions. All of which made it seem that such relations are far more tenuous than the pre-Fregean tradition suggested. This bolstered the Quine/Carnap idea that questions about the structure of premises and conclusions are really questions about how we should talk (when trying to describe the world), much as logic itself seems to be more concerned with how we should infer than with how we do infer. From this perspective, the connections between logic and grammar seemed rather shallow.

7. Notation and Restricted Quantification

On the other hand, more recent work on quantifiers suggests that the divergence had been exaggerated, in part because of how Frege's idea of variable-binding was originally implemented. Consider again the proposition that some boy sang, and the proposed logical division into the quantifier and the rest: ∃x[Boy(x) & Sang(x)]; something is both a boy and an individual that sang. This is one way to regiment the English sentence. But one can also offer a logical paraphrase that more closely parallels the grammatical division between ‘some boy’ and ‘sang’: for some individual x such that x is a boy, x sang. One can formalize this paraphrase with restricted quantifiers, which incorporate a restriction on the domain over which the variable in question ranges. For example, ‘∃x:B(x)’ can be an existential quantifier that binds a variable ranging over the boys in the relevant domain, with ‘∃x:B(x)[S(x)]’ being true iff some boy sang. Since ‘∃x:B(x)[S(x)]’ and ‘∃x[B(x) & S(x)]’ are logically equivalent, logic provides no reason for preferring the latter regimentation of the English sentence. And choosing the latter does not show that the proposition expressed with ‘Some boy sang’ has a structure that differs from the grammatical structure of the sentence.

Universal quantifiers can also be restricted, as in ‘∀x:B(x)[S(x)]’, interpreted as follows: for every individual x such that x is a boy, x sang. Restrictors can also be logically complex, as in ‘Some boy from Canada sang’ or ‘Some boy who respects Mary sang’, rendered as ‘∃x:B(x) & F(x, c)[S(x)]’ and ‘∃x:B(x) & R(x, m)[S(x)]’. Given these representations, the inferential difference between ‘some boy sang’ and ‘every boy sang’ lies with the propositional contributions of ‘some’ and ‘every’ after all, and not partly with the contribution of connectives like ‘&’ and ‘→’.

Words like ‘someone’, and the grammatical requirement that ‘every’ be followed by a noun (or noun phrase), reflect the fact that natural language employs restricted quantifiers. Phrases like ‘every boy’ are composed of a determiner and a noun. Correspondingly, one can think of determiners as expressions that can combine with an ordered pair of predicates to form a sentence, much as one can think of transitive verbs as expressions that can combine with an ordered pair of names to form a sentence. And this grammatical analogy, between determiners and transitive verbs, has a semantic correlate.

Since ‘x’ and ‘y’ are variables ranging over individuals, one can say that the function indicated by the transitive verb ‘likes’ yields the value T given the ordered pair ⟨x, y⟩ as argument if and only if x likes y. In this notational scheme, ‘y’ corresponds to the direct object (or internal argument), which combines with the verb to form a phrase; ‘x’ corresponds to the grammatical subject (or external argument) of the verb. If we think about ‘every boy sang’ analogously, ‘boy’ is the internal argument of ‘every’, since ‘every boy’ is a phrase. By contrast, ‘boy’ and ‘sang’ do not form a phrase in ‘every boy sang’. So let us introduce ‘X’ and ‘Y’ as second-order variables ranging over functions, from individuals to truth values, stipulating that the extension of such a function is the set of things that the function maps onto the truth value T. Then one can say that the function indicated by ‘every’ yields the value T given the ordered pair ⟨X, Y⟩ as argument iff the extension of X includes the extension of Y. Similarly, one can say that the function indicated by ‘some’ maps the ordered pair ⟨X, Y⟩ onto T iff the extension of X intersects with the extension of Y.

Just as we can describe ‘likes’ as a predicate satisfied by ordered pairs ⟨x, y⟩ such that x likes y, so we can think about ‘every’ as a predicate satisfied by ordered pairs ⟨X, Y⟩ such that the extension of X includes the extension of Y. (This is compatible with thinking about ‘every boy’ as a restricted quantifier that combines with a predicate to form a sentence that is true iff every boy satisfies that predicate.) One virtue of this notational scheme is that it lets us represent relations between predicates that cannot be captured with ‘∀’, ‘∃’, and the sentential connectives; see Rescher (1962), Wiggins (1980). For example, most boys sang iff the boys who sang outnumber the boys who did not sing. So we can say that ‘most’ indicates a function that maps ⟨X, Y⟩ to T iff the number of things that both Y and X map to T exceeds the number of things that Y but not X maps to T.
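The relational treatment of determiners can be sketched computationally. In this Python toy model (the extensions are invented for illustration), each determiner is a function from a pair of extensions ⟨X, Y⟩—scope and restrictor, in the order used above—to a truth value:

```python
def every(X, Y):
    # T iff the extension of X includes the extension of Y
    return Y <= X

def some(X, Y):
    # T iff the extensions of X and Y intersect
    return bool(X & Y)

def most(X, Y):
    # T iff the Ys that are X outnumber the Ys that are not X --
    # a relation not definable with just the first-order quantifiers
    # and sentential connectives
    return len(Y & X) > len(Y - X)

# Illustrative extensions: three boys, two of whom sang.
boy = {"alex", "ben", "carol"}
sang = {"alex", "ben"}

assert some(sang, boy)        # 'some boy sang' is true in this model
assert most(sang, boy)        # 'most boys sang' is true: 2 > 1
assert not every(sang, boy)   # 'every boy sang' is false: carol did not sing
```

The counting clause for `most` makes vivid why ‘most’ outruns the resources of ‘∀’, ‘∃’, and the connectives: it compares cardinalities rather than merely testing inclusion or intersection.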

Using restricted quantifiers, and thinking about determiners as devices for indicating relations between functions, also suggests an alternative to Russell's treatment of ‘the’. The formula ‘∃x{B(x) & ∀y[B(y) → x = y] & S(x)}’ can be rewritten as ‘∃x:B(x)[S(x)] & |B| = 1’, interpreted as follows: for some individual x such that x is a boy, x sang, and the number of (relevant) boys is exactly one. On this view, ‘the boy’ still does not correspond to a constituent of the formalism; nor does ‘the’. But one can depart farther from Russell's notation, while emphasizing his idea that ‘the’ is relevantly like ‘some’ and ‘every’. For one can analyze ‘the boy sang’ as ‘!x:Boy(x)[Sang(x)]’, specifying the propositional contribution of ‘!’—on a par with ‘∃’ and ‘∀’—as follows:

!x:Y(x)[X(x)] = T iff the extensions of X and Y intersect & |Y| = 1.

This way of encoding Russell's theory preserves his central claim. While there may be a certain boy that a speaker refers to in saying ‘The boy sang’, that boy is not a constituent of the quantificational proposition expressed with ‘!x:Boy(x)[Sang(x)]’; see Neale (1990) for discussion. But far from showing that the logical form of ‘The boy sang’ diverges dramatically from its grammatical form, the restricted quantifier notation suggests that the logical form closely parallels the grammatical form. For ‘the boy’ and ‘the’ do correspond to constituents of ‘!x:B(x)[S(x)]’, at least if we allow for logical forms that represent quantificational propositions in terms of second-order relations; see Montague (1970).

It is worth noting, briefly, an implication of this point for the inference ‘The boy sang, so some boy sang’. If the logical form of ‘The boy sang’ is ‘∃x:B(x)[S(x)] & |B|=1’, then the inference is an instance of the schema ‘A & B, so A’. But if the logical form of ‘The boy sang’ is simply ‘!x:B(x)[S(x)]’, the premise and conclusion have the same form, differing only by substitution of ‘!’ for ‘∃’. In which case, the impeccability of the inference depends on the specific contributions of ‘the/!’ and ‘some/∃’. Only when these contributions are “spelled out,” perhaps in terms of set-intersection, would the validity of the inference be manifest; see King (2002). So even if grammar and logic do not diverge in this case, one might say that grammatical structure does not reveal the logical structure. From this perspective, further analysis of ‘the’ is required. Those who are skeptical of an analytic/synthetic distinction can say that it remains more a decision than a discovery to say that ‘Some boy sang’ follows from ‘The boy sang’. In general, and especially with regard to aspects of propositional form indicated with individual words, issues about logical form are connected with issues about the analytic-synthetic distinction.
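Once the contributions of ‘the/!’ and ‘some/∃’ are spelled out set-theoretically, the impeccability of ‘The boy sang, so some boy sang’ can be checked by brute force over small models. A Python sketch (the domain size and helper names are illustrative assumptions):

```python
from itertools import combinations

def subsets(s):
    """All subsets of s -- the candidate extensions of a predicate."""
    s = list(s)
    return [set(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def some(X, Y):
    # the extensions of X and Y intersect
    return bool(X & Y)

def the(X, Y):
    # !x:Y(x)[X(x)]: the extensions intersect and the restrictor is a singleton
    return bool(X & Y) and len(Y) == 1

# In every model over a three-element domain, 'the boy sang' entails
# 'some boy sang': whenever the premise is true, so is the conclusion.
domain = {1, 2, 3}
for B in subsets(domain):          # candidate extensions of 'boy'
    for S in subsets(domain):      # candidate extensions of 'sang'
        if the(S, B):
            assert some(S, B)
```

This is the point attributed to King above in miniature: with the set-theoretic clauses in hand, the validity of the inference becomes manifest, since the truth condition for ‘!’ simply conjoins the truth condition for ‘∃’ with a cardinality claim.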

8. Transformational Grammar

Even given restricted quantifiers (and acceptance of second-order logical forms), the subject/predicate structure of ‘Juliet / likes every doctor’ diverges from the corresponding formula below.

∀y:Doctor(y)[Likes(Juliet, y)].

We can rewrite ‘Likes(Juliet, y)’ as ‘[Likes(y)](Juliet)’, to reflect the fact that ‘likes’ combines with a direct object to form a phrase, which in turn combines with a subject. But this does not affect the main point; ‘every’ seems to be a grammatical constituent of the verb phrase ‘likes every doctor’, and yet the main quantifier of the expressed proposition. In natural language, ‘likes’ and ‘every doctor’ form a phrase. But with respect to logical form, ‘likes’ evidently combines with ‘Juliet’ and a variable to form a complex predicate that is in turn an external argument of the higher-order predicate ‘every’. Similar remarks apply to ‘Some boy likes every doctor’ and ‘[∃x:Boy(x)][∀y:Doctor(y)]{Likes(x, y)}’. So it seems that mismatches remain in the very places that troubled medieval logicians—viz., quantificational direct objects and other examples of complex predicates with quantificational constituents.

Montague (1970, 1974) showed that these mismatches do not preclude systematic connections of natural language sentences with the corresponding propositional structures. Abstracting from the technical details, one can specify an algorithm that pairs each natural language sentence that contains one or more quantificational expressions like ‘every doctor’ with one or more Fregean logical forms. This was a significant advance. Together with subsequent developments, Montague's work showed that Frege's logic was compatible with the idea that quantificational constructions in natural language have a systematic semantics. Indeed, one can use Frege's formal apparatus to study such constructions. Montague himself maintained that the syntax of natural language was misleading for purposes of (what he took to be) real semantics. On this view, the study of valid inference still suggests that natural language grammar disguises the structure of human thought. But in thinking about the relation of logic to grammar, one should not assume a naive conception of the latter.

For example, the grammatical form of a sentence need not be determined by the linear order of its words. Using brackets to disambiguate, we can distinguish the sentence ‘Mary [saw [the [boy [with binoculars]]]]’—whose direct object is ‘the boy with binoculars’—from the homophonous sentence ‘Mary [[saw [the boy]] [with binoculars]]’, in which ‘saw the boy’ is modified by an adverbial phrase. The first implies that the boy had binoculars, while the second implies that Mary used binoculars to see the boy. This distinction may not be audibly marked. Nonetheless, there is a difference between modifying a noun (like ‘boy’) with a prepositional phrase and modifying a verb phrase (‘saw the boy’). More generally, grammatical structure need not be obvious. Just as it may take work to discover the kind(s) of structure that propositions exhibit, so it may take work to discover the kind(s) of structure that sentences exhibit. And many studies of natural language suggest a rich conception of grammatical form that diverges from traditional views; see especially Chomsky (1957, 1965, 1981, 1986, 1995). So we need to ask how logical forms are related to actual grammatical forms, which linguists try to discover, since these may differ importantly from any hypothesized grammatical forms that may be suggested by casual reflection on spoken language. Appearances may be misleading with respect to both grammatical and logical form, leaving room for the possibility that these notions of structure are not so different after all.

A leading idea of modern linguistics is that at least some grammatical structures are transformations of others. Put another way, linguistic expressions often appear to be displaced from the positions canonically associated with certain grammatical relations that the expressions exhibit. For example, the word ‘who’ in (17) is apparently associated with the internal (direct object) argument position of the verb ‘saw’.

(17)Mary wondered who John saw

Correspondingly, (17) can be glossed as ‘Mary wondered which person is such that John saw that person’. This invites the hypothesis that (17) reflects a transformation of the “Deep Structure” (17D) into the “Surface Structure” (17S),

(17D){Mary [wondered {John [saw who]}]}
(17S){Mary [wondered [whoi {John [saw ( _ )i ]}]]}

with indices indicating a grammatical relation between the indexed positions. In (17D), the embedded clause has the same form as ‘John saw Bill’. But in (17S), ‘who’ has been displaced from the indexed argument position. Similar remarks apply to the question ‘Who did John see’ and other question-words like ‘why’, ‘what’, ‘when’, and ‘how’.

One might also explain the synonymy of (18) and (19) by positing a common deep structure, (18D).

(18)John seems to like Mary
(19)It seems John likes Mary 
(18D)[Seems{John [likes Mary]}]
(18S){Johni [seems { ( _ )i [to like Mary]}]}

If every English sentence needs a grammatical subject, (18D) must be modified: either by displacing ‘John’, as in (18S); or by inserting a pleonastic subject, as in (19). Note that in (19), ‘It’ does not indicate an argument; compare ‘There’ in ‘There is something in the garden’. Appeal to displacement also lets one distinguish the superficially parallel sentences (20) and (21).

(20)John is easy to please
(21)John is eager to please

If (20) is true, John is easily pleased. In which case, it is easy (for someone) to please John; where ‘it’ is pleonastic. But if (21) is true, John is eager that he please someone or other. This asymmetry is effaced by representations like ‘Easy-to-please(John)’ and ‘Eager-to-please(John)’. The contrast is made manifest, however, with (20S) and (21S);

(20S){Johni [is easy { e [to please ( _ )i ]}]}
(21S){Johni [is eager { ( _ )i [to please e ]}]}

where ‘e’ indicates an unpronounced argument position. It may be that in (21S), which does not mean that it is eager for John to please someone, ‘John’ is grammatically linked but not actually displaced from the coindexed position. But whatever the details, the “surface subject” of a sentence can be the object of a verb embedded within the main predicate, as in (20S). Of course, such hypotheses about grammatical structure require defense. But Chomsky and others have long argued that such hypotheses are needed to account for various facts concerning human linguistic capacities; see, e.g., Berwick et al. (2011). As an illustration of the kind of data that is relevant, note that (22–24) are perfectly fine expressions of English, while (25) is not.

(22)The boy who sang was happy
(23)Was the boy who sang happy
(24)The boy who was happy sang
(25)*Was the boy who happy sang

This suggests that the auxiliary verb ‘was’ can be displaced from some positions but not others. That is, while (22S) is a permissible transformation of (22D), (24S) is not a permissible transformation of (24D).

(22D){[The [boy [who sang]]] [was happy]}
(22S)Wasi {[the [boy [who sang]]] [ ( _ )i happy]}
(24D){[The [boy [who [was happy]]]] sang}
(24S)*Wasi {[the [boy [who [ ( _ )i happy]]]] sang}

The ill-formedness of (25) is striking, since one can sensibly ask whether or not the boy who was happy sang. One can also ask whether or not (26) is true. But (27) is not the yes/no question corresponding to (26).

(26)The boy who was lost kept crying
(27)Was the boy who lost kept crying

Rather, (27) is the yes/no question corresponding to ‘The boy who lost was kept crying’, which has an unexpected meaning. So we want some account of why (27) cannot have the interpretation corresponding to (26). But the “negative fact” concerning (27) is precisely what one would expect if ‘was’ cannot be displaced from its position in (26).

*Wasi {[the [boy [who [( _ )i lost]]]] [kept crying]}

By contrast, if we merely specify an algorithm that associates (27) with its actual meaning—or if we merely hypothesize that (27) is the English translation of a certain mental sentence—we have not yet explained why (27) cannot also be used to ask whether or not (26) is true. Explanations of such facts appeal to nonobvious grammatical structure, and constraints on natural language transformations. (For example, an auxiliary verb in a relative clause cannot be “fronted”; though of course, theorists try to find deeper explanations for such constraints.)

The idea was that a sentence has both a deep structure (DS), which reflects semantically relevant relations between verbs and their arguments, and a surface structure (SS) that may include displaced (or pleonastic) elements. In some cases, pronunciation might depend on further transformations of SS, resulting in a distinct “phonological form” (PF). Linguists posited various constraints on these levels of grammatical structure, and the transformations that relate them. But as the theory was elaborated and refined under empirical pressure, various facts that apparently called for explanation in these terms still went unexplained. This suggested another level of grammatical structure, perhaps obtained by a different kind of transformation on SS. The hypothesized level was called ‘LF’ (intimating ‘logical form’); and the hypothesized transformation—called quantifier raising because it targeted the kinds of expressions that indicate (restricted) quantifiers—mapped structures like (28S) onto structures like (28L).

(28S){Juliet [likes [every doctor]]}
(28L){[every doctor]i {Juliet [likes ( _ )i ]}}

Clearly, (28L) does not reflect the pronounced word order in English. But the idea was that PF determines pronunciation, while LF was said to be the level at which the scope of a natural language quantifier is determined; see May (1985). If we think about ‘every’ as a kind of second-order transitive predicate, which can combine with two predicates like ‘doctor’ and ‘Juliet likes ( _ )i’ to form a complete sentence, we should expect that at some level of analysis, the sentence ‘Juliet likes every doctor’ has the structure indicated in (28L). And mapping (28L) to the logical form ‘[∀x:Doctor(x)]{Likes(Juliet, x)}’ is trivial. Similarly, if the surface structure (29S) can be mapped onto (29L) or (29L'),

(29S){[some boy] [likes [every doctor]]}
(29L){[some boy]i {[every doctor]j { ( _ )i [likes ( _ )j ]}}}
(29L'){[every doctor]j {[some boy]i { ( _ )i [likes ( _ )j ]}}}

then (29S) can be mapped onto the logical forms ‘[∃x:Boy(x)][∀y:Doctor(y)]{Likes(x, y)}’ and ‘[∀y:Doctor(y)][∃x:Boy(x)]{Likes(x, y)}’. This assimilates quantifier scope ambiguity to the structural ambiguity of examples like ‘Juliet saw the boy with binoculars’. More generally, many apparent examples of grammar/logic mismatches were rediagnosed as mismatches between different aspects of grammatical structure—between those aspects that determine pronunciation, and those that determine interpretation. In one sense, this is fully in keeping with the idea that in natural language, “surface appearances” are often misleading with regard to propositional structure. But it also makes room for the idea that grammatical structure and logical structure converge, in ways that can be discovered through investigation, once we move beyond traditional subject-predicate conceptions of structure with regard to both logic and grammar.
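That the two scope orderings genuinely come apart can be made vivid with a small model. In this Python sketch (the model is invented for illustration), each boy likes a different doctor, so the reading with ‘every doctor’ taking wide scope is true while the reading with ‘some boy’ taking wide scope is false:

```python
# Toy model: two boys, two doctors, and a 'likes' relation in which
# each boy likes exactly one doctor (a different one in each case).
boys = {"b1", "b2"}
doctors = {"d1", "d2"}
likes = {("b1", "d1"), ("b2", "d2")}

# (29L):  [some boy]i [every doctor]j { i likes j }
#   -- some one boy likes all the doctors
wide_some = any(all((b, d) in likes for d in doctors) for b in boys)

# (29L'): [every doctor]j [some boy]i { i likes j }
#   -- every doctor is liked by at least one boy (not necessarily the same one)
wide_every = all(any((b, d) in likes for b in boys) for d in doctors)

assert not wide_some   # no single boy likes both doctors
assert wide_every      # yet each doctor has an admirer
```

The nested `any`/`all` calls directly mirror the nesting of the quantifiers at LF, which is why the two bracketings yield different truth values in the same model.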

There is independent evidence for “covert” transformations—displacement of expressions from their audible positions, as in (28L); see Huang (1995), Hornstein (1995). Consider, for example, the French translation of ‘Who did John see’: Jean a vu qui. If we assume that qui (‘who’) is displaced at LF, then we can explain why the question-word is understood in both French and English like a quantifier binding a variable: which person x is such that John saw x? Similarly, example (30) from Chinese is transliterated as in (31).

(30)Zhangsan zhidao Lisi mai-te sheme
(31)Zhangsan know Lisi bought what

But (30) is ambiguous, between the interrogative (31a) and the complex declarative (31b).

(31a)Which thing is such that Zhangsan knows Lisi bought it
(31b)Zhangsan knows which thing (is such that) Lisi bought (it)

This suggests covert displacement of the quantificational question-word in Chinese; see Huang (1982, 1995). Chomsky (1981) also argued that the constraints on such displacement can help explain contrasts like the one illustrated with (32) and (33).

(32)Who said he has the best smile 
(33)Who did he say has the best smile

In (32), the pronoun ‘he’ can have a bound-variable reading: which person x is such that x said that x has the best smile. This suggests that the following grammatical structure is possible: Whoi {[(  )i said [hei has the best smile]]}. But (33) cannot be used to ask this question, suggesting that some linguistic constraint rules out the following structure:

*Whoi [did {[hei say [(  )i has the best smile]]].

And there cannot be constraints on transformations without transformations. So if English overtly displaces question-words that are covertly displaced in other languages, we should not be surprised if English covertly displaces other quantificational expressions like ‘every doctor’. Likewise, (34) has the reading indicated in (34a) but not the reading indicated in (34b).

(34)It is false that Juliet likes every doctor
(34a)¬∀x:Doctor(x)[Likes(Juliet, x)]
(34b)∀x:Doctor(x)¬[Likes(Juliet, x)]

This suggests that ‘every doctor’ gets displaced, but only so far. Similarly, (13) cannot mean that every doctor is such that no patient who saw that doctor is healthy.

(13)No patient who saw every doctor is healthy

As we have already seen, English seems to abhor fronting certain elements from within an embedded relative clause. This invites the hypothesis that quantifier raising is subject to a similar constraint, and hence, that there is quantifier-raising in English. This hypothesis is controversial; see, e.g., Jacobson (1999). But many linguists (following Chomsky [1995, 2000]) would now posit only two levels of grammatical structure, corresponding to PF and LF—the thought being that constraints on DS and SS can be eschewed in favor of a simpler theory that only posits constraints on how expressions can be combined in the course of constructing complex expressions that can be pronounced and interpreted. If this development of earlier theories proves correct, then the only semantically relevant level of grammatical structure often reflects covert displacement of audible expressions; see, e.g., Hornstein (1995). In any case, there is a large body of work suggesting that many logical properties of quantifiers, names, and pronouns are reflected in properties of LF.

For example, if (35) is true, it follows that some doctor treated some doctor; whereas (36) does not have this consequence:

(35)Every boy saw the doctor who treated himself
(36)Every boy saw the doctor who treated him

The truth conditions of (35–36) seem to be as indicated in (35a) and (36a).

(35a)[∀x:Boy(x)][!y:Doctor(y) & Treated(y,y)]{Saw(x, y)}
(36a)[∀x:Boy(x)][!y:Doctor(y) & Treated(y,x)]{Saw(x, y)}

This suggests that ‘himself’ is behaving like a variable bound by ‘the doctor’, while ‘every boy’ can bind ‘him’. And there are independent grammatical reasons for saying that ‘himself’ must be linked to ‘the doctor’, while ‘him’ must not be so linked. Note that in ‘Pat thinks Chris treated himself/him’, the antecedent of ‘himself’ must be the subject of ‘treated’, while the antecedent of ‘him’ must not be.

We still need to enforce the conceptual distinction between LF and the traditional notion of logical form. There is no guarantee that structural features of natural language sentences will mirror the logical features of propositions; cp. Stanley (2000), King (2007). But this leaves room for the empirical hypothesis that LF reflects at least a great deal of propositional structure; see Harman (1972), Higginbotham (1986), Segal (1989), Larson and Ludlow (1993), and the essay on structured propositions. Moreover, even if the LF of a sentence S underdetermines the logical form of the proposition a speaker expresses with S (on a given occasion of use), the LF may provide a “scaffolding” that can be elaborated in particular contexts, with little or no mismatch between grammatical and propositional architecture. If some such view is correct, it might avoid certain (unpleasant) questions prompted by earlier Fregean views: how can a sentence indicate a proposition with a different structure; and if grammar is deeply misleading, why think that our intuitions concerning impeccability provide reliable evidence about which propositions follow from which? These are, however, issues that remain unsettled.

9. Semantic Structure and Events

If propositions are the “things” that really have logical form, and sentences of English are not themselves propositions, then sentences of English “have” logical forms only by association with propositions. But if the meaning of a sentence is some proposition—or perhaps a function from contexts to propositions—then one might say that the logical form “of” a sentence is its semantic structure (i.e., the structure of that sentence's meaning). Alternatively, one might suspect that in the end, talk of propositions is just convenient shorthand for talking about the semantic properties of sentences: perhaps sentences of a Begriffsschrift, or sentences of mentalese, or sentences of natural languages (abstracting away from their logically/semantically irrelevant properties). In any case, the notion of logical form has played a significant role in recent work on theories of meaning for natural languages. So an introductory discussion of logical form would not be complete without some hint of why such work is relevant, especially since attending to details of natural languages (as opposed to languages invented to study the foundations of arithmetic) led to renewed discussion of how to represent propositions that involve relations.

Prima facie, ‘Every old patient respects some doctor’ and ‘Some young politician likes every liar’ exhibit common modes of linguistic combination. So a natural hypothesis is that the meaning of each sentence is fixed by these modes of combination, given the relevant word meanings. It may be hard to see how this hypothesis could be true if there are widespread mismatches between logical and grammatical form. But it is also hard to see how the hypothesis could be false. Children, who have finite cognitive resources, typically acquire the capacity to understand the endlessly many expressions of the languages spoken around them. A great deal of recent work has focussed on these issues, concerning the connections between logical form and the senses in which natural languages are semantically compositional.

It was implicit in Frege that each of the endlessly many sentences of an ideal language would have a compositionally determined truth-condition. Frege did not actually specify an algorithm that would associate each sentence of his Begriffsschrift with its truth-condition. But Tarski (1933) showed how to do this for the first-order predicate calculus, focussing on interesting cases of multiple quantification like ‘∀x[Number(x) → ∃y[SuccessorOf(y, x) & ∀z[SuccessorOf(z, x) → (z = y)]]]’. This made it possible to capture, with precision, the idea that an inference is valid in the predicate calculus iff: every interpretation that makes the premises true also makes the conclusion true, holding fixed the interpretations of logical elements like ‘if’ and ‘every’. Davidson (1967a) conjectured that one could do for English what Tarski did for the predicate calculus; and Montague, similarly inspired by Tarski, showed how one could start dealing with predicates that have quantificational constituents. Still, many apparent objections to the conjecture remained. As noted at the end of section four, sentences like ‘Pat thinks that Hesperus is Phosphorus’ present difficulties; though Davidson (1968) offered an influential suggestion. Davidson's (1967b) proposal concerning examples like (37–40) also proved enormously fruitful.

(37)Juliet kissed Romeo quickly at midnight.
(38)Juliet kissed Romeo quickly.
(39)Juliet kissed Romeo at midnight.
(40)Juliet kissed Romeo.

If (37) is true, so are (38–40); and if (38) or (39) is true, so is (40). The inferences seem impeccable. But the function-argument structures are not obvious. If we represent ‘kissed quickly at midnight’ as an unstructured predicate that takes two arguments, like ‘kissed’ or ‘kicked’, we will represent the inference from (37) to (40) as having the form: K*(x, y); so K(x, y). But this form is exemplified by the bad inference ‘Juliet kicked Romeo; so Juliet kissed Romeo’. Put another way, if ‘kissed quickly at midnight’ is a logically unstructured binary predicate, then the following conditional is a nonlogical assumption: if Juliet kissed Romeo in a certain manner at a certain time, then Juliet kissed Romeo. But this conditional seems like a tautology, not an assumption that introduces any epistemic risk. Davidson concluded that the surface appearances of sentences like (37–40) mask relevant semantic structure. In particular, he proposed that such sentences are understood in terms of quantification over events.

According to Davidson, who echoed Ramsey (1927), the meaning of (40) is reflected in the paraphrase ‘There was a kissing of Romeo by Juliet’. One can formalize this proposal in various ways: ∃e[KissingOf(e, Romeo) & KissingBy(e, Juliet)]; or ∃e[Kiss(e, Juliet, Romeo)], with the verb ‘kiss’ indicating a function that takes three arguments; or as in (40a),

(40a)∃e[Agent(e, Juliet) & Kissing(e) & Patient(e, Romeo)]

with Juliet and Romeo explicitly represented as players of certain roles in an event. But given any such representation, adverbs like ‘quickly’ and ‘at midnight’ can be analyzed as additional predicates of events, as shown in (37a–39a).

(37a)∃e[Agent(e, Juliet) & Kissing(e) & Patient(e, Romeo) & Quick(e) & At-midnight(e)]
(38a)∃e[Agent(e, Juliet) & Kissing(e) & Patient(e, Romeo) & Quick(e)]
(39a)∃e[Agent(e, Juliet) & Kissing(e) & Patient(e, Romeo) & At-midnight(e)]

If this is correct, then the inference from (37) to (40) is an instance of the following valid form: ∃e[...e... & Q(e) & A(e)]; hence, ∃e[...e...]. The other impeccable inferences involving (37–40) can likewise be viewed as instances of conjunction reduction. If the grammatical form of (40) is simply ‘{Juliet [kissed Romeo]}’, then the mapping from grammatical to logical form is not transparent; and natural language is misleading, in that no word corresponds to the event quantifier. But this does not posit a significant structural mismatch between grammatical and logical form. On the contrary, each word in (40) corresponds to a conjunct in (40a). This suggests a strategy for thinking about how the meaning of a sentence like (40) might be composed from the meanings of the constituent words. A growing body of literature, in philosophy and linguistics, suggests that Davidson's proposal captures an important feature of natural language semantics, and that “event analyses” provide a useful framework for future discussions of logical form.
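The event analysis can be prototyped by treating events as records and verbs and adverbs alike as predicates of them. In this Python sketch (the event record and its field names are illustrative assumptions, not Davidson's own formalism), the inference from (37) to (40) holds simply because the conjuncts of (40a) are a subset of the conjuncts of (37a):

```python
# Toy event record: one kissing, with its agent, patient, and adverbial
# properties represented as fields.
events = [{"kind": "kissing", "agent": "Juliet", "patient": "Romeo",
           "quick": True, "at_midnight": True}]

def kissed_quickly_at_midnight(x, y):
    # (37a): Ee[Agent(e,x) & Kissing(e) & Patient(e,y) & Quick(e) & At-midnight(e)]
    return any(e["kind"] == "kissing" and e["agent"] == x and e["patient"] == y
               and e["quick"] and e["at_midnight"] for e in events)

def kissed(x, y):
    # (40a): the same existential claim with the adverbial conjuncts dropped
    return any(e["kind"] == "kissing" and e["agent"] == x and e["patient"] == y
               for e in events)

# Conjunction reduction: whenever (37) holds of an event, (40) must too,
# since (40a) merely asks less of the same event.
assert kissed_quickly_at_midnight("Juliet", "Romeo")
assert kissed("Juliet", "Romeo")
```

Dropping a conjunct can never turn a satisfied existential claim into an unsatisfied one, which is why the inference carries no epistemic risk on this analysis.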

In one sense, it is an ancient idea that action reports like (40) represent individuals as participating in events; see Gillon's (2007) discussion of Panini's grammar of Sanskrit. But if (40) can be glossed as ‘Juliet did some kissing, and Romeo was thereby kissed’, perhaps the ancient idea can be deployed in developing Leibniz' suggestion that relational sentences like (40) somehow contain simpler active-voice and passive-voice sentences; cp. Kratzer (1996). And perhaps appeals to quantifier raising can help in defending the idea that ‘Juliet kissed some/the/every boy’ is, after all, a sentence that exhibits Subject-copula-Predicate form: ‘[some/the/every boy]i is P’, with ‘P’ as a complex predicate akin to ‘[some event]e was both a kissing done by Juliet and one in which hei was kissed’.

With this in mind, let's return to the idea that each complex expression of natural language has semantic properties that are determined by (i) the semantic properties of its constituents, and (ii) the ways in which these constituents are grammatically arranged. If this is correct, then following Davidson, one might say that the logical forms of expressions (of some natural language) just are the structures that determine the corresponding meanings given the relevant word meanings; see Lepore and Ludwig (2002). In which case, the phenomenon of valid inference may be largely a by-product of semantic compositionality. If principles governing the meanings of (37–40) have the consequence that (40) is true iff an existential claim like (40a) is true, perhaps this is illustrative of the general case. Given a sentence of some natural language NL, the task of specifying its logical form may be inseparable from the task of providing a compositional specification of what the sentences of NL mean.
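The idea of a compositional specification can be illustrated with a miniature Tarski-style satisfaction definition of the sort alluded to above. In this Python sketch (formulas as nested tuples over a toy model; the encoding is an illustrative assumption, not the article's own formalism), the truth value of each formula is computed recursively from the truth values of its parts:

```python
def satisfies(model, formula, g):
    """Evaluate formula in model, relative to variable assignment g."""
    op = formula[0]
    if op == "pred":                       # ('pred', name, var, ...)
        _, name, *vars_ = formula
        return tuple(g[v] for v in vars_) in model["interp"][name]
    if op == "not":
        return not satisfies(model, formula[1], g)
    if op == "and":
        return satisfies(model, formula[1], g) and satisfies(model, formula[2], g)
    if op == "exists":                     # ('exists', var, body)
        _, v, body = formula
        return any(satisfies(model, body, {**g, v: d}) for d in model["domain"])
    if op == "forall":                     # ('forall', var, body)
        _, v, body = formula
        return all(satisfies(model, body, {**g, v: d}) for d in model["domain"])
    raise ValueError(f"unknown operator: {op}")

# Toy model: three "numbers", where 3 lacks a successor.
model = {"domain": {1, 2, 3},
         "interp": {"Number": {(1,), (2,), (3,)},
                    "Succ": {(2, 1), (3, 2)}}}

# Ax[Number(x) -> Ey Succ(y, x)], written with 'not' and 'and',
# fails here because 3 has no successor in the model.
f = ("forall", "x",
     ("not", ("and", ("pred", "Number", "x"),
              ("not", ("exists", "y", ("pred", "Succ", "y", "x"))))))
assert satisfies(model, f, {}) is False
```

Each clause of `satisfies` mirrors one mode of combination, so the evaluator handles endlessly many formulas with finite means; this is the finite-model shadow of the compositionality claim in the text.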

10. Further Questions

At this point, many issues become relevant to further discussions of logical form. Most obviously, there are questions concerning particular examples. Given just about any sentence of natural language, one can ask interesting questions (that remain unsettled) about its logical form. There are also very abstract questions about the relation of semantics to logic. Should we follow Davidson and Montague, among others, in characterizing theories of meaning for natural languages as theories of truth (that perhaps satisfy certain conditions on learnability)? Is an algorithm that correctly associates sentences with truth-conditions (relative to contexts) necessary and/or sufficient for being an adequate theory of meaning? What should we say about the paradoxes apparently engendered by sentences like ‘This sentence is false’? If we allow for second-order logical forms, how should we understand second-order quantification, given Russell's Paradox? Are claims about the “semantic structure” of a sentence fundamentally descriptive claims about speakers (or their communities, or their languages)? Or is there an important sense in which claims about semantic structure are normative claims about how we should use language? Are facts about the acquisition of language germane to hypotheses about logical form? And of course, the history of the subject reveals that the answers to the central questions are by no means obvious: what is logical structure, what is grammatical structure, and how are they related? Or put another way, what kinds of structures do propositions and sentences exhibit, and how do thinkers/speakers relate them?

Bibliography

Cited Works

  • Beaney, M. (ed.), 1997, The Frege Reader, Oxford: Blackwell.
  • Berwick, R., et al., 2011, “Poverty of the Stimulus Revisited”, Cognitive Science, 35: 1207–42.
  • Boolos, G., 1998,Logic, Logic, and Logic, Cambridge, MA:Harvard University Press.
  • Carnap, R., 1950, “Empiricism, Semantics, and Ontology”, reprinted in R. Carnap, Meaning and Necessity, second edition, Chicago: University of Chicago Press, 1956.
  • Cartwright, R., 1962, “Propositions”, in R. J. Butler (ed.), Analytical Philosophy, 1st series, Oxford: Basil Blackwell, 1962; reprinted with addenda in Richard Cartwright, Philosophical Essays, Cambridge, MA: MIT Press, 1987.
  • Chomsky, N., 1957, Syntactic Structures, The Hague: Mouton.
  • –––, 1965, Aspects of the Theory of Syntax, Cambridge, MA: MIT Press.
  • –––, 1981, Lectures on Government and Binding, Dordrecht: Foris.
  • –––, 1986, Knowledge of Language, New York: Praeger.
  • –––, 1995, The Minimalist Program, Cambridge, MA: MIT Press.
  • Davidson, D., 1967a, “Truth and Meaning”, Synthese, 17: 304–23.
  • –––, 1967b, “The Logical Form of Action Sentences”, in N. Rescher (ed.), The Logic of Decision and Action, Pittsburgh: University of Pittsburgh Press.
  • –––, 1968, “On Saying That”, Synthese, 19: 130–46.
  • –––, 1980, Essays on Actions and Events, Oxford: Oxford University Press.
  • –––, 1984, Inquiries into Truth and Interpretation, Oxford: Oxford University Press.
  • Donnellan, K., 1966, “Reference and Definite Descriptions”, Philosophical Review, 75: 281–304.
  • Fodor, J., 1978, “Propositional Attitudes”, The Monist, 61: 501–23.
  • Frege, G., 1879, Begriffsschrift, reprinted in Beaney 1997.
  • –––, 1884, Die Grundlagen der Arithmetik, Breslau: Wilhelm Koebner; English translation, The Foundations of Arithmetic, J. L. Austin (trans.), Oxford: Basil Blackwell, 1974.
  • –––, 1891, “Function and Concept”, reprinted in Beaney 1997.
  • –––, 1892, “On Sinn and Bedeutung”, reprinted in Beaney 1997.
  • Gillon, B., 2007, “Pāṇini’s Aṣṭādhyāyī and Linguistic Theory”, Journal of Indian Philosophy, 35: 445–68.
  • Harman, G., 1972, “Logical Form”, Foundations of Language, 9: 38–65.
  • –––, 1973, Thought, Princeton: Princeton University Press.
  • Higginbotham, J., 1986, “Linguistic Theory and Davidson's Program in Semantics”, in E. Lepore (ed.), Truth and Interpretation, Oxford: Blackwell, pp. 29–48.
  • Hornstein, N., 1995, Logical Form: From GB to Minimalism, Oxford: Blackwell.
  • Huang, J., 1995, “Logical Form”, in G. Webelhuth (ed.), Government and Binding Theory and the Minimalist Program: Principles and Parameters in Syntactic Theory, Oxford: Blackwell.
  • Jacobson, P., 1999, “Variable Free Semantics”, Linguistics and Philosophy, 22: 117–84.
  • King, J., 2002, “Two Sorts of Claims about Logical Form”, in Preyer and Peter 2002.
  • –––, 2007, The Nature and Structure of Content, Oxford: Oxford University Press.
  • Kratzer, A., 1986, “Severing the External Argument from its Verb”, in J. Rooryck and L. Zaring (eds.), Phrase Structure and the Lexicon, Dordrecht: Kluwer.
  • Larson, R. and Ludlow, P., 1993, “Interpreted Logical Forms”, Synthese, 95: 305–55.
  • Lepore, E. and Ludwig, K., 2002, “What is Logical Form?”, in Preyer and Peter 2002, pp. 54–90.
  • Ludlow, P., 2002, “LF and Natural Logic”, in Preyer and Peter 2002.
  • May, R., 1985, Logical Form: Its Structure and Derivation, Cambridge, MA: MIT Press.
  • Montague, R., 1970, “English as a Formal Language”, reprinted in R. Thomason (ed.), Formal Philosophy, New Haven: Yale University Press, 1974.
  • Parsons, T., 2014, Articulating Medieval Logic, Oxford: Oxford University Press.
  • Preyer, G. and Peter, G. (eds.), 2002, Logical Form and Language, Oxford: Oxford University Press.
  • Quine, W.V.O., 1950, Methods of Logic, New York: Henry Holt.
  • –––, 1951, “Two Dogmas of Empiricism”, Philosophical Review, 60: 20–43.
  • –––, 1960, Word and Object, Cambridge, MA: MIT Press.
  • –––, 1970, Philosophy of Logic, Englewood Cliffs, NJ: Prentice Hall.
  • Ramsey, F., 1927, “Facts and Propositions”, Proceedings of the Aristotelian Society (Supplementary Volume), 7: 153–70.
  • Sánchez, V., 1991, Studies on Natural Logic and Categorial Grammar, Ph.D. Thesis, University of Amsterdam.
  • Segal, G., 1989, “A Preference for Sense and Reference”, The Journal of Philosophy, 86: 73–89.
  • Soames, S., 1987, “Direct Reference, Propositional Attitudes, and Semantic Content”, Philosophical Topics, 15: 47–87.
  • –––, 1995, “Beyond Singular Propositions”, Canadian Journal of Philosophy, 25: 515–50.
  • –––, 2002, Beyond Rigidity, Oxford: Oxford University Press.
  • Sommers, F., 1984, The Logic of Natural Language, Oxford: Oxford University Press.
  • Stanley, J., 2000, “Context and Logical Form”, Linguistics and Philosophy, 23: 391–434.
  • Strawson, P., 1950, “On Referring”, Mind, 59: 320–44.
  • Tarski, A., 1933, “The Concept of Truth in Formalized Languages”, reprinted in Tarski 1983.
  • –––, 1944, “The Semantic Conception of Truth”, Philosophy and Phenomenological Research, 4: 341–75.
  • –––, 1983, Logic, Semantics, Metamathematics, J. Corcoran (ed.), J.H. Woodger (trans.), 2nd edition, Indianapolis: Hackett.

Some Other Useful Works

A few helpful overviews of the history and basic subject matter of logic:

  • Kneale, W. & Kneale, M., 1962, The Development of Logic, Oxford: Oxford University Press; reprinted 1984.
  • Sainsbury, M., 1991, Logical Forms, Oxford: Blackwell.
  • Broadie, A., 1987, Introduction to Medieval Logic, Oxford: Oxford University Press.

For these purposes, Russell's most important books are: Introduction to Mathematical Philosophy, London: George Allen and Unwin, 1919; Our Knowledge of the External World, New York: Norton, 1929; and The Philosophy of Logical Atomism, La Salle, IL: Open Court, 1985. Stephen Neale's book Descriptions (Cambridge, MA: MIT Press, 1990) is a recent development of Russell's theory.

Two key articles on restricted quantifiers, and a third reviewing more recent work, are:

  • Barwise, J. & Cooper, R., 1981, “Generalized Quantifiers and Natural Language”, Linguistics and Philosophy, 4: 159–219.
  • Higginbotham, J. & May, R., 1981, “Questions, Quantifiers, and Crossing”, Linguistic Review, 1: 47–79.
  • Keenan, E., 1996, “The Semantics of Determiners”, in S. Lappin (ed.), The Handbook of Contemporary Semantic Theory, Oxford: Blackwell.

For introductions to Transformational Grammar and Chomsky's conception of natural language:

  • Radford, A., 1988, Transformational Grammar, Cambridge: Cambridge University Press.
  • Haegeman, L., 1994, Introduction to Government & Binding Theory, Oxford: Blackwell.
  • Lasnik, H. (with M. Depiante and A. Stepanov), 2000, Syntactic Structures Revisited, Cambridge, MA: MIT Press.

For discussions of work in linguistics bearing directly on issues of logical form:

  • Higginbotham, J., 1985, “On Semantics”, Linguistic Inquiry, 16: 547–93.
  • Hornstein, N., 1995, Logical Form: From GB to Minimalism, Oxford: Blackwell.
  • Larson, R. and Segal, G., 1995, Knowledge of Meaning, Cambridge, MA: MIT Press.
  • May, R., 1985, Logical Form: Its Structure and Derivation, Cambridge, MA: MIT Press.
  • Neale, S., 1993, “Grammatical Form, Logical Form, and Incomplete Symbols”, in A. Irvine & G. Wedeking (eds.), Russell and Analytic Philosophy, Toronto: University of Toronto Press.

For discussions of the Davidsonian program (briefly described in section 9) and appeal to events:

  • Davidson, D., 1984, Inquiries into Truth and Interpretation, Oxford: Oxford University Press.
  • –––, 1985, “Adverbs of Action”, in B. Vermazen and M. Hintikka (eds.), Essays on Davidson: Actions and Events, Oxford: Clarendon Press.
  • Evans, G. & McDowell, J. (eds.), 1976, Truth and Meaning, Oxford: Oxford University Press.
  • Higginbotham, J., Pianesi, F. and Varzi, A. (eds.), 2000, Speaking of Events, Oxford: Oxford University Press.
  • Ludwig, K. (ed.), 2003, Contemporary Philosophers in Focus: Donald Davidson, Cambridge: Cambridge University Press.
  • Lycan, W., 1984, Logical Form in Natural Language, Cambridge, MA: MIT Press.
  • Parsons, T., 1990, Events in the Semantics of English, Cambridge, MA: MIT Press.
  • Pietroski, P., 2005, Events and Semantic Architecture, Oxford: Oxford University Press.
  • Schein, B., 1993, Plurals, Cambridge, MA: MIT Press.
  • Taylor, B., 1985, Modes of Occurrence, Oxford: Blackwell.

Other Internet Resources

[Please contact the author with suggestions.]

Acknowledgments

The author would like to thank: Christopher Menzel, for spotting an error in an earlier characterization of the generalized quantifier ‘every’, prompting revision of the surrounding discussion; Karen Carter and Max Heiber, for catching unfortunate typos in earlier versions of sections three and six; and, for comments on earlier drafts, Susan Dwyer, James Lesher, and the editors and referees.

Copyright © 2015 by
Paul Pietroski <pietro@umd.edu>

