"Standard Theory" redirects here. For the theory of Ancient Egyptian verbal syntax, see Standard Theory (Egyptology).
A syntax tree in which the sentence (S) breaks down into a noun phrase (NP) and a verb phrase (VP), both of which break down into additional smaller constituents.
Generative grammar is an umbrella term for a variety of approaches to linguistics. What unites these approaches is the goal of uncovering the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge.[1][2]
Generative grammar studies language as part of cognitive science. Thus, research in the generative tradition involves formulating and testing hypotheses about the mental processes that allow humans to use language.[3][4][5]
Generative grammar proposes models of language consisting of explicit rule systems, which make testable, falsifiable predictions. This is different from traditional grammar, where grammatical patterns are often described more loosely.[8][9] These models are intended to be parsimonious, capturing generalizations in the data with as few rules as possible. As a result, empirical research in generative linguistics often seeks to identify commonalities between phenomena, and theoretical research seeks to provide them with unified explanations. For example, Paul Postal observed that English imperative tag questions obey the same restrictions that second person future declarative tags do, and proposed that the two constructions are derived from the same underlying structure. This hypothesis was able to explain the restrictions on tags using a single rule.[8]
Particular theories within generative grammar have been expressed using a variety of formal systems, many of which are modifications or extensions of context-free grammars.[8]
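To make the notion of an explicit, generative rule system concrete, a context-free grammar can be written as a set of rewrite rules and used to enumerate every sentence it generates. The following sketch is purely illustrative: the grammar, vocabulary, and function names are invented here and are not taken from any particular linguistic theory.

```python
import itertools

# A toy context-free grammar: each nonterminal maps to a list of
# possible expansions (sequences of nonterminals or terminal words).
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"], ["V"]],
    "Det": [["the"]],
    "N":   [["cat"], ["dog"]],
    "V":   [["chased"], ["slept"]],
}

def expand(symbol):
    """Recursively yield every terminal word sequence derivable from `symbol`."""
    if symbol not in GRAMMAR:  # a terminal word
        yield [symbol]
        return
    for production in GRAMMAR[symbol]:
        # Combine every expansion of each child symbol.
        for parts in itertools.product(*(list(expand(s)) for s in production)):
            yield [word for part in parts for word in part]

sentences = sorted(" ".join(words) for words in expand("S"))
print(sentences)  # all 12 sentences this toy grammar generates
```

Because every rule is explicit, the model's predictions are checkable: this grammar generates "the dog slept" but not, for example, "slept dog the".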
Generative grammar generally distinguishes linguistic competence and linguistic performance.[10] Competence is the collection of subconscious rules that one knows when one knows a language; performance is the system which puts these rules to use.[10][11] This distinction is related to the broader notion of Marr's levels used in other cognitive sciences, with competence corresponding to Marr's computational level.[12]
For example, generative theories generally provide competence-based explanations for why English speakers would judge the sentence in (1) as odd. In these explanations, the sentence would be ungrammatical because the rules of English only generate sentences where demonstratives agree with the grammatical number of their associated noun.[13]
(1) *That cats is eating the mouse.
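The agreement rule behind (1) can be sketched as a toy checker. The word lists and function below are invented for illustration; an actual generative analysis states the rule over syntactic features rather than fixed word lists.

```python
# Toy lexicon: each demonstrative and noun carries a number feature.
DEMONSTRATIVES = {"this": "sg", "that": "sg", "these": "pl", "those": "pl"}
NOUNS = {"cat": "sg", "cats": "pl", "mouse": "sg", "mice": "pl"}

def demonstrative_agrees(phrase):
    """Return True iff a 'demonstrative noun' phrase obeys number agreement."""
    dem, noun = phrase.split()
    return DEMONSTRATIVES[dem] == NOUNS[noun]

print(demonstrative_agrees("those cats"))  # True: both plural
print(demonstrative_agrees("that cats"))   # False: the mismatch in (1)
```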
By contrast, generative theories generally provide performance-based explanations for the oddness of center embedding sentences like the one in (2). According to such explanations, the grammar of English could in principle generate such sentences, but doing so in practice is so taxing on working memory that the sentence ends up being unparsable.[13][14]
(2) *The cat that the dog that the man fed chased meowed.
In general, performance-based explanations deliver a simpler theory of grammar at the cost of additional assumptions about memory and parsing. As a result, the choice between a competence-based explanation and a performance-based explanation for a given phenomenon is not always obvious and can require investigating whether the additional assumptions are supported by independent evidence.[14][15] For example, while many generative models of syntax explain island effects by positing constraints within the grammar, it has also been argued that some or all of these constraints are in fact the result of limitations on performance.[16][17]
Non-generative approaches often do not posit any distinction between competence and performance. For instance, usage-based models of language assume that grammatical patterns arise as the result of usage.[18]
A major goal of generative research is to figure out which aspects of linguistic competence are innate and which are not. Within generative grammar, it is generally accepted that at least some domain-specific aspects are innate, and the term "universal grammar" is often used as a placeholder for whichever those turn out to be.[19][20]
The idea that at least some aspects are innate is motivated by poverty of the stimulus arguments.[21][22] For example, one famous poverty of the stimulus argument concerns the acquisition of yes–no questions in English. This argument starts from the observation that children only make mistakes compatible with rules targeting hierarchical structure even though the examples which they encounter could have been generated by a simpler rule that targets linear order. In other words, children seem to ignore the possibility that the question rule is as simple as "switch the order of the first two words" and immediately jump to alternatives that rearrange constituents in tree structures. This is taken as evidence that children are born knowing that grammatical rules involve hierarchical structure, even though they have to figure out what those rules are.[21][22][23] The empirical basis of poverty of the stimulus arguments has been challenged by Geoffrey Pullum and others, leading to back-and-forth debate in the language acquisition literature.[24][25] Recent work has also suggested that some recurrent neural network architectures are able to learn hierarchical structure without an explicit constraint.[26]
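The contrast between a linear rule and a structure-sensitive rule can be illustrated with the classic auxiliary-fronting example "The man who is tall is happy". The bracketed representation and helper functions below are a toy sketch invented for this illustration, not a model from the acquisition literature.

```python
AUX = {"is", "can", "will"}

def flatten(tree):
    """Flatten a bracketed structure into a flat list of words."""
    out = []
    for node in tree:
        if isinstance(node, list):
            out.extend(flatten(node))
        else:
            out.append(node)
    return out

def linear_question(words):
    """The simpler, wrong rule: front the first auxiliary in linear order."""
    i = next(j for j, w in enumerate(words) if w in AUX)
    return [words[i]] + words[:i] + words[i + 1:]

def hierarchical_question(tree):
    """The structure-sensitive rule: front the main-clause auxiliary,
    skipping any auxiliary buried inside an embedded clause (a sublist)."""
    i = next(j for j, n in enumerate(tree)
             if not isinstance(n, list) and n in AUX)
    return flatten([tree[i]] + tree[:i] + tree[i + 1:])

# "The man who is tall is happy", with the relative clause as a subtree.
tree = ["the", "man", ["who", "is", "tall"], "is", "happy"]
words = flatten(tree)

print(" ".join(linear_question(words)))       # *is the man who tall is happy
print(" ".join(hierarchical_question(tree)))  # is the man who is tall happy
```

The linear rule produces the ungrammatical string children never seem to attempt, while the hierarchical rule produces the question they actually form.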
Within generative grammar, there are a variety of theories about what universal grammar consists of. One notable hypothesis proposed by Hagit Borer holds that the fundamental syntactic operations are universal and that all variation arises from different feature-specifications in the lexicon.[20][27] On the other hand, a strong hypothesis adopted in some variants of Optimality Theory holds that humans are born with a universal set of constraints, and that all variation arises from differences in how these constraints are ranked.[20][28] In a 2002 paper, Noam Chomsky, Marc Hauser and W. Tecumseh Fitch proposed that universal grammar consists solely of the capacity for hierarchical phrase structure.[29]
In day-to-day research, the notion that universal grammar exists motivates analyses in terms of general principles. As much as possible, facts about particular languages are derived from these general principles rather than from language-specific stipulations.[19]
Phonology studies the rule systems which organize linguistic sounds. For example, research in phonology includes work on phonotactic rules which govern which phonemes can be combined, as well as those that determine the placement of stress, tone, and other suprasegmental elements.[31] Within generative grammar, a prominent approach to phonology is Optimality Theory.[28]
Semantics studies the rule systems that determine expressions' meanings. Within generative grammar, semantics is a species of formal semantics, providing compositional models of how the denotations of sentences are computed on the basis of the meanings of the individual morphemes and their syntactic structure.[32]
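The compositional idea can be sketched by treating denotations as sets and functions and combining them by function application, mirroring the syntactic structure. The tiny model below is invented for illustration: the individuals, the lexicon, and the function names are all hypothetical.

```python
# A toy model: common nouns and verbs denote sets of individuals, and a
# quantifier like "every" denotes a function from a noun denotation to a
# function from a predicate denotation to a truth value.
STUDENTS = {"ann", "bob"}            # [[student]]
SLEEPERS = {"ann", "bob", "carol"}   # [[sleeps]]

def every(restrictor):
    """[[every]]: true of a predicate iff the restrictor is a subset of it."""
    return lambda predicate: restrictor <= predicate

# Composition follows the syntax: [S [NP every student] [VP sleeps]]
print(every(STUDENTS)(SLEEPERS))  # True: every student is among the sleepers
```

The truth value of the whole sentence is computed purely from the denotations of its parts and the way they combine, which is the sense in which the model is compositional.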
Generative grammar has been applied to music theory and analysis.[33] One notable approach is Fred Lerdahl and Ray Jackendoff's 1983 Generative theory of tonal music, which formalized and extended ideas from Schenkerian analysis.[34] Though Lerdahl and Jackendoff observed that their model of musical syntax was quite different from then-current models of linguistic syntax, recent work by Jonah Katz and David Pesetsky has argued for a closer connection. According to Katz and Pesetsky's Identity Thesis for Language and Music, linguistic syntax and musical syntax are formally equivalent except that the former operates on morphemes while the latter operates on pitch classes.[35][36]
Biolinguistics is the study of the biology of language. Recent work in generative-inspired biolinguistics has proposed the hypothesis that universal grammar consists solely of syntactic recursion, and that it arose recently in humans as the result of a random genetic mutation.[37]
As a distinct research tradition, generative grammar began in the late 1950s with the work of Noam Chomsky.[38] However, its roots include earlier structuralist approaches such as glossematics, which themselves had older roots, for instance in the work of the ancient Indian grammarian Pāṇini.[39][40][41] Military funding for generative research was an important factor in its early spread in the 1960s.[42]
The initial version of generative syntax was called transformational grammar. In transformational grammar, rules called transformations mapped a level of representation called deep structure to another level of representation called surface structure. The semantic interpretation of a sentence was represented by its deep structure, while the surface structure provided its pronunciation. For example, the active sentence "The doctor examined the patient" and its passive counterpart "The patient was examined by the doctor" had the same deep structure. The difference in surface structures arose from the application of the passivization transformation, which was assumed to not affect meaning. This assumption was challenged in the 1960s by the discovery of pairs such as "Everyone in the room knows two languages" and "Two languages are known by everyone in the room", which can differ in meaning.[43]
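The passivization transformation can be sketched as a mapping from a toy deep structure to two surface strings. The tuple representation and the small participle table below are invented for this illustration and are far simpler than the phrase-structure representations actually used in transformational grammar.

```python
# Toy deep structure: (subject, verb, object). The participle table maps a
# verb to its passive participle; both are invented for this sketch.
PARTICIPLE = {"examined": "examined", "chased": "chased", "saw": "seen"}

def active_surface(deep):
    """Spell out a deep structure directly as an active sentence."""
    subject, verb, obj = deep
    return f"{subject} {verb} {obj}"

def passive_surface(deep):
    """Apply the passivization transformation: promote the object,
    demote the subject into a by-phrase, and use 'was' + participle."""
    subject, verb, obj = deep
    return f"{obj} was {PARTICIPLE[verb]} by {subject}"

deep = ("the doctor", "examined", "the patient")
print(active_surface(deep))   # the doctor examined the patient
print(passive_surface(deep))  # the patient was examined by the doctor
```

Both surface strings are derived from one underlying representation, which is what licensed the original assumption that the transformation leaves meaning unchanged.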
Generative phonology originally focused on rewrite rules, in a system commonly known as SPE phonology after the 1968 book The Sound Pattern of English by Chomsky and Morris Halle. In the 1990s, this approach was largely replaced by Optimality Theory, which was able to capture generalizations called conspiracies which needed to be stipulated in SPE phonology.[28]
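SPE-style rewrite rules have the schematic form A → B / C _ D, read "A is rewritten as B when it appears between C and D". The sketch below applies one invented rule, nasalizing a vowel before a nasal consonant; the segments and orthography are hypothetical, not drawn from The Sound Pattern of English.

```python
import re

# Toy SPE-style rule:  V -> Ṽ / _ N
# (a vowel is rewritten as its nasalized counterpart before a nasal consonant)
VOWELS = "aeiou"
NASALS = "mn"
NASALIZED = {"a": "ã", "e": "ẽ", "i": "ĩ", "o": "õ", "u": "ũ"}

def apply_rule(form):
    """Rewrite every vowel that immediately precedes a nasal consonant."""
    return re.sub(
        f"([{VOWELS}])(?=[{NASALS}])",   # vowel followed by a nasal
        lambda m: NASALIZED[m.group(1)],
        form,
    )

print(apply_rule("kanta"))  # kãnta: the first vowel precedes /n/
print(apply_rule("pata"))   # pata: unchanged, no nasal follows a vowel
```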
Semantics emerged as a subfield of generative linguistics during the 1970s, with the pioneering work of Richard Montague. Montague proposed a system called Montague grammar which consisted of interpretation rules mapping expressions from a bespoke model of syntax to formulas of intensional logic. Subsequent work by Barbara Partee, Irene Heim, Tanya Reinhart, and others showed that the key insights of Montague grammar could be incorporated into more syntactically plausible systems.[45][46]
^ Wasow, Thomas (2003). "Generative Grammar" (PDF). In Aronoff, Mark; Rees-Miller, Janie (eds.). The Handbook of Linguistics. Blackwell. pp. 296, 311. doi:10.1002/9780470756409.ch12. "...generative grammar is not so much a theory as a family of theories, or a school of thought... [having] shared assumptions and goals, widely used formal devices, and generally accepted empirical results"
^ Sprouse, Jon; Wagers, Matt; Phillips, Colin (2013). "Deriving competing predictions from grammatical approaches and reductionist approaches to island effects". In Sprouse, Jon; Hornstein, Norbert (eds.). Experimental syntax and island effects. Cambridge University Press. doi:10.1017/CBO9781139035309.002.
^ Hofmeister, Philip; Staum Casasanto, Laura; Sag, Ivan (2013). "Islands in the grammar? Standards of evidence". In Sprouse, Jon; Hornstein, Norbert (eds.). Experimental syntax and island effects. Cambridge University Press. pp. 42–63. doi:10.1017/CBO9781139035309.004. ISBN 978-1-139-03530-9.
^ Evans, Vyvyan; Green, Melanie (2006). Cognitive Linguistics: An Introduction. Edinburgh University Press. pp. 108–111. ISBN 0-7486-1832-5.
^ Gallego, Ángel (2012). "Parameters". In Boeckx, Cedric (ed.). The Oxford Handbook of Linguistic Minimalism. Oxford University Press. doi:10.1093/oxfordhb/9780199549368.013.0023.
^ Zeijlstra, Hedde (2020). "Rethinking remerge: Merge, movement and music" (PDF). In Bárány, András; Biberauer, Theresa; Douglas, Jamie; Vikner, Sten (eds.). Syntactic architecture and its consequences II: Between syntax and morphology. Language Science Press.
^ Newmeyer, Frederick (1986). Linguistic Theory in America. Academic Press. pp. 17–18. ISBN 0-12-517152-8.
^Koerner, E. F. K. (1978). "Towards a historiography of linguistics".Toward a Historiography of Linguistics: Selected Essays. John Benjamins. pp. 21–54.
^ Bloomfield, Leonard (1929), p. 274; cited in Rogers, David (1987), p. 88.
^ Partee, Barbara (2011). "Formal semantics: Origins, issues, early impact". The Baltic International Yearbook of Cognition, Logic and Communication. 6. CiteSeerX 10.1.1.826.5720.
Chomsky, Noam (1965). Aspects of the Theory of Syntax. Cambridge, Massachusetts: MIT Press.
Hurford, J. (1990). "Nativist and functional explanations in language acquisition". In I. M. Roca (ed.), Logical Issues in Language Acquisition, 85–136. Dordrecht: Foris.