Generative grammar is a research tradition in linguistics that aims to explain the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge. Generative linguists, or generativists, tend to share certain working assumptions such as the competence–performance distinction and the notion that some domain-specific aspects of grammar are partly innate in humans. These assumptions are rejected in non-generative approaches such as usage-based models of language. Generative linguistics includes work in core areas such as syntax, semantics, phonology, psycholinguistics, and language acquisition, with additional extensions to topics including biolinguistics and music cognition.

Generative grammar began in the late 1950s with the work of Noam Chomsky, having roots in earlier approaches such as structural linguistics. The earliest version of Chomsky's model was called transformational grammar, with subsequent iterations known as Government and binding theory and the Minimalist program. Other present-day generative models include Optimality theory, Categorial grammar, and Tree-adjoining grammar.


Principles

Generative grammar is an umbrella term for a variety of approaches to linguistics. What unites these approaches is the goal of uncovering the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge.


Cognitive science

Generative grammar studies language as part of cognitive science. Thus, research in the generative tradition involves formulating and testing hypotheses about the mental processes that allow humans to use language. Like other approaches in linguistics, generative grammar engages in linguistic description rather than linguistic prescription.


Explicitness and generality

Generative grammar proposes models of language consisting of explicit rule systems, which make testable, falsifiable predictions. This differs from traditional grammar, where grammatical patterns are often described more loosely. These models are intended to be parsimonious, capturing generalizations in the data with as few rules as possible. For example, because English imperative tag questions obey the same restrictions that second-person future declarative tags do, Paul Postal proposed that the two constructions are derived from the same underlying structure. By adopting this hypothesis, he was able to capture the restrictions on tags with a single rule. This kind of reasoning is commonplace in generative research. Particular theories within generative grammar have been expressed using a variety of formal systems, many of which are modifications or extensions of context-free grammars.
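As an illustration of the kind of explicit rule system involved, the sketch below defines a toy context-free grammar and uses it to generate a handful of English-like sentences. The category labels, rules, and vocabulary are invented for this example and are not drawn from any particular generative theory.

```python
import random

# A toy context-free grammar: each non-terminal maps to a list of
# possible expansions (sequences of non-terminals and terminal words).
# The rules and vocabulary are illustrative only.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"], ["V"]],
    "Det": [["the"], ["a"]],
    "N":   [["cat"], ["dog"], ["mouse"]],
    "V":   [["chased"], ["saw"], ["meowed"]],
}

def generate(symbol="S"):
    """Randomly expand a symbol into a list of terminal words."""
    if symbol not in GRAMMAR:          # terminal word
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])
    words = []
    for part in expansion:
        words.extend(generate(part))
    return words

if __name__ == "__main__":
    for _ in range(3):
        print(" ".join(generate()))    # e.g. "the cat chased a mouse"
```

A grammar of this kind generates some strings ("the cat meowed") and not others ("cat the meowed"), which is what makes its predictions testable.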


Competence versus performance

Generative grammar generally distinguishes linguistic competence and linguistic performance. Competence is the collection of subconscious rules that one knows when one knows a language; performance is the system which puts these rules to use. This distinction is related to the broader notion of Marr's levels used in other cognitive sciences, with competence corresponding to Marr's computational level. For example, generative theories generally provide competence-based explanations for why English speakers would judge the sentence in (1) as odd. In these explanations, the sentence would be ungrammatical because the rules of English only generate sentences where demonstratives agree with the grammatical number of their associated noun.

(1) *That cats is eating the mouse.

By contrast, generative theories generally provide performance-based explanations for the oddness of center-embedding sentences like the one in (2). According to such explanations, the grammar of English could in principle generate such sentences, but doing so in practice is so taxing on working memory that the sentence ends up being unparsable.

(2) *The cat that the dog that the man fed chased meowed.

In general, performance-based explanations deliver a simpler theory of grammar at the cost of additional assumptions about memory and parsing. As a result, the choice between a competence-based explanation and a performance-based explanation for a given phenomenon is not always obvious and can require investigating whether the additional assumptions are supported by independent evidence. For example, while many generative models of syntax explain island effects by positing constraints within the grammar, it has also been argued that some or all of these constraints are in fact the result of limitations on performance. Non-generative approaches often do not posit any distinction between competence and performance. For instance, usage-based models of language assume that grammatical patterns arise as the result of usage.
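A competence-style rule of the kind invoked for (1) can be stated explicitly. The sketch below checks demonstrative–noun number agreement for a few English noun phrases; the word lists and the agreement pattern are simplified assumptions made for this illustration, not part of any published grammar.

```python
# Simplified demonstrative-noun agreement check (illustrative only).
SINGULAR_DEMONSTRATIVES = {"this", "that"}
PLURAL_DEMONSTRATIVES = {"these", "those"}
SINGULAR_NOUNS = {"cat", "dog", "mouse"}     # toy lexicon
PLURAL_NOUNS = {"cats", "dogs", "mice"}

def agrees(demonstrative: str, noun: str) -> bool:
    """Return True if the demonstrative matches the noun in number."""
    if demonstrative in SINGULAR_DEMONSTRATIVES and noun in SINGULAR_NOUNS:
        return True
    if demonstrative in PLURAL_DEMONSTRATIVES and noun in PLURAL_NOUNS:
        return True
    return False

print(agrees("that", "cat"))    # True  -> "that cat" is generated
print(agrees("that", "cats"))   # False -> "*that cats" is not generated
```

Note that no comparable rule rules out sentence (2): on the performance-based view, it is generated by the grammar but defeats the parser.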


Innateness and universality

A major goal of generative research is to figure out which aspects of linguistic competence are innate and which are not. Within generative grammar, it is generally accepted that at least some domain-specific aspects are innate, and the term "universal grammar" is often used as a placeholder for whichever those turn out to be. The idea that at least some aspects are innate is motivated by poverty of the stimulus arguments. For example, one famous poverty of the stimulus argument concerns the acquisition of yes–no questions in English. This argument starts from the observation that children only make mistakes compatible with rules targeting hierarchical structure even though the examples which they encounter could have been generated by a simpler rule that targets linear order. In other words, children seem to ignore the possibility that the question rule is as simple as "switch the order of the first two words" and immediately jump to alternatives that rearrange constituents in tree structures. This is taken as evidence that children are born knowing that grammatical rules involve hierarchical structure, even though they have to figure out what those rules are. The empirical basis of poverty of the stimulus arguments has been challenged by Geoffrey Pullum and others, leading to back-and-forth debate in the language acquisition literature. Recent work has also suggested that some recurrent neural network architectures are able to learn hierarchical structure without an explicit constraint.

Within generative grammar, there are a variety of theories about what universal grammar consists of. One notable hypothesis proposed by Hagit Borer holds that the fundamental syntactic operations are universal and that all variation arises from different feature specifications in the lexicon. On the other hand, a strong hypothesis adopted in some variants of Optimality Theory holds that humans are born with a universal set of constraints, and that all variation arises from differences in how these constraints are ranked. In a 2002 paper, Noam Chomsky, Marc Hauser and W. Tecumseh Fitch proposed that universal grammar consists solely of the capacity for hierarchical phrase structure. In day-to-day research, the notion that universal grammar exists motivates analyses in terms of general principles. As much as possible, facts about particular languages are derived from these general principles rather than from language-specific stipulations.
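The contrast between a linear rule and a structure-dependent rule for English yes–no questions can be made concrete. The sketch below applies both rules to a sentence containing a relative clause; the flat word list, the hand-coded constituent split, and the rule implementations are simplified assumptions for illustration only.

```python
# Declarative sentence with a relative clause, as a flat word list
# and as a hand-coded toy constituent structure (illustrative only).
words = ["the", "man", "who", "is", "tall", "is", "happy"]

subject = ["the", "man", "who", "is", "tall"]   # subject NP containing a relative clause
main_aux = "is"                                  # auxiliary of the main clause
predicate = ["happy"]

def linear_rule(words):
    """Front the FIRST auxiliary in the string (the rule children never try)."""
    i = words.index("is")                 # first "is" sits inside the relative clause
    return [words[i]] + words[:i] + words[i + 1:]

def hierarchical_rule(subject, main_aux, predicate):
    """Front the auxiliary of the MAIN clause (the structure-dependent rule)."""
    return [main_aux] + subject + predicate

print(" ".join(linear_rule(words)))
# -> "is the man who tall is happy"   (ungrammatical)
print(" ".join(hierarchical_rule(subject, main_aux, predicate)))
# -> "is the man who is tall happy"   (grammatical)
```

The point of the poverty of the stimulus argument is that children never produce outputs like the first one, even though the simple declarative questions they hear are compatible with either rule.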


Subfields

Research in generative grammar spans a number of subfields. These subfields are also studied in non-generative approaches.


Syntax

Syntax studies the rule systems which combine smaller units such as morphemes into larger units such as phrases and sentences. Within generative syntax, prominent approaches include Minimalism, Government and binding theory, Lexical-functional grammar (LFG), and Head-driven phrase structure grammar (HPSG).
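The idea of combining smaller units into successively larger constituents can be pictured with a simple tree data structure. The sketch below builds a parse tree for a short sentence and prints it as a labeled bracketing; the category labels and the particular analysis are simplified assumptions for this example rather than the output of any specific framework.

```python
from dataclasses import dataclass
from typing import List, Union

@dataclass
class Node:
    """A labeled constituent whose children are words or smaller constituents."""
    label: str
    children: List[Union["Node", str]]

    def bracketed(self) -> str:
        parts = [c if isinstance(c, str) else c.bracketed() for c in self.children]
        return f"[{self.label} " + " ".join(parts) + "]"

# "the cat chased the mouse" analyzed as [S [NP the cat] [VP chased [NP the mouse]]]
tree = Node("S", [
    Node("NP", ["the", "cat"]),
    Node("VP", ["chased", Node("NP", ["the", "mouse"])]),
])
print(tree.bracketed())
# -> [S [NP the cat] [VP chased [NP the mouse]]]
```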


Phonology

Phonology studies the rule systems which organize linguistic sounds. For example, research in phonology includes work on phonotactic rules which govern which phonemes can be combined, as well as those that determine the placement of stress, tone, and other suprasegmental elements. Within generative grammar, a prominent approach to phonology is Optimality Theory.
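As a toy illustration of a phonotactic rule of the kind mentioned above, the sketch below checks whether a word-initial consonant cluster belongs to a small whitelist of onsets. The onset list operates on spelling rather than phonemes and is a simplified assumption for this example, not a full analysis of English.

```python
# Toy phonotactic check for word-initial consonant clusters (illustrative only).
VOWELS = set("aeiou")
ALLOWED_ONSETS = {"", "b", "bl", "br", "pl", "st", "str", "tr"}  # tiny sample

def onset(word: str) -> str:
    """Return the consonant letters before the first vowel."""
    for i, ch in enumerate(word):
        if ch in VOWELS:
            return word[:i]
    return word

def phonotactically_ok(word: str) -> bool:
    return onset(word) in ALLOWED_ONSETS

print(phonotactically_ok("string"))  # True  -> "str" is a permitted onset
print(phonotactically_ok("bnick"))   # False -> "bn" is not a permitted onset here
```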


Semantics

Semantics studies the rule systems that determine expressions' meanings. Within generative grammar, semantics is a species of formal semantics, providing compositional models of how the denotations of sentences are computed on the basis of the meanings of the individual morphemes and their syntactic structure.
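The compositional style of analysis described above can be sketched with ordinary functions standing in for denotations. In the example below, an intransitive verb denotes a function from individuals to truth values, and the sentence's denotation is computed by applying the verb's denotation to the subject's; the lexicon and the model are invented for this illustration.

```python
# A miniature compositional semantics (illustrative only).
# The "model" records which individuals have which properties.
MODEL = {"sleeps": {"Mary"}, "smokes": set()}

def proper_name(name):
    """A proper name denotes an individual (here, just a string)."""
    return name

def intransitive_verb(verb):
    """An intransitive verb denotes a function from individuals to truth values."""
    return lambda individual: individual in MODEL[verb]

def sentence(subject_denotation, verb_denotation):
    """Function application: combine the subject and the verb."""
    return verb_denotation(subject_denotation)

print(sentence(proper_name("Mary"), intransitive_verb("sleeps")))  # True
print(sentence(proper_name("Mary"), intransitive_verb("smokes")))  # False
```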


Extensions


Music

Generative grammar has been applied to music theory and analysis since the 1980s. One notable approach is Fred Lerdahl and Ray Jackendoff's Generative theory of tonal music, which formalized and extended ideas from Schenkerian analysis.


Biolinguistics

Recent work in generative-inspired biolinguistics has proposed that universal grammar consists solely of syntactic recursion, and that it arose recently in humans as the result of a random genetic mutation. Generative-inspired biolinguistics has not uncovered any particular genes responsible for language. While some prospects were raised at the discovery of the ''FOXP2'' gene, there is not enough support for the idea that it is 'the grammar gene' or that it had much to do with the relatively recent emergence of syntactical speech.


History

Analytical models based on semantics and discourse pragmatics were rejected by the Bloomfieldian school of linguistics, whose derivatives place the object into the verb phrase, following Wilhelm Wundt's Völkerpsychologie. Formalisms based on this convention were constructed in the 1950s by Zellig Harris and Charles Hockett. These gave rise to modern generative grammar. As a distinct research tradition, generative grammar began in the late 1950s with the work of Noam Chomsky. However, its roots include earlier structuralist approaches such as glossematics, which themselves had older roots, for instance in the work of the ancient Indian grammarian Pāṇini. Military funding to generative research was an important factor in its early spread in the 1960s.

The initial version of generative syntax was called transformational grammar. In transformational grammar, rules called transformations mapped a level of representation called deep structure to another level of representation called surface structure. The semantic interpretation of a sentence was represented by its deep structure, while the surface structure provided its pronunciation. For example, an active sentence such as "The doctor examined the patient" and its passive counterpart "The patient was examined by the doctor" had the same deep structure. The difference in surface structures arises from the application of the passivization transformation, which was assumed not to affect meaning. This assumption was challenged in the 1960s by the discovery of examples such as "Everyone in the room knows two languages" and "Two languages are known by everyone in the room". After the Linguistics wars of the late 1960s and early 1970s, Chomsky developed a revised model of syntax called Government and binding theory, which eventually grew into Minimalism. In the aftermath of those disputes, a variety of other generative models of syntax were proposed, including relational grammar, Lexical-functional grammar (LFG), and Head-driven phrase structure grammar (HPSG).

Generative phonology originally focused on rewrite rules, in a system commonly known as SPE Phonology after the 1968 book ''The Sound Pattern of English'' by Chomsky and Morris Halle. In the 1990s, this approach was largely replaced by Optimality theory, which was able to capture generalizations called conspiracies which needed to be stipulated in SPE phonology.

Semantics emerged as a subfield of generative linguistics during the late 1970s, with the pioneering work of Richard Montague. Montague proposed a system called Montague grammar which consisted of interpretation rules mapping expressions from a bespoke model of syntax to formulas of intensional logic. Subsequent work by Barbara Partee, Irene Heim, Tanya Reinhart, and others showed that the key insights of Montague grammar could be incorporated into more syntactically plausible systems.


See also

* Cognitive linguistics
* Cognitive revolution
* Digital infinity
* Formal grammar
* Functional theories of grammar
* Generative lexicon
* Generative metrics
* Generative principle
* Generative semantics
* Generative systems
* Parsing
* Phrase structure rules
* ''Syntactic Structures''




Further reading

* Chomsky, Noam. 1965. ''Aspects of the Theory of Syntax''. Cambridge, Massachusetts: MIT Press.
* Hurford, J. 1990. "Nativist and functional explanations in language acquisition". In I. M. Roca (ed.), ''Logical Issues in Language Acquisition'', 85–136. Dordrecht: Foris.

