Dependency grammar (DG) is a class of modern grammatical theories that are all based on the dependency relation (as opposed to the ''constituency relation'' of phrase structure) and that can be traced back primarily to the work of Lucien Tesnière. Dependency is the notion that linguistic units, e.g. words, are connected to each other by directed links. The (finite) verb is taken to be the structural center of clause structure. All other syntactic units (words) are either directly or indirectly connected to the verb in terms of the directed links, which are called ''dependencies''. Dependency grammar differs from phrase structure grammar in that, while it can identify phrases, it tends to overlook phrasal nodes. A dependency structure is determined by the relation between a word (a ''head'') and its dependents. Dependency structures are flatter than phrase structures, in part because they lack a finite verb phrase constituent, and they are thus well suited to the analysis of languages with free word order, such as Czech or Warlpiri.


History

The notion of dependencies between grammatical units has existed since the earliest recorded grammars, e.g. Pāṇini, and the dependency concept therefore arguably predates that of phrase structure by many centuries. Ibn Maḍāʾ, a 12th-century linguist from Córdoba, Andalusia, may have been the first grammarian to use the term ''dependency'' in the grammatical sense that we use it today. In early modern times, the dependency concept seems to have coexisted side by side with that of phrase structure, the latter having entered Latin, French, English and other grammars from the widespread study of the term logic of antiquity. Dependency is also concretely present in the works of Sámuel Brassai (1800–1897), a Hungarian linguist, Franz Kern (1830–1894), a German philologist, and Heimann Hariton Tiktin (1850–1936), a Romanian linguist.

Modern dependency grammars, however, begin primarily with the work of Lucien Tesnière. Tesnière was a Frenchman, a polyglot, and a professor of linguistics at the universities in Strasbourg and Montpellier. His major work ''Éléments de syntaxe structurale'' was published posthumously in 1959; he had died in 1954. The basic approach to syntax he developed seems to have been seized upon independently by others in the 1960s, and a number of other dependency-based grammars have gained prominence since those early works. DG has generated a great deal of interest in Germany in both theoretical syntax and language pedagogy. In recent years, much of the development surrounding dependency-based theories has come from computational linguistics and is due, in part, to the influential work that David Hays did in machine translation at the RAND Corporation in the 1950s and 1960s. Dependency-based systems are increasingly being used to parse natural language and generate treebanks. Interest in dependency grammar is growing at present, international conferences on dependency linguistics (Depling 2011, 2013, 2015, 2019) being a relatively recent development.


Dependency vs. phrase structure

Dependency is a one-to-one correspondence: for every element (e.g. word or morph) in the sentence, there is exactly one node in the structure of that sentence that corresponds to that element. The result of this one-to-one correspondence is that dependency grammars are word (or morph) grammars. All that exist are the elements and the dependencies that connect the elements into a structure. This situation should be compared with phrase structure. Phrase structure is a one-to-one-or-more correspondence, which means that, for every element in a sentence, there are one or more nodes in the structure that correspond to that element. The result of this difference is that dependency structures are minimal compared to their phrase structure counterparts, since they tend to contain many fewer nodes.

These trees illustrate two possible ways to render the dependency and phrase structure relations (see below). The dependency tree is an "ordered" tree, i.e. it reflects actual word order. Many dependency trees abstract away from linear order and focus just on hierarchical order, which means they do not show actual word order. The constituency (= phrase structure) tree follows the conventions of bare phrase structure (BPS), whereby the words themselves are employed as the node labels.

The distinction between dependency and phrase structure grammars derives in large part from the initial division of the clause. The phrase structure relation derives from an initial binary division, whereby the clause is split into a subject noun phrase (NP) and a predicate verb phrase (VP). This division is certainly present in the basic analysis of the clause that we find in the works of, for instance, Leonard Bloomfield and Noam Chomsky. Tesnière, however, argued vehemently against this binary division, preferring instead to position the verb as the root of all clause structure. Tesnière's stance was that the subject-predicate division stems from term logic and has no place in linguistics. The importance of this distinction is that if one acknowledges the initial subject-predicate division in syntax as real, then one is likely to go down the path of phrase structure grammar, while if one rejects this division, then one must consider the verb the root of all structure, and so go down the path of dependency grammar.
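The one-to-one correspondence can be made concrete in a small sketch (the sentence and its analysis here are hypothetical illustrations, not drawn from the trees in this article): each word receives exactly one node, which can be encoded simply as one head index per word.

```python
# One node per word: a dependency analysis encoded as (word, head-index) pairs.
# Indices are 1-based; head 0 marks the root (the finite verb).

sentence = ["Sam", "likes", "fresh", "bread"]

# Hypothetical analysis: "likes" is the root; "Sam" and "bread" depend on it,
# and "fresh" depends on "bread".
heads = [2, 0, 4, 2]  # head index for each word (0 = root)

# The dependency structure has exactly as many nodes as words.
assert len(heads) == len(sentence)

# A phrase structure parse of the same sentence would add phrasal nodes
# (S, NP, VP, ...) on top of the four word nodes, e.g.:
#   (S (NP Sam) (VP likes (NP fresh bread)))  -> 4 word nodes + 4 phrasal nodes
```

The contrast in node counts is exactly the "minimality" the text describes: the dependency encoding never needs more nodes than the sentence has words.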


Dependency grammars

The following frameworks are dependency-based:
* Algebraic syntax
* Operator grammar
* Link grammar
* Functional generative description
* Lexicase
* Meaning–text theory
* Word grammar
* Extensible dependency grammar
* Universal Dependencies

Link grammar is similar to dependency grammar, but link grammar does not include directionality between the linked words, and thus does not describe head-dependent relationships. Hybrid dependency/phrase structure grammar uses dependencies between words, but also includes dependencies between phrasal nodes – see for example the Quranic Arabic Dependency Treebank. The derivation trees of tree-adjoining grammar (TAG) are dependency structures, although the full trees of TAG are rendered in terms of phrase structure, so in this regard it is not clear whether TAG should be viewed more as a dependency or a phrase structure grammar. There are major differences between the grammars just listed, but the dependency relation is compatible with other major tenets of theories of grammar. Thus, like phrase structure grammars, dependency grammars can be mono- or multistratal, representational or derivational, construction- or rule-based.


Representing dependencies

There are various conventions that DGs employ to represent dependencies. The following schemata (in addition to the tree above and the trees further below) illustrate some of these conventions:

The representations in (a–d) are trees, whereby the specific conventions employed in each tree vary. Solid lines are ''dependency edges'' and lightly dotted lines are ''projection lines''. The only difference between tree (a) and tree (b) is that tree (a) employs the category class to label the nodes, whereas tree (b) employs the words themselves as the node labels. Tree (c) is a reduced tree insofar as the string of words below and the projection lines are deemed unnecessary and are hence omitted. Tree (d) abstracts away from linear order and reflects just hierarchical order. The arrow arcs in (e) are an alternative convention used to show dependencies and are favored by Word Grammar. The brackets in (f) are seldom used, but are nevertheless quite capable of reflecting the dependency hierarchy; dependents appear enclosed in more brackets than their heads. And finally, the indentations like those in (g) are another convention that is sometimes employed to indicate the hierarchy of words. Dependents are placed underneath their heads and indented. Like tree (d), the indentations in (g) abstract away from linear order. The point of these conventions is that they are just that, namely conventions. They do not influence the basic commitment to dependency as the relation that groups syntactic units.


Types of dependencies

The dependency representations above (and further below) show syntactic dependencies. Indeed, most work in dependency grammar focuses on syntactic dependencies. Syntactic dependencies are, however, just one of three or four types of dependencies. Meaning–text theory, for instance, emphasizes the role of semantic and morphological dependencies in addition to syntactic dependencies. A fourth type, prosodic dependencies, can also be acknowledged. Distinguishing between these types of dependencies can be important, in part because if one fails to do so, the likelihood is great that semantic, morphological, and/or prosodic dependencies will be mistaken for syntactic dependencies. The following four subsections briefly sketch each of these dependency types. During the discussion, the existence of syntactic dependencies is taken for granted and used as an orientation point for establishing the nature of the other three dependency types.


Semantic dependencies

Semantic dependencies are understood in terms of predicates and their arguments. The arguments of a predicate are semantically dependent on that predicate. Often, semantic dependencies overlap with and point in the same direction as syntactic dependencies. At times, however, semantic dependencies can point in the opposite direction of syntactic dependencies, or they can be entirely independent of syntactic dependencies. The hierarchy of words in the following examples shows standard syntactic dependencies, whereas the arrows indicate semantic dependencies:

The two arguments ''Sam'' and ''Sally'' in tree (a) are dependent on the predicate ''likes'', whereby these arguments are also syntactically dependent on ''likes''. What this means is that the semantic and syntactic dependencies overlap and point in the same direction (down the tree). Attributive adjectives, however, are predicates that take their head noun as their argument; hence ''big'' is a predicate in tree (b) that takes ''bones'' as its one argument, and the semantic dependency points up the tree, running counter to the syntactic dependency. A similar situation obtains in (c), where the preposition predicate ''on'' takes the two arguments ''the picture'' and ''the wall''; one of these semantic dependencies points up the syntactic hierarchy, whereas the other points down it. Finally, the predicate ''to help'' in (d) takes the one argument ''Jim'' but is not directly connected to ''Jim'' in the syntactic hierarchy, which means that this semantic dependency is entirely independent of the syntactic dependencies.


Morphological dependencies

Morphological dependencies obtain between words or parts of words. When a given word or part of a word influences the form of another word, then the latter is morphologically dependent on the former. Agreement and concord are therefore manifestations of morphological dependencies. Like semantic dependencies, morphological dependencies can overlap with and point in the same direction as syntactic dependencies, overlap with and point in the opposite direction of syntactic dependencies, or be entirely independent of syntactic dependencies. The arrows are now used to indicate morphological dependencies.

The plural ''houses'' in (a) demands the plural of the demonstrative determiner, hence ''these'' appears, not ''this'', which means there is a morphological dependency that points down the hierarchy from ''houses'' to ''these''. The situation is reversed in (b), where the singular subject ''Sam'' demands the appearance of the agreement suffix ''-s'' on the finite verb ''works'', which means there is a morphological dependency pointing up the hierarchy from ''Sam'' to ''works''. The type of determiner in the German examples (c) and (d) influences the inflectional suffix that appears on the adjective ''alt''. When the indefinite article ''ein'' is used, the strong masculine ending ''-er'' appears on the adjective. When the definite article ''der'' is used, in contrast, the weak ending ''-e'' appears on the adjective. Thus, since the choice of determiner impacts the morphological form of the adjective, there is a morphological dependency pointing from the determiner to the adjective, whereby this morphological dependency is entirely independent of the syntactic dependencies. Consider further the following French sentences:

The masculine subject ''le chien'' in (a) demands the masculine form of the predicative adjective ''blanc'', whereas the feminine subject ''la maison'' demands the feminine form of this adjective. A morphological dependency that is entirely independent of the syntactic dependencies therefore points again across the syntactic hierarchy. Morphological dependencies play an important role in typological studies. Languages are classified as mostly head-marking (''Sam work-s'') or mostly dependent-marking (''these houses''), whereby most if not all languages contain at least some minor measure of both head and dependent marking.


Prosodic dependencies

Prosodic dependencies are acknowledged in order to accommodate the behavior of clitics. A clitic is a syntactically autonomous element that is prosodically dependent on a host. A clitic is therefore integrated into the prosody of its host, meaning that it forms a single word with its host. Prosodic dependencies exist entirely in the linear dimension (horizontal dimension), whereas standard syntactic dependencies exist in the hierarchical dimension (vertical dimension). Classic examples of clitics in English are reduced auxiliaries (e.g. ''-ll'', ''-s'', ''-ve'') and the possessive marker ''-s''. The prosodic dependencies in the following examples are indicated with a hyphen and the lack of a vertical projection line. A hyphen that appears on the left of the clitic indicates that the clitic is prosodically dependent on the word immediately to its left (''He'll'', ''There's''), whereas a hyphen that appears on the right side of the clitic (not shown here) indicates that the clitic is prosodically dependent on the word that appears immediately to its right. A given clitic is often prosodically dependent on its syntactic dependent (''He'll'', ''There's'') or on its head (''would've''). At other times, it can depend prosodically on a word that is neither its head nor its immediate dependent (''Florida's'').


Syntactic dependencies

Syntactic dependencies are the focus of most work in DG, as stated above. How the presence and the direction of syntactic dependencies are determined is, of course, often open to debate. In this regard, it must be acknowledged that the validity of the syntactic dependencies in the trees throughout this article is being taken for granted. However, these hierarchies are such that many DGs can largely support them, although there will certainly be points of disagreement. The basic question of how syntactic dependencies are discerned has proven difficult to answer definitively. One should acknowledge, however, that the basic task of identifying and discerning the presence and direction of the syntactic dependencies of DGs is no easier or harder than determining the constituent groupings of phrase structure grammars. A variety of heuristics are employed to this end, basic tests for constituents being useful tools; the syntactic dependencies assumed in the trees in this article group words together in a manner that most closely matches the results of standard permutation, substitution, and ellipsis tests for constituents. Etymological considerations also provide helpful clues about the direction of dependencies. A promising principle upon which to base the existence of syntactic dependencies is distribution: when one is striving to identify the root of a given phrase, the word that is most responsible for determining the distribution of that phrase as a whole is its root.


Linear order and discontinuities

Traditionally, DGs have had a different approach to linear order (word order) than phrase structure grammars. Dependency structures are minimal compared to their phrase structure counterparts, and these minimal structures allow one to focus intently on the two ordering dimensions. Separating the vertical dimension (hierarchical order) from the horizontal dimension (linear order) is easily accomplished. This aspect of dependency structures has allowed DGs, starting with Tesnière (1959), to focus on hierarchical order in a manner that is hardly possible for phrase structure grammars. For Tesnière, linear order was secondary to hierarchical order insofar as hierarchical order preceded linear order in the mind of a speaker. The stemmas (trees) that Tesnière produced reflected this view; they abstracted away from linear order to focus almost entirely on hierarchical order. Many DGs that followed Tesnière adopted this practice, that is, they produced tree structures that reflect hierarchical order alone.

The traditional focus on hierarchical order generated the impression that DGs have little to say about linear order, and it has contributed to the view that DGs are particularly well suited to examining languages with free word order. A negative result of this focus on hierarchical order, however, is that there is a dearth of DG explorations of particular word order phenomena, such as standard discontinuities. Comprehensive dependency grammar accounts of topicalization, ''wh''-fronting, scrambling, and extraposition are mostly absent from many established DG frameworks. This situation can be contrasted with phrase structure grammars, which have devoted tremendous effort to exploring these phenomena.

The nature of the dependency relation does not, however, prevent one from focusing on linear order. Dependency structures are as capable of exploring word order phenomena as phrase structures. The following trees illustrate this point; they represent one way of exploring discontinuities using dependency structures, and they suggest the manner in which common discontinuities can be addressed. An example from German is used to illustrate a scrambling discontinuity:

The a-trees on the left show projectivity violations (= crossing lines), and the b-trees on the right demonstrate one means of addressing these violations. The displaced constituent takes on a word as its head that is not its governor. The words in red mark the catena (= chain) of words that extends from the root of the displaced constituent to the governor of that constituent. Discontinuities are then explored in terms of these catenae. The limitations on topicalization, ''wh''-fronting, scrambling, and extraposition can be explored and identified by examining the nature of the catenae involved.
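The "crossing lines" that mark projectivity violations can be detected mechanically. The sketch below (the encoding and the example analyses are hypothetical, not the article's German trees) implements the standard crossing-arcs test: two dependency arcs cross when exactly one endpoint of one arc lies strictly inside the span of the other. This is a common simplification; full definitions of projectivity add further conditions, e.g. concerning arcs over the root.

```python
# Detect projectivity violations (crossing arcs) in a dependency analysis.
# heads[i] is the 1-based head of word i+1; 0 marks the root.

def is_projective(heads):
    """Return True if no two dependency arcs cross."""
    arcs = [(h, d + 1) for d, h in enumerate(heads) if h != 0]
    for h1, d1 in arcs:
        lo1, hi1 = sorted((h1, d1))
        for h2, d2 in arcs:
            lo2, hi2 = sorted((h2, d2))
            # Arcs cross when exactly one endpoint of the second arc
            # falls strictly inside the span of the first.
            if lo1 < lo2 < hi1 < hi2:
                return False
    return True

# Four-word examples: nested arcs are projective,
# interleaved arcs (word 1 -> word 3, word 2 -> word 4) are not.
assert is_projective([2, 0, 4, 2])
assert not is_projective([3, 4, 0, 3])
```

A displaced constituent, such as a scrambled object in German, typically produces exactly the interleaved configuration the second example encodes, which is why its tree shows crossing lines.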


Syntactic functions

Traditionally, DGs have treated the syntactic functions (= grammatical functions, grammatical relations) as primitive. They posit an inventory of functions (e.g. subject, object, oblique, determiner, attribute, predicative, etc.). These functions can appear as labels on the dependencies in the tree structures. For discussion and examples of the labels for syntactic functions that are attached to dependency edges and arcs, see for instance Mel'čuk (1988:22, 69) and van Valin (2001:102ff.).

The syntactic functions in this tree are shown in green: ATTR (attribute), COMP-P (complement of preposition), COMP-TO (complement of ''to''), DET (determiner), P-ATTR (prepositional attribute), PRED (predicative), SUBJ (subject), TO-COMP (''to'' complement). The functions chosen and the abbreviations used in the tree here are merely representative of the general stance of DGs toward the syntactic functions; the actual inventory of functions and the designations employed vary from DG to DG. As a primitive of the theory, the status of these functions is very different from that in some phrase structure grammars. Traditionally, phrase structure grammars derive the syntactic functions from the constellation. For instance, the object is identified as the NP appearing inside the finite VP, and the subject as the NP appearing outside of the finite VP. Since DGs reject the existence of a finite VP constituent, they were never presented with the option to view the syntactic functions in this manner. The issue is a question of what comes first: traditionally, DGs take the syntactic functions to be primitive and then derive the constellation from these functions, whereas phrase structure grammars traditionally take the constellation to be primitive and then derive the syntactic functions from the constellation.

This question of what comes first (the functions or the constellation) is not an inflexible matter, however. The stances of both grammar types (dependency and phrase structure) are not narrowly limited to their traditional views; dependency and phrase structure are both fully compatible with both approaches to the syntactic functions. Indeed, monostratal systems, whether based solely on dependency or on phrase structure, will likely reject the notion that the functions are derived from the constellation or that the constellation is derived from the functions; they will take both to be primitive, which means neither can be derived from the other.


See also

* Catena
* Constituent
* Dependency relation (in mathematics)
* Discontinuity
* Finite verb
* Head-directionality parameter
* Igor Mel'čuk
* Parse tree
* Phrase structure grammar
* Predicate
* Recursive categorical syntax
* Tree (data structure)
* Universal Dependencies
* Verb phrase



References

* Coseriu, E. 1980. Un précurseur méconnu de la syntaxe structurale: H. Tiktin. In ''Recherches de Linguistique: Hommage à Maurice Leroy''. Éditions de l'Université de Bruxelles, 48–62.
* Engel, U. 1994. ''Syntax der deutschen Sprache'', 3rd edition. Berlin: Erich Schmidt Verlag.
* Groß, T. 2011. Clitics in dependency morphology. Depling 2011 Proceedings, 58–68.
* Heringer, H. 1996. ''Deutsche Syntax dependentiell''. Tübingen: Stauffenburg.
* Hays, D. 1960. Grouping and dependency theories. P-1910, RAND Corporation.
* Hays, D. 1964. Dependency theory: A formalism and some observations. ''Language'' 40, 511–525. Reprinted in ''Syntactic Theory 1, Structuralist'', edited by Fred W. Householder. Penguin, 1972.
* Hudson, R. 1990. ''English Word Grammar''. Oxford: Basil Blackwell.
* Hudson, R. 2007. ''Language Networks: The New Word Grammar''. Oxford University Press.
* Imrényi, A. 2013. Constituency or dependency? Notes on Sámuel Brassai's syntactic model of Hungarian. In Szigetvári, Péter (ed.), ''VLlxx. Papers Presented to László Varga on his 70th Birthday''. Budapest: Tinta, 167–182.
* Kern, F. 1883. ''Zur Methodik des deutschen Unterrichts''. Berlin: Nicolaische Verlags-Buchhandlung.
* Kern, F. 1884. ''Grundriss der Deutschen Satzlehre''. Berlin: Nicolaische Verlags-Buchhandlung.
* Liu, H. 2009. ''Dependency Grammar: From Theory to Practice''. Beijing: Science Press.
* Lobin, H. 2003. ''Koordinationssyntax als prozedurales Phänomen''. Tübingen: Gunter Narr-Verlag.
* Mel'čuk, I. 2003. Levels of dependency in linguistic description: Concepts and problems. In Ágel et al., 170–187.
* Miller, J. 2011. ''A Critical Introduction to Syntax''. London: Continuum.
* Nichols, J. 1986. Head-marking and dependent-marking languages. ''Language'' 62, 56–119.
* Ninio, A. 2006. ''Language and the Learning Curve: A New Theory of Syntactic Development''. Oxford: Oxford University Press.
* Osborne, T. 2019. ''A Dependency Grammar of English: An Introduction and Beyond''. Amsterdam: John Benjamins. https://doi.org/10.1075/z.224
* Osborne, T., M. Putnam, and T. Groß 2011. Bare phrase structure, label-less trees, and specifier-less syntax: Is Minimalism becoming a dependency grammar? ''The Linguistic Review'' 28, 315–364.
* Osborne, T., M. Putnam, and T. Groß 2012. Catenae: Introducing a novel unit of syntactic analysis. ''Syntax'' 15, 4, 354–396.
* Owens, J. 1984. On getting a head: A problem in dependency grammar. ''Lingua'' 62, 25–42.
* Percival, K. 1976. On the historical source of immediate-constituent analysis. In James McCawley (ed.), ''Notes from the Linguistic Underground'', Syntax and Semantics 7, 229–242. New York: Academic Press.
* Percival, K. 1990. Reflections on the history of dependency notions in linguistics. ''Historiographia Linguistica'' 17, 29–47.
* Robinson, J. 1970. Dependency structures and transformational rules. ''Language'' 46, 259–285.
* Schubert, K. 1988. ''Metataxis: Contrastive Dependency Syntax for Machine Translation''. Dordrecht: Foris.
* Sgall, P., E. Hajičová, and J. Panevová 1986. ''The Meaning of the Sentence in Its Semantic and Pragmatic Aspects''. Dordrecht: D. Reidel Publishing Company.
* Starosta, S. 1988. ''The Case for Lexicase''. London: Pinter Publishers.
* Tesnière, L. 1959. ''Éléments de syntaxe structurale''. Paris: Klincksieck.
* Tesnière, L. 1966. ''Éléments de syntaxe structurale'', 2nd edition. Paris: Klincksieck.
* Tesnière, L. 2015. ''Elements of Structural Syntax'' (English translation of Tesnière 1966). Amsterdam: John Benjamins.
* van Valin, R. 2001. ''An Introduction to Syntax''. Cambridge, UK: Cambridge University Press.


External links


* Universal Dependencies – a set of treebanks in a harmonized dependency grammar representation