Head-driven phrase structure grammar (HPSG) is a highly lexicalized, constraint-based grammar developed by Carl Pollard and Ivan Sag. It is a type of phrase structure grammar, as opposed to a dependency grammar, and it is the immediate successor to generalized phrase structure grammar. HPSG draws from other fields such as computer science (data type theory and knowledge representation) and uses Ferdinand de Saussure's notion of the sign. It uses a uniform formalism and is organized in a modular way, which makes it attractive for natural language processing.

An HPSG grammar includes principles, grammar rules, and lexicon entries, the last of which are normally not considered to belong to a grammar. The formalism is based on lexicalism: the lexicon is more than just a list of entries; it is in itself richly structured. Individual entries are marked with types, and types form a hierarchy. Early versions of the grammar were very lexicalized, with few grammatical rules (schemata). More recent research has tended to add more and richer rules, becoming more like construction grammar.

The basic type HPSG deals with is the sign. Words and phrases are two different subtypes of sign. A word has two features: PHON (the sound, the phonetic form) and SYNSEM (the syntactic and semantic information), both of which are split into subfeatures. Signs and rules are formalized as typed feature structures.
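
To make the type hierarchy and the PHON/SYNSEM split concrete, the following is a minimal sketch in Python; it is not drawn from any actual HPSG implementation, and the class and field names are purely illustrative.

    from dataclasses import dataclass, field

    @dataclass
    class Synsem:
        """Syntactic and semantic information; in full HPSG this splits into
        many subfeatures (CAT, CONTENT, ...)."""
        category: str                                  # e.g. "verb" or "noun"
        content: dict = field(default_factory=dict)    # simplified semantic content

    @dataclass
    class Sign:
        """The basic type of the theory."""
        phon: list                                     # PHON: the phonological form
        synsem: Synsem                                 # SYNSEM: syntax and semantics

    @dataclass
    class Word(Sign):                                  # words and phrases are two
        pass                                           # different subtypes of sign

    @dataclass
    class Phrase(Sign):
        daughters: list = field(default_factory=list)  # the signs a phrase combines

    walks = Word(phon=["walks"],
                 synsem=Synsem(category="verb", content={"RELATION": "walk"}))

In an actual HPSG grammar these objects would be typed feature structures rather than Python classes, and PHON and SYNSEM would each decompose into further subfeatures.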


Sample grammar

HPSG generates strings by combining signs, which are defined by their location within a type hierarchy and by their internal feature structure, represented by attribute value matrices (AVMs) (Pollard & Sag 1994). Features take types or lists of types as their values, and these values may in turn have their own feature structure. Grammatical rules are largely expressed through the constraints signs place on one another. A sign's feature structure describes its phonological, syntactic, and semantic properties. In common notation, AVMs are written with features in upper case and types in italicized lower case. Numbered indices in an AVM represent token-identical values.

In a simplified AVM for the word "walks" (the verb, not the noun as in "nice walks for the weekend"), the verb's categorical information (CAT) is divided into features that describe it (HEAD) and features that describe its arguments (VALENCE). "Walks" is a sign of type ''word'' with a head of type ''verb''. As an intransitive verb, "walks" has no complement but requires a subject that is a third person singular noun. The semantic value of the subject (CONTENT) is co-indexed with the verb's only argument (the individual doing the walking). An AVM for "she" represents a sign with a SYNSEM value that could fulfill those requirements.

Signs of type ''phrase'' unify with one or more children and propagate information upward. The immediate dominance rule for a ''head-subj-phrase'', for example, requires two children: the head child (a verb) and a non-head child that fulfills the verb's SUBJ constraints. The end result is a sign with a verb head, empty subcategorization features, and a phonological value that orders the two children. Although the actual grammar of HPSG is composed entirely of feature structures, linguists often use trees to represent the unification of signs where the equivalent AVM would be unwieldy.
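
As an illustration of the combination just described, here is a toy sketch in Python: plain dictionaries stand in for typed feature structures, the feature geometry is drastically simplified, and the names are illustrative rather than taken from any HPSG system.

    def unify(a, b):
        """Recursively merge two feature structures; atomic values must match."""
        if isinstance(a, dict) and isinstance(b, dict):
            merged = dict(a)
            for feat, val in b.items():
                merged[feat] = unify(a[feat], val) if feat in a else val
            return merged
        if a == b:
            return a
        raise ValueError(f"unification failure: {a!r} vs {b!r}")

    # Drastically simplified signs for "walks" and "she".
    walks = {"PHON": ["walks"],
             "HEAD": {"POS": "verb"},
             "SUBJ": [{"HEAD": {"POS": "noun"},                # requires a third person
                       "AGR": {"PER": "3rd", "NUM": "sg"}}],   # singular noun subject
             "COMPS": []}                                      # intransitive: no complements
    she = {"PHON": ["she"],
           "HEAD": {"POS": "noun"},
           "AGR": {"PER": "3rd", "NUM": "sg"}}

    def head_subj_phrase(head_dtr, subj_dtr):
        """Head-subject schema: the subject daughter must unify with the head
        daughter's SUBJ requirement; the mother keeps the head daughter's HEAD
        value and has its SUBJ requirement discharged."""
        unify(head_dtr["SUBJ"][0], subj_dtr)      # raises on a clash, e.g. for *"them walks"
        return {"PHON": subj_dtr["PHON"] + head_dtr["PHON"],
                "HEAD": head_dtr["HEAD"],
                "SUBJ": [],
                "COMPS": []}

    print(head_subj_phrase(walks, she))   # {'PHON': ['she', 'walks'], 'HEAD': {'POS': 'verb'}, ...}

A real HPSG grammar would use structure sharing (the numbered indices mentioned above) rather than copying values, and would build the semantics of the phrase from the same combined structure.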


Implementations

Various parsers based on the HPSG formalism have been written, and optimizations are currently being investigated. An example of a system analyzing German sentences is provided by the Freie Universität Berlin. In addition, the CoreGram project of the Grammar Group of the Freie Universität Berlin provides open-source grammars implemented in the TRALE system. Currently there are grammars for German, Danish, Mandarin Chinese, Maltese, and Persian that share a common core and are publicly available.

Large HPSG grammars of various languages are being developed in the Deep Linguistic Processing with HPSG Initiative (DELPH-IN). Wide-coverage grammars of English, German, and Japanese are available under an open-source license. These grammars can be used with a variety of inter-compatible open-source HPSG parsers: LKB, PET, ACE, and ''agree''. All of these produce semantic representations in the format of Minimal Recursion Semantics (MRS). The declarative nature of the HPSG formalism means that these computational grammars can typically be used for both parsing and generation (producing surface strings from semantic inputs). Treebanks, also distributed by DELPH-IN, are used to develop and test the grammars, as well as to train ranking models that decide on plausible interpretations when parsing (or realizations when generating).

''Enju'' is a freely available wide-coverage probabilistic HPSG parser for English developed by the Tsujii Laboratory at the University of Tokyo in Japan.
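
For a sense of how such a grammar is driven in practice, here is a hedged sketch that assumes the third-party pyDelphin library and the ACE processor are installed, together with a compiled DELPH-IN grammar image at the hypothetical path "erg.dat" (for the English Resource Grammar); exact result fields can vary between versions, so this is illustrative rather than a definitive API reference.

    from delphin import ace   # pyDelphin's interface to the ACE processor

    # Parse one sentence with a compiled grammar image (the path is an assumption).
    response = ace.parse('erg.dat', 'The dog sleeps.')
    for result in response.results():
        print(result['mrs'])  # each analysis carries a Minimal Recursion Semantics reading

Because the grammars are declarative, the same grammar image can in principle also be run in generation mode to produce surface strings from MRS inputs, matching the parse/generate symmetry noted above.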


See also

* Lexical-functional grammar
* Minimal recursion semantics
* Relational grammar
* Situation semantics
* Syntax
* Transformational grammar
* Type Description Language


References

* Pollard, Carl; Ivan A. Sag (1994): ''Head-Driven Phrase Structure Grammar''. Chicago: University of Chicago Press.
* Tsujii Lab: Enju parser home page (retrieved Nov 24, 2009).

Further reading

* Carl Pollard, Ivan A. Sag (1987): ''Information-based Syntax and Semantics. Volume 1: Fundamentals''. Stanford: CSLI Publications.
* Carl Pollard, Ivan A. Sag (1994): ''Head-Driven Phrase Structure Grammar''. Chicago: University of Chicago Press.
* Ivan A. Sag, Thomas Wasow, Emily M. Bender (2003): ''Syntactic Theory: A Formal Introduction''. Second edition. Chicago: University of Chicago Press.


External links


* Stanford HPSG homepage – includes on-line proceedings of an annual HPSG conference
* Ohio State HPSG homepage
* International Conference on Head-Driven Phrase Structure Grammar
* DELPH-IN network for HPSG grammar development
* Bibliography of HPSG Publications: http://hpsg.fu-berlin.de/HPSG-Bib/
* LaTeX package for drawing AVMs – includes documentation