Embedded Pushdown Automaton
An embedded pushdown automaton or EPDA is a computational model for parsing languages generated by tree-adjoining grammars (TAGs). It is similar to the context-free grammar-parsing pushdown automaton, but instead of using a plain stack to store symbols, it has a stack of iterated stacks that store symbols, giving TAGs a generative capacity between context-free and context-sensitive grammars, or a subset of mildly context-sensitive grammars. Embedded pushdown automata should not be confused with nested stack automata, which have more computational power.

History and applications
EPDAs were first described by K. Vijay-Shanker in his 1988 doctoral thesis. They have since been applied to more complete descriptions of classes of mildly context-sensitive grammars and have had important roles in refining the Chomsky hierarchy. Various subgrammars, such as the linear indexed grammar, can thus be defined. While natural languages have traditionally been analyzed using ...
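The following is a minimal sketch, in Python, of the kind of storage an EPDA works with: a stack whose elements are themselves stacks. The class and operation names are assumptions made for exposition, not Vijay-Shanker's formal definition, which also allows whole stacks to be pushed above or below the current one.

# Illustrative sketch of EPDA-style storage: a stack of stacks.
# These operation names are assumptions for exposition, not a formal EPDA.

class StackOfStacks:
    def __init__(self):
        self.stacks = []                  # outer stack; the last entry is the top stack

    def push_stack(self, symbols=()):
        """Push a fresh inner stack, optionally pre-filled, on top."""
        self.stacks.append(list(symbols))

    def push_symbol(self, sym):
        """Push a symbol onto the topmost inner stack."""
        self.stacks[-1].append(sym)

    def pop_symbol(self):
        """Pop a symbol from the topmost inner stack; drop that stack once empty."""
        sym = self.stacks[-1].pop()
        if not self.stacks[-1]:
            self.stacks.pop()
        return sym

s = StackOfStacks()
s.push_stack(["A"])
s.push_stack(["B", "C"])
print(s.pop_symbol())   # C
print(s.pop_symbol())   # B  (the emptied top stack is removed as well)
print(s.pop_symbol())   # A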


Computational Model
A computational model uses computer programs to simulate and study complex systems using an algorithmic or mechanistic approach, and is widely used in a diverse range of fields spanning from physics, engineering, chemistry and biology to economics, psychology, cognitive science and computer science. The system under study is often a complex nonlinear system for which simple, intuitive analytical solutions are not readily available. Rather than deriving a mathematical analytical solution to the problem, experimentation with the model is done by adjusting the parameters of the system in the computer and studying the differences in the outcomes of the experiments. Theories of the model's operation can then be derived from these computational experiments. Examples of common computational models are weather forecasting models, earth simulator models, flight simulator models, molecular protein folding models, Computational Engineering Models (CEM), and neural network models. Se ...
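As a hedged illustration of "experimenting with the model by adjusting parameters", the sketch below sweeps one parameter of a small nonlinear model (the logistic map, chosen here only as a stand-in for a complex system) and records how the outcome changes.

# Hypothetical sketch: computational experimentation by sweeping a parameter.
# The logistic map x_{n+1} = r * x_n * (1 - x_n) stands in for a complex model.

def run_model(r, x0=0.5, steps=200):
    """Iterate the model and return the final state for parameter r."""
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

# The "experiment": vary the parameter r and compare the outcomes.
for r in (2.5, 3.2, 3.9):
    print(f"r = {r}: final state = {run_model(r):.4f}")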


Tree-adjoining Grammar
Tree-adjoining grammar (TAG) is a grammar formalism defined by Aravind Joshi. Tree-adjoining grammars are somewhat similar to context-free grammars, but the elementary unit of rewriting is the tree rather than the symbol. Whereas context-free grammars have rules for rewriting symbols as strings of other symbols, tree-adjoining grammars have rules for rewriting the nodes of trees as other trees (see tree (graph theory) and tree (data structure)).

History
TAG originated in investigations by Joshi and his students into the family of adjunction grammars (AG), the "string grammar" of Zellig Harris. AGs handle exocentric properties of language in a natural and effective way, but do not have a good characterization of endocentric constructions; the converse is true of rewrite grammars, or phrase-structure grammars (PSG). In 1969, Joshi introduced a family of grammars that exploits this complementarity by mixing the two types of rules. A few very simple rewrite rules suffice to ge ...
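A hypothetical sketch of "rewriting nodes of trees as other trees": a toy substitution operation that splices an elementary tree in at a matching leaf. Adjunction, TAG's characteristic second operation, is omitted here for brevity, and all names are our own.

# Sketch of TAG-style tree rewriting (substitution only); names are illustrative.

class Node:
    def __init__(self, label, children=None):
        self.label = label
        self.children = children if children is not None else []

    def __repr__(self):
        if not self.children:
            return self.label
        return f"{self.label}({', '.join(map(repr, self.children))})"

def substitute(tree, target_label, elementary):
    """Replace the first leaf labelled target_label with the elementary tree."""
    for i, child in enumerate(tree.children):
        if not child.children and child.label == target_label:
            tree.children[i] = elementary
            return True
        if substitute(child, target_label, elementary):
            return True
    return False

# Elementary trees: S(NP, VP(V(sleeps))) and NP(N(Maria))
s_tree = Node("S", [Node("NP"), Node("VP", [Node("V", [Node("sleeps")])])])
np_tree = Node("NP", [Node("N", [Node("Maria")])])
substitute(s_tree, "NP", np_tree)
print(s_tree)   # S(NP(N(Maria)), VP(V(sleeps)))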



Context-free Grammar
In formal language theory, a context-free grammar (CFG) is a formal grammar whose production rules can be applied to a nonterminal symbol regardless of its context. In particular, in a context-free grammar, each production rule is of the form A → α, with A a ''single'' nonterminal symbol and α a string of terminals and/or nonterminals (α can be empty). Regardless of which symbols surround it, the single nonterminal A on the left-hand side can always be replaced by α on the right-hand side. This distinguishes it from a context-sensitive grammar, which can have production rules of the form αAβ → αγβ, with A a nonterminal symbol and α, β, and γ strings of terminal and/or nonterminal symbols. A formal grammar is essentially a set of production rules that describe all possible strings in a given formal language. Production rules are simple replacements. For example, the first rule in the picture, ...
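A small worked example, assumed rather than taken from the truncated text above: the CFG with rules S → aSb and S → ε generates the language a^n b^n, and a naive leftmost derivation can be simulated directly.

# Illustrative CFG: S -> a S b | ε   (generates a^n b^n).
# Uppercase letters are nonterminals, lowercase letters are terminals.
grammar = {"S": ["aSb", ""]}

def derive(rule_choices):
    """Rewrite the leftmost nonterminal according to the chosen rule at each step."""
    sentential = "S"
    print(sentential)
    for choice in rule_choices:
        for i, sym in enumerate(sentential):
            if sym.isupper():                              # leftmost nonterminal
                sentential = sentential[:i] + grammar[sym][choice] + sentential[i + 1:]
                break
        print(sentential)
    return sentential

derive([0, 0, 1])   # S -> aSb -> aaSbb -> aabb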


Pushdown Automaton
In the theory of computation, a branch of theoretical computer science, a pushdown automaton (PDA) is a type of automaton that employs a stack. Pushdown automata are used in theories about what can be computed by machines. They are more capable than finite-state machines but less capable than Turing machines (see below). Deterministic pushdown automata can recognize all deterministic context-free languages while nondeterministic ones can recognize all context-free languages, with the former often used in parser design. The term "pushdown" refers to the fact that the stack can be regarded as being "pushed down" like a tray dispenser at a cafeteria, since the operations never work on elements other than the top element. A stack automaton, by contrast, does allow access to and operations on deeper elements. Stack automata can recognize a strictly larger set of languages than pushdown automata. A nested stack automaton allows full access, and also allows stacked values to be ...
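A minimal sketch of a pushdown-style recognizer for the context-free language a^n b^n, written here as a direct simulation rather than a formal transition relation; it shows the stack doing the counting that a finite-state machine cannot.

# Sketch of a pushdown-style recognizer for { a^n b^n : n >= 0 }.
# A real PDA is given by states and a transition relation; this simulates its stack use.

def accepts(word):
    stack = []
    phase = "a"                      # read all a's first, then b's
    for ch in word:
        if ch == "a" and phase == "a":
            stack.append("A")        # push one marker per 'a'
        elif ch == "b" and stack:
            phase = "b"              # once a 'b' is seen, only b's may follow
            stack.pop()              # pop one marker per 'b'
        else:
            return False             # out-of-order symbol or unmatched 'b'
    return not stack                 # accept iff every 'a' was matched by a 'b'

for w in ["", "ab", "aabb", "aab", "ba"]:
    print(repr(w), accepts(w))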



Stack (data Structure)
In computer science, a stack is an abstract data type that serves as a collection of elements with two main operations: * Push, which adds an element to the collection, and * Pop, which removes the most recently added element. Additionally, a peek operation can, without modifying the stack, return the value of the last element added. The name ''stack'' is an analogy to a set of physical items stacked one atop another, such as a stack of plates. The order in which elements are added to or removed from a stack is described as last in, first out, referred to by the acronym LIFO. As with a stack of physical objects, this structure makes it easy to take an item off the top of the stack, but accessing a datum deeper in the stack may require removing multiple other items first. Considered as a sequential collection, a stack has one end, the ''top'', which is the only position at which the push and pop operations may occur, and is fixed at the other end, the ''bottom''. A s ...
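A short illustrative implementation of the push/pop/peek interface described above, using a Python list as the backing store with the end of the list as the top; the class name and layout are our own.

# LIFO stack with push, pop, and peek; the end of the backing list is the top.

class Stack:
    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)     # add on top of the stack

    def pop(self):
        return self._items.pop()     # remove and return the most recently added item

    def peek(self):
        return self._items[-1]       # look at the top item without removing it

    def __len__(self):
        return len(self._items)

s = Stack()
s.push("plate 1")
s.push("plate 2")
print(s.peek())   # plate 2
print(s.pop())    # plate 2  (last in, first out)
print(s.pop())    # plate 1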




Context-sensitive Grammar
A context-sensitive grammar (CSG) is a formal grammar in which the left-hand sides and right-hand sides of any production rules may be surrounded by a context of terminal and nonterminal symbols. Context-sensitive grammars are more general than context-free grammars, in the sense that there are languages that can be described by a CSG but not by a context-free grammar. Context-sensitive grammars are less general (in the same sense) than unrestricted grammars. Thus, CSGs are positioned between context-free and unrestricted grammars in the Chomsky hierarchy. A formal language that can be described by a context-sensitive grammar, or, equivalently, by a noncontracting grammar or a linear bounded automaton, is called a context-sensitive language. Some textbooks actually define CSGs as non-contracting, although this is not how Noam Chomsky defined them in 1959. This choice of definition makes no difference in terms of the languages generated ...
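A classic worked example, not part of the truncated entry above: a noncontracting grammar (equivalent in generative power to a CSG, as the entry notes) for the language a^n b^n c^n, which no context-free grammar can generate.

S → a S B C | a B C
C B → B C
a B → a b
b B → b b
b C → b c
c C → c c

A sample derivation: S ⇒ aSBC ⇒ aaBCBC ⇒ aaBBCC ⇒ aabBCC ⇒ aabbCC ⇒ aabbcC ⇒ aabbcc.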


Mildly Context-sensitive Grammar
In computational linguistics, the term mildly context-sensitive grammar formalisms refers to several grammar formalisms that have been developed in an effort to provide adequate descriptions of the syntactic structure of natural language. Every mildly context-sensitive grammar formalism defines a class of mildly context-sensitive grammars (the grammars that can be specified in the formalism), and therefore also a class of mildly context-sensitive languages (the formal languages generated by the grammars).

Background
By 1985, several researchers in descriptive and mathematical linguistics had provided evidence against the hypothesis that the syntactic structure of natural language can be adequately described by context-free grammars (Riny Huybregts, "The Weak Inadequacy of Context-Free Phrase Structure Grammars", in Ger de Haan, Mieke Trommelen, and Wim Zonneveld, editors, ''Van periferie naar kern'', pages 81–99, Foris, Dordrecht, The Netherlands, 1984; Stuart M. Shieber, "Evid ...



Nested Stack Automata
In automata theory, a nested stack automaton is a finite automaton that can make use of a stack containing data which can be additional stacks. Like a stack automaton, a nested stack automaton may step up or down in the stack, and read the current symbol; in addition, it may at any place create a new stack, operate on that one, eventually destroy it, and continue operating on the old stack. This way, stacks can be nested recursively to an arbitrary depth; however, the automaton always operates on the innermost stack only. A nested stack automaton is capable of recognizing an indexed language, and in fact the class of indexed languages is exactly the class of languages accepted by one-way nondeterministic nested stack automata ...
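A minimal sketch, under our own naming, of the nesting discipline just described: the automaton may open an embedded stack, must work only on the innermost one, and destroys it (once emptied) before resuming the enclosing stack.

# Illustrative bookkeeping for nested stacks; names and checks are assumptions.

class NestedStacks:
    def __init__(self):
        self.chain = [[]]                  # chain of stacks; the last one is innermost

    def create(self):
        self.chain.append([])              # open a fresh embedded stack

    def destroy(self):
        assert not self.chain[-1], "only an emptied inner stack may be destroyed"
        self.chain.pop()                   # resume the enclosing stack

    def push(self, sym):
        self.chain[-1].append(sym)         # all operations act on the innermost stack

    def pop(self):
        return self.chain[-1].pop()

ns = NestedStacks()
ns.push("x")
ns.create()            # nest a new stack and work inside it
ns.push("y")
print(ns.pop())        # y
ns.destroy()           # inner stack gone; back to the enclosing one
print(ns.pop())        # x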



University Of Pennsylvania
The University of Pennsylvania (Penn or UPenn) is a private Ivy League research university in Philadelphia, Pennsylvania, United States. One of nine colonial colleges, it was chartered in 1755 through the efforts of founder and first president Benjamin Franklin, who had advocated for an educational institution that trained leaders in academia, commerce, and public service. The university has four undergraduate schools and 12 graduate and professional schools. Schools enrolling undergraduates include the College of Arts and Sciences, the School of Engineering and Applied Science, the Wharton School, and the School of Nursing. Among its graduate schools are its law school, whose first professor, James Wilson, helped write the U.S. Cons ...


Chomsky Hierarchy
The Chomsky hierarchy, in the fields of formal language theory, computer science, and linguistics, is a containment hierarchy of classes of formal grammars. A formal grammar describes how to form strings from a formal language's alphabet that are valid according to the language's syntax. The linguist Noam Chomsky theorized that four different classes of formal grammars existed that could generate increasingly complex languages. Each class can also completely generate the languages of all the classes below it (the classes are set-inclusive).

History
The general idea of a hierarchy of grammars was first described by Noam Chomsky in "Three models for the description of language" during the formalization of transformational-generative grammar (TGG). Marcel-Paul Schützenberger also played a role in the development of the theory of formal languages; the paper "The algebraic theory of context free languages" describes the modern hierarchy, including context-free grammars. Independently, alongside linguis ...


Linear Indexed Grammar
Indexed grammars are a generalization of context-free grammars in that nonterminals are equipped with lists of ''flags'', or ''index symbols''. The language produced by an indexed grammar is called an indexed language.

Definition
Modern definition by Hopcroft and Ullman
In contemporary publications following Hopcroft and Ullman (1979), an indexed grammar is formally defined as a 5-tuple ''G'' = ⟨''N'', ''T'', ''F'', ''P'', ''S''⟩ where
* ''N'' is a set of variables or nonterminal symbols,
* ''T'' is a set ("alphabet") of terminal symbols,
* ''F'' is a set of so-called ''index symbols'', or ''indices'',
* ''S'' ∈ ''N'' is the ''start symbol'', and
* ''P'' is a finite set of ''productions''.
In productions as well as in derivations of indexed grammars, a string ("stack") ''σ'' ∈ ''F''* of index symbols is attached to every nonterminal symbol ''A'' ∈ ''N'', denoted by ''A''[''σ''], where "[" and "]" are metasymbols to indicate the stack. Terminal symbols may not be followed by inde ...
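A worked example, assumed for illustration and written in the A[σ] notation just defined: an indexed grammar (the general variant, not the linear one) for the language a^n b^n c^n, where f is the single index symbol and each nonterminal on a right-hand side receives a copy of the current index stack.

S[σ] → S[fσ]                   (push one f per extra round)
S[σ] → A[σ] B[σ] C[σ]          (hand the same stack to three branches)
A[fσ] → a A[σ]     B[fσ] → b B[σ]     C[fσ] → c C[σ]
A[] → ε            B[] → ε            C[] → ε

A sample derivation: S[] ⇒ S[f] ⇒ S[ff] ⇒ A[ff] B[ff] C[ff] ⇒ a A[f] B[ff] C[ff] ⇒ … ⇒ aabbcc.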




Transformational-generative Grammar
In linguistics, transformational grammar (TG) or transformational-generative grammar (TGG) was the earliest model of grammar proposed within the research tradition of generative grammar. Like current generative theories, it treated grammar as a system of formal rules that generate all and only the grammatical sentences of a given language. What was distinctive about transformational grammar was that it posited transformation rules that mapped a sentence's deep structure to its pronounced form. For example, in many variants of transformational grammar, the English active voice sentence "Emma saw Daisy" and its passive counterpart "Daisy was seen by Emma" share a common deep structure generated by phrase structure rules, differing only in that the latter's structure is modified by a passivization transformation rule.

Basic mechanisms
Transformational grammar was a species of generative grammar and shared many of its goals and postulations, including the notion of linguistics as a c ...