Growing Context-sensitive Grammar
In formal language theory, a growing context-sensitive grammar is a context-sensitive grammar in which every production rule increases the length of the string being generated. These grammars are thus noncontracting and context-sensitive. A growing context-sensitive language is a context-sensitive language generated by such a grammar. In these grammars the start symbol S does not appear on the right-hand side of any production rule, and the length of the right-hand side of each production exceeds the length of the left-hand side, unless the left-hand side is S. These grammars were introduced by Dahlhaus and Warmuth. They were later shown to be equivalent to the acyclic context-sensitive grammars. Membership in any growing context-sensitive language is decidable in polynomial time; however, the ''uniform'' problem of deciding whether a given string belongs to the language generated by a given growing or acyclic context-sensitive grammar is NP-complete.
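To make the length condition concrete, here is a minimal sketch in Python that checks it for a grammar given as (left-hand side, right-hand side) pairs of symbol tuples; the representation, the function name is_growing, and the toy grammar are illustrative assumptions, not standard notation.

```python
# Minimal sketch: check the "growing" condition for a grammar given as
# (lhs, rhs) pairs of symbol tuples. The representation and the example
# grammar below are illustrative assumptions.

def is_growing(productions, start="S"):
    """True if the start symbol never occurs on a right-hand side and
    every rule strictly grows the string, except possibly rules whose
    left-hand side is the start symbol."""
    for lhs, rhs in productions:
        if start in rhs:                          # S must not appear on any RHS
            return False
        if lhs != (start,) and len(rhs) <= len(lhs):
            return False                          # every non-start rule must grow
    return True

# A toy grammar satisfying the condition (the language it generates is
# incidental; only the rule lengths matter here):
G = [
    (("S",), ("A", "B")),           # start rule: exempt from the growth requirement
    (("A",), ("a", "A", "a")),      # grows: length 1 -> 3
    (("A",), ("a", "a")),           # grows: length 1 -> 2
    (("A", "B"), ("a", "b", "b")),  # grows: length 2 -> 3
]
assert is_growing(G)
assert not is_growing(G + [(("B",), ("b",))])  # length-preserving rule breaks it
```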
Formal Language Theory
In logic, mathematics, computer science, and linguistics, a formal language consists of words whose letters are taken from an alphabet and are well-formed according to a specific set of rules. The alphabet of a formal language consists of symbols, letters, or tokens that concatenate into strings of the language. Each string concatenated from symbols of this alphabet is called a word, and the words that belong to a particular formal language are sometimes called ''well-formed words'' or ''well-formed formulas''. A formal language is often defined by means of a formal grammar such as a regular grammar or context-free grammar, which consists of its formation rules. In computer science, formal languages are used, among other things, as the basis for defining the grammar of programming languages and formalized versions of subsets of natural languages in which the words of the language represent concepts that are associated with particular meanings or semantics. In computational complexity theory, decision problems are typically defined as formal languages.
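As a small illustration of these definitions, the following Python sketch fixes an alphabet, enumerates the strings over it, and keeps the well-formed words of one particular formal language; the alphabet {a, b} and the membership rule are arbitrary illustrative choices.

```python
from itertools import product

# Minimal sketch: an alphabet, the strings over it, and a formal language
# given by a membership rule. The particular rule is an illustrative choice.

ALPHABET = ("a", "b")

def in_language(word):
    """Membership rule for the language { a^m b^n : m, n >= 0 }:
    no 'a' may appear after a 'b'."""
    return "ba" not in word

def words_up_to(n):
    """All strings over the alphabet of length <= n (the candidate words)."""
    for length in range(n + 1):
        for letters in product(ALPHABET, repeat=length):
            yield "".join(letters)

well_formed = [w for w in words_up_to(3) if in_language(w)]
print(well_formed)
# ['', 'a', 'b', 'aa', 'ab', 'bb', 'aaa', 'aab', 'abb', 'bbb']
```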
Context-sensitive Grammar
A context-sensitive grammar (CSG) is a formal grammar in which the left-hand sides and right-hand sides of any production rules may be surrounded by a context of terminal and nonterminal symbols. Context-sensitive grammars are more general than context-free grammars, in the sense that there are languages that can be described by a CSG but not by any context-free grammar. Context-sensitive grammars are less general (in the same sense) than unrestricted grammars. Thus, CSGs are positioned between context-free and unrestricted grammars in the Chomsky hierarchy. A formal language that can be described by a context-sensitive grammar, or, equivalently, by a noncontracting grammar or a linear bounded automaton, is called a context-sensitive language. Some textbooks actually define CSGs as noncontracting, although this is not how Noam Chomsky defined them in 1959. This choice of definition makes no difference in terms of the languages generated (i.e., the two definitions are weakly equivalent).
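The classic noncontracting grammar for { a^n b^n c^n : n ≥ 1 } (equivalent in power to a CSG, as noted above) makes the rewriting idea concrete. The following Python sketch carries out one derivation with it; the derive helper and the chosen rule positions are illustrative.

```python
# Minimal sketch: a derivation with the standard noncontracting grammar for
# { a^n b^n c^n : n >= 1 }. Uppercase letters are nonterminals, lowercase
# are terminals; the derive() helper is an illustrative implementation.

RULES = [
    ("S", "aSBC"),   # 0
    ("S", "aBC"),    # 1
    ("CB", "BC"),    # 2: reorder nonterminals (noncontracting form)
    ("aB", "ab"),    # 3
    ("bB", "bb"),    # 4
    ("bC", "bc"),    # 5
    ("cC", "cc"),    # 6
]

def derive(string, rule, pos):
    """Apply RULES[rule] at position pos, checking the left side matches."""
    lhs, rhs = RULES[rule]
    assert string[pos:pos + len(lhs)] == lhs, "rule does not apply here"
    return string[:pos] + rhs + string[pos + len(lhs):]

# Derivation of aabbcc: S => aSBC => aaBCBC => aaBBCC => ... => aabbcc
s = "S"
for rule, pos in [(0, 0), (1, 1), (2, 3), (3, 1), (4, 2), (5, 3), (6, 4)]:
    s = derive(s, rule, pos)
    print(s)
assert s == "aabbcc"
```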
Noncontracting Grammar
In formal language theory, a grammar is noncontracting (or monotonic) if all of its production rules are of the form α → β, where α and β are strings of nonterminal and terminal symbols and the length of α is less than or equal to that of β, |α| ≤ |β|; that is, β is not shorter than α. A grammar is essentially noncontracting if there may be one exception, namely a rule ''S'' → ε, where ''S'' is the start symbol and ε the empty string, and furthermore ''S'' never occurs in the right-hand side of any rule. None of the rules of a noncontracting grammar decreases the length of the string that is being rewritten. If each rule properly increases the length, the grammar is called a growing context-sensitive grammar.
History
Chomsky (1963) called a noncontracting grammar a type 1 grammar; in the same work, he called a context-sensitive grammar a "type 2 grammar", and he proved that these two are weakly equivalent.
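The following Python sketch distinguishes the length conditions just described, for a grammar given as (left side, right side) string pairs; the classify helper, its labels, and the example grammars are illustrative assumptions.

```python
# Minimal sketch: classify a grammar's rules by the length conditions above.
# Rules are (lhs, rhs) string pairs; names and labels are illustrative.

def classify(rules, start="S"):
    """Return 'growing', 'noncontracting', 'essentially noncontracting',
    or 'contracting' for the given production rules."""
    s_on_rhs = any(start in rhs for _, rhs in rules)
    if not s_on_rhs and all(lhs == start or len(rhs) > len(lhs)
                            for lhs, rhs in rules):
        return "growing"                       # every non-start rule grows
    if all(len(lhs) <= len(rhs) for lhs, rhs in rules):
        return "noncontracting"                # no rule shrinks the string
    if not s_on_rhs and all(len(lhs) <= len(rhs) or (lhs, rhs) == (start, "")
                            for lhs, rhs in rules):
        return "essentially noncontracting"    # only exception is S -> empty
    return "contracting"

print(classify([("S", "aSBC"), ("S", "aBC"), ("CB", "BC")]))  # noncontracting
print(classify([("S", "AB"), ("A", "aAa"), ("AB", "abb")]))   # growing
print(classify([("S", ""), ("S", "ab"), ("ab", "aabb")]))     # essentially noncontracting
```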
Context-sensitive Language
In formal language theory, a context-sensitive language is a language that can be defined by a context-sensitive grammar (and equivalently by a noncontracting grammar). Context-sensitive grammars are one of the four types of grammars in the Chomsky hierarchy.
Computational properties
Computationally, a context-sensitive language is equivalent to a linear bounded nondeterministic Turing machine, also called a linear bounded automaton. That is, it corresponds to a non-deterministic Turing machine with a tape of only kn cells, where n is the size of the input and k is a constant associated with the machine. This means that every formal language that can be decided by such a machine is a context-sensitive language, and every context-sensitive language can be decided by such a machine. This set of languages is also known as NLINSPACE or NSPACE(''O''(''n'')), because these languages can be accepted using linear space on a non-deterministic Turing machine. The class LINSPACE (or DSPACE(''O''(''n''))) is defined the same way, except using a deterministic Turing machine.
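For a concrete instance, the canonical context-sensitive language { a^n b^n c^n : n ≥ 1 } can be decided well within the linear space bound. The following Python sketch is an illustrative decider (the function name is an arbitrary choice); it uses space proportional to the input rather than literally simulating tape cells.

```python
# Minimal sketch: deciding the canonical context-sensitive language
# { a^n b^n c^n : n >= 1 }. A linear bounded automaton may only use the
# kn tape cells holding the input; this check builds one comparison
# string of the same length, comfortably inside that bound.

def in_anbncn(word):
    n = len(word) // 3
    return n >= 1 and word == "a" * n + "b" * n + "c" * n

assert in_anbncn("aabbcc")
assert not in_anbncn("aabbbcc")
assert not in_anbncn("")
```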
Acyclic Context-sensitive Grammar
Acyclic may refer to:
* In chemistry, a compound which is an open-chain compound, e.g. alkanes and acyclic aliphatic compounds
* In mathematics:
** A graph without a cycle, especially a directed acyclic graph
** An acyclic complex, a chain complex all of whose homology groups are zero
* In economics, an economic indicator with little or no correlation to the business cycle
Polynomial Time
In computer science, the time complexity is the computational complexity that describes the amount of computer time it takes to run an algorithm. Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, supposing that each elementary operation takes a fixed amount of time to perform. Thus, the amount of time taken and the number of elementary operations performed by the algorithm are taken to be related by a constant factor. Since an algorithm's running time may vary among different inputs of the same size, one commonly considers the worst-case time complexity, which is the maximum amount of time required for inputs of a given size. Less common, and usually specified explicitly, is the average-case complexity, which is the average of the time taken on inputs of a given size (this makes sense because there are only a finite number of possible inputs of a given size). In both cases, the time complexity is generally expressed as a function of the size of the input.
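The following Python sketch illustrates the counting viewpoint for linear search, taking one comparison as the elementary operation; the instrumented function and the tally over all inputs of one size are illustrative choices.

```python
# Minimal sketch: count elementary operations (here, comparisons) for
# linear search, and contrast worst-case with average-case cost.

def linear_search(items, target):
    """Return (index_or_None, number_of_comparisons_performed)."""
    for i, x in enumerate(items):
        if x == target:
            return i, i + 1
    return None, len(items)

items = list(range(8))
# One input per possible target position, all of the same size n = 8:
counts = [linear_search(items, t)[1] for t in items]
print(max(counts))                # worst case: n = 8 comparisons
print(sum(counts) / len(counts))  # average case: (n + 1) / 2 = 4.5 comparisons
```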
Complexity Of Constraint Satisfaction
The complexity of constraint satisfaction is the application of computational complexity theory to constraint satisfaction. It has mainly been studied for discriminating between tractable and intractable classes of constraint satisfaction problems on finite domains. Solving a constraint satisfaction problem on a finite domain is an NP-complete problem in general. Research has shown a number of polynomial-time subcases, mostly obtained by restricting either the allowed domains or constraints or the way constraints can be placed over the variables. Research has also established a relationship between the constraint satisfaction problem and problems in other areas such as finite model theory and databases.
Overview
Establishing whether a constraint satisfaction problem on a finite domain has solutions is an NP-complete problem in general. This is an easy consequence of a number of other NP-complete problems being expressible as constraint satisfaction problems.
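The following Python sketch shows why the general finite-domain problem sits in NP: checking a proposed assignment is fast, while the obvious search tries every assignment, of which there are exponentially many. The representation of constraints as (variables, allowed tuples) and the triangle 2-coloring example are illustrative choices, not a standard solver interface.

```python
from itertools import product

# Minimal sketch: a brute-force finite-domain constraint satisfaction
# solver. Each constraint is (variables, set_of_allowed_value_tuples).

def satisfies(assignment, constraints):
    """Fast check of a complete assignment against every constraint."""
    return all(tuple(assignment[v] for v in vs) in allowed
               for vs, allowed in constraints)

def brute_force(variables, domain, constraints):
    """Try all |domain|**|variables| assignments (exponential in general)."""
    for values in product(domain, repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if satisfies(assignment, constraints):
            return assignment
    return None

# Graph 2-coloring of a triangle as a CSP: unsatisfiable, so this prints None.
neq = {(0, 1), (1, 0)}
print(brute_force("xyz", (0, 1),
                  [(("x", "y"), neq), (("y", "z"), neq), (("x", "z"), neq)]))
```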
NP-complete
In computational complexity theory, a problem is NP-complete when:
# it is a problem for which the correctness of each solution can be verified quickly (namely, in polynomial time) and a brute-force search algorithm can find a solution by trying all possible solutions;
# the problem can be used to simulate every other problem for which we can verify quickly that a solution is correct.
In this sense, NP-complete problems are the hardest of the problems to which solutions can be verified quickly. If we could find solutions of some NP-complete problem quickly, we could quickly find the solutions of every other problem whose solutions can be quickly verified. The name "NP-complete" is short for "nondeterministic polynomial-time complete". In this name, "nondeterministic" refers to nondeterministic Turing machines, a way of mathematically formalizing the idea of a brute-force search algorithm. Polynomial time refers to an amount of time that is considered "quick" for a deterministic algorithm to check a single solution, or for a nondeterministic Turing machine to perform the whole search.
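The following Python sketch illustrates the two ingredients of the definition on SUBSET-SUM, a classic NP-complete problem: a polynomial-time verifier for a proposed solution, and a brute-force search over all possible solutions. The function names and the example instance are illustrative.

```python
from itertools import combinations

# Minimal sketch: the two ingredients of NP-completeness, shown on
# SUBSET-SUM (given numbers and a target, find a subset summing to it).

def verify(numbers, target, subset):
    """Polynomial-time check that a proposed certificate is correct."""
    return all(x in numbers for x in subset) and sum(subset) == target

def brute_force(numbers, target):
    """Try all 2**n subsets; exponential in the number of items."""
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return subset
    return None

cert = brute_force([3, 34, 4, 12, 5, 2], 9)
print(cert)                                   # (4, 5)
print(verify([3, 34, 4, 12, 5, 2], 9, cert))  # True: quick to re-check
```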