Log-space Computable Function
In computational complexity theory, a log-space computable function is a function f\colon \Sigma^\ast \rightarrow \Sigma^\ast that can be computed using only O(\log n) memory (this restriction does not apply to the size of the output). The computation is generally carried out by means of a log-space transducer.

Log-space reductions
The main use of log-space computable functions is in log-space reductions: a means of transforming an instance of one problem into an instance of another problem using only logarithmic space.

Examples of log-space computable functions
* The function that converts an input for a non-deterministic Turing machine deciding a language ''A'' in logarithmic space into an instance of ST-connectivity on the machine's configuration graph (Sipser 2006, International Second Edition, p. 328).

References
* Sipser, Michael (2006), ''Introduction to the Theory of Computation'', International Second Edition, Cengage Learning.
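To make the definition above concrete, here is a minimal Python sketch (illustrative only, not from the source) of a log-space computable function: f(w) = the length of w written in binary. The input is only ever read, the sole working storage is a counter whose value never exceeds n (so it fits in O(\log n) bits), and the output is written once at the end.

 # Illustrative sketch only: f(w) = |w| written in binary.
 def binary_length(w: str) -> str:
     count = 0                    # "work tape": a counter of O(log n) bits
     for _ in w:                  # single left-to-right pass over the read-only input
         count += 1
     return format(count, "b")    # written once to the "output tape"

 print(binary_length("aabbaabbaa"))   # -> "1010", since |w| = 10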
Computational Complexity Theory
In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage, and relating these classes to each other. A computational problem is a task solved by a computer; it is solvable by mechanical application of mathematical steps, such as an algorithm. A problem is regarded as inherently difficult if its solution requires significant resources, whatever the algorithm used. The theory formalizes this intuition by introducing mathematical models of computation to study these problems and quantifying their computational complexity, i.e., the amount of resources needed to solve them, such as time and storage. Other measures of complexity are also used, such as the amount of communication (used in communication complexity), the number of gates in a circuit (used in circuit complexity) and the number of processors (used in parallel computing).
Space Complexity
The space complexity of an algorithm or a computer program is the amount of memory space required to solve an instance of the computational problem as a function of characteristics of the input. It is the memory required by an algorithm until it executes completely. Similar to time complexity, space complexity is often expressed asymptotically in big O notation, such as O(n), O(n\log n), O(n^\alpha), O(2^n), etc., where n is a characteristic of the input influencing space complexity.

Space complexity classes
Analogously to the time complexity classes DTIME(f(n)) and NTIME(f(n)), the complexity classes DSPACE(f(n)) and NSPACE(f(n)) are the sets of languages that are decidable by deterministic (respectively, non-deterministic) Turing machines that use O(f(n)) space. The complexity classes PSPACE and NPSPACE allow f to be any polynomial, analogously to P and NP. That is,
:\mathsf{PSPACE} = \bigcup_{c \geq 1} \mathsf{DSPACE}(n^c)
and
:\mathsf{NPSPACE} = \bigcup_{c \geq 1} \mathsf{NSPACE}(n^c).
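As an illustrative aside (my own example, not from the source), the following Python sketch contrasts two solutions to the same problem with different space complexity: testing whether a string is a palindrome by building a reversed copy (Θ(n) extra space) versus keeping only two indices into the input (O(\log n) bits of extra space in the Turing-machine model).

 def is_palindrome_copy(s: str) -> bool:
     # Builds a reversed copy of the input: Theta(n) extra space.
     return s == s[::-1]

 def is_palindrome_two_pointers(s: str) -> bool:
     # Stores only two indices, each at most n: O(log n) bits of extra space.
     i, j = 0, len(s) - 1
     while i < j:
         if s[i] != s[j]:
             return False
         i, j = i + 1, j - 1
     return True

 print(is_palindrome_copy("abccba"), is_palindrome_two_pointers("abccba"))   # -> True True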
Log-space Transducer
In computational complexity theory, a log-space transducer (LST) is a type of Turing machine used for log-space reductions. A log-space transducer, M, has three tapes:
* A read-only ''input'' tape.
* A read/write ''work'' tape (bounded to at most O(\log n) symbols).
* A write-only, write-once ''output'' tape.
M is designed to compute a log-space computable function f\colon \Sigma^\ast \rightarrow \Sigma^\ast (where \Sigma is the alphabet of both the ''input'' and ''output'' tapes). If M is executed with w on its ''input'' tape, then when the machine halts, it will have f(w) on its ''output'' tape. A language A \subseteq \Sigma^\ast is said to be log-space reducible to a language B \subseteq \Sigma^\ast if there exists a log-space computable function f that converts an input w of problem A into an input f(w) of problem B in such a way that w \in A \iff f(w) \in B. This may seem a rather convoluted idea, but it has two useful properties that are desirable for a reduction.
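The following is a hypothetical Python model of the three-tape discipline described above (the class name, bound constant, and example function are my own, not from the source): the input is treated as read-only, the working storage is checked against a c·log₂ n bound, and the output is written once.

 import math

 class LogSpaceTransducer:
     """Toy model of a log-space transducer computing f(w) = number of 1s in w, in binary."""

     def __init__(self, c: int = 8):
         self.c = c   # constant in the O(log n) work-tape bound (illustrative)

     def run(self, w: str) -> str:
         n = max(len(w), 2)
         limit = int(self.c * math.log2(n))   # allowed work-tape cells
         count = 0                            # the entire work-tape content
         for symbol in w:                     # head scans the read-only input tape
             if symbol == "1":
                 count += 1
             assert len(format(count, "b")) <= limit, "work tape bound exceeded"
         return format(count, "b")            # written once to the output tape

 print(LogSpaceTransducer().run("10110"))     # -> "11", since w contains three 1s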
Log-space Reduction
In computational complexity theory, a log-space reduction is a reduction computable by a deterministic Turing machine using logarithmic space. Conceptually, this means it can keep a constant number of pointers into the input, along with a logarithmic number of fixed-size integers (Arora & Barak 2009, p. 88). It is possible that such a machine does not have space to write down its own output, so the only requirement is that any given bit of the output be computable in log-space. Formally, this reduction is executed via a log-space transducer. Such a machine has polynomially many configurations, so log-space reductions are also polynomial-time reductions. However, log-space reductions are probably weaker than polynomial-time reductions: while any non-empty, non-full language in P is polynomial-time reducible to any other non-empty, non-full language in P, a log-space reduction from an NL-complete language to a language in L, both of which would be languages in P, would imply the unlikely equality L = NL.
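As a hedged illustration of the "one output bit at a time" requirement above (the names and the toy reduction are my own, not from the source), the sketch below exposes the i-th symbol of f(w) on demand rather than materializing f(w); a second machine can then consume f(w) while both keep only counters, which is the usual trick for composing log-space reductions without storing intermediate results.

 def reduction_symbol(w: str, i: int) -> str:
     """i-th symbol of the toy reduction f(w) = w followed by |w| in binary ("" past the end)."""
     if i < len(w):
         return w[i]
     suffix = format(len(w), "b")          # fits in O(log n) bits
     j = i - len(w)
     return suffix[j] if j < len(suffix) else ""

 def consumer(w: str) -> int:
     """Counts the 1s in f(w) without ever storing f(w); keeps only two counters."""
     i = ones = 0
     while True:
         s = reduction_symbol(w, i)        # recompute the needed output symbol on demand
         if s == "":
             return ones
         ones += (s == "1")
         i += 1

 print(consumer("1011"))   # f("1011") = "1011100", which contains four 1s -> 4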
Non-deterministic Turing Machine
In theoretical computer science, a nondeterministic Turing machine (NTM) is a theoretical model of computation whose governing rules specify more than one possible action in some situations. That is, an NTM's next state is ''not'' completely determined by its current state and the symbol it sees, unlike a deterministic Turing machine. NTMs are sometimes used in thought experiments to examine the abilities and limits of computers. One of the most important open problems in theoretical computer science is the P versus NP problem, which (among other equivalent formulations) concerns the question of how difficult it is to simulate nondeterministic computation with a deterministic computer.

Background
In essence, a Turing machine is imagined to be a simple computer that reads and writes symbols one at a time on an endless tape by strictly following a set of rules. It determines what action it should perform next according to its internal ''state'' and ''what symbol it currently sees''.
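A minimal sketch (my own, not from the source) of the difference described above: a deterministic transition function maps a (state, symbol) pair to exactly one action, while a nondeterministic one maps it to a set of possible actions, and the machine accepts if at least one sequence of choices leads to acceptance.

 # Each action is (next_state, symbol_to_write, head_move).
 deterministic_delta = {
     ("q0", "1"): ("q1", "1", +1),             # exactly one possible action
 }

 nondeterministic_delta = {
     ("q0", "1"): {("q1", "1", +1),            # the machine may take either action;
                   ("q0", "0", +1)},           # it accepts if some branch accepts
 }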
Decider Turing Machine
In computability theory, a decider is a Turing machine that halts for every input. A decider is also called a total Turing machine (Kozen, 1997), as it represents a total function. Because it always halts, such a machine is able to decide whether a given string is a member of a formal language. The class of languages which can be decided by such machines is the set of recursive languages. Given an arbitrary Turing machine, determining whether it is a decider is an undecidable problem. This is a variant of the halting problem, which asks whether a Turing machine halts on a specific input.

Functions computable by total Turing machines
In practice, many functions of interest are computable by machines that always halt. A machine that uses only finite memory on any particular input can be forced to halt for every input by restricting its flow-control capabilities so that no input will ever cause the machine to enter an infinite loop.
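As an illustrative sketch of the flow-control restriction mentioned above (my own example, not from the source), the same computation is written twice: once with an unbounded while-loop that fails to halt on some inputs, and once with a loop bounded by a quantity derived from the input, which guarantees termination on every input.

 def may_not_halt(n: int) -> int:
     # Never terminates when n is negative.
     while n != 0:
         n -= 1
     return n

 def always_halts(n: int) -> int:
     # At most |n| iterations, so this computes a total function.
     for _ in range(abs(n)):
         if n > 0:
             n -= 1
         elif n < 0:
             n += 1
     return n

 print(always_halts(-3))   # -> 0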
ST-connectivity
In computer science, st-connectivity or STCON is a decision problem asking, for vertices ''s'' and ''t'' in a directed graph, if ''t'' is reachable from ''s''. Formally, the decision problem is given by
:\mathit{STCON} = \{ \langle D, s, t \rangle \mid t \text{ is reachable from } s \text{ in the directed graph } D \}.

Complexity
On a sequential computer, st-connectivity can easily be solved in linear time by either depth-first search or breadth-first search. The interest in this problem in computational complexity concerns its complexity with respect to more limited forms of computation. For instance, the complexity class of problems that can be solved by a non-deterministic Turing machine using only a logarithmic amount of memory is called NL. The st-connectivity problem can be shown to be in NL, as a non-deterministic Turing machine can guess the next node of the path, while the only information which has to be stored is the total length of the path and which node is currently under consideration. The algorithm terminates if either the target node ''t'' is reached, or the length of the path so far exceeds the number of nodes in the graph.
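The following Python sketch (my own, not from the source) mirrors the NL algorithm just described, with the nondeterministic guess simulated by trying every outgoing edge; on each branch only the current node and a step counter are kept, and a branch rejects once the path length exceeds the number of nodes.

 def st_connected(adj: dict, s, t) -> bool:
     n = len(adj)   # number of nodes

     def branch(current, steps) -> bool:
         if current == t:
             return True                      # accept: target reached
         if steps >= n:
             return False                     # reject: path longer than the node count
         # Simulate the nondeterministic guess of the next node:
         return any(branch(nxt, steps + 1) for nxt in adj.get(current, ()))

     return branch(s, 0)

 graph = {"s": ["a"], "a": ["b", "c"], "b": [], "c": ["t"], "t": []}
 print(st_connected(graph, "s", "t"))         # -> True (path s -> a -> c -> t)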
Sipser, Michael
Michael Fredric Sipser (born September 17, 1954) is an American theoretical computer scientist who has made early contributions to computational complexity theory. He is a professor of applied mathematics and was the Dean of Science at the Massachusetts Institute of Technology.

Biography
Sipser was born and raised in Brooklyn, New York and moved to Oswego, New York when he was 12 years old. He earned his BA in mathematics from Cornell University in 1974 and his PhD in engineering from the University of California at Berkeley in 1980 under the direction of Manuel Blum (Trafton, Anne, "Michael Sipser named dean of the School of Science", MIT News Office, June 5, 2014). He joined MIT's Laboratory for Computer Science as a research associate in 1979 and then was a Research Staff Member at IBM Research in San Jose. In 1980, he joined the MIT faculty.
Cengage Learning
Cengage Group is an American educational content, technology, and services company for the higher education, K-12, professional, and library markets. It operates in more than 20 countries around the world (''Global Publishing Leaders 2014'', publishersweekly.com, June 27, 2014; company information, ''The Wall Street Journal''; company overview of Cengage Learning, Inc., Bloomberg Business).

Company information
The company is headquartered in Boston, Massachusetts, and has approximately 5,000 employees worldwide across nearly 38 countries. It was headquartered at its Stamford, Connecticut, office until April 2014.