Effective Complexity
Effective complexity is a measure of complexity defined in a 1996 paper by Murray Gell-Mann and Seth Lloyd that attempts to measure the amount of non-random information in a system. It has been criticised for depending on subjective decisions about which parts of the information in the system are to be discounted as random (a toy numerical sketch follows the lists below).

See also
* Kolmogorov complexity
* Excess entropy
* Logical depth
* Renyi information
* Self-dissimilarity
* Forecasting complexity

External links
* http://www.cs.brandeis.edu/~pablo/complex.maker.html
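A toy numerical sketch of the idea, under assumptions of my own: zlib compression stands in for total description length, which effective complexity would then split into a structured part and a random part. zlib alone cannot perform that split; deciding what counts as random is exactly the subjective step the criticism above points at.

```python
# Toy proxy: compressed size ~ total description length of a string.
# Effective complexity keeps only the non-random part, so it should be
# low for BOTH strings below: 'regular' has almost no information at all,
# while 'noise' has lots of information that is almost entirely random.
import random
import zlib

def total_bits(data: bytes) -> int:
    """Crude upper bound on total description length, in bits."""
    return 8 * len(zlib.compress(data, 9))

regular = b"a" * 1000                                      # pure regularity
random.seed(0)
noise = bytes(random.randrange(256) for _ in range(1000))  # pure randomness

print(total_bits(regular))  # tiny: the whole description is structure
print(total_bits(noise))    # ~8000+ bits: almost all of it is random
```

The compressor rates the two extremes very differently, yet effective complexity would assign both a low value, because it discards the random bits that dominate the second string.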

Complex Systems
A complex system is a system composed of many components which may interact with each other. Examples of complex systems are Earth's global climate, organisms, the human brain, infrastructure such as power grid, transportation or communication systems, complex software and electronic systems, social and economic organizations (like cities), an ecosystem, a living cell, and ultimately the entire universe. Complex systems are systems whose behavior is intrinsically difficult to model due to the dependencies, competitions, relationships, or other types of interactions between their parts or between a given system and its environment. Systems that are "complex" have distinct properties that arise from these relationships, such as nonlinearity, emergence, spontaneous order, adaptation, and feedback loops, among others. Because such systems appear in a wide variety of fields, the commonalities among them have become the topic of their own independent area of research. In many cas ...

Murray Gell-Mann
Murray Gell-Mann (September 15, 1929 – May 24, 2019) was an American physicist who received the 1969 Nobel Prize in Physics for his work on the theory of elementary particles. He was the Robert Andrews Millikan Professor of Theoretical Physics Emeritus at the California Institute of Technology, a distinguished fellow and one of the co-founders of the Santa Fe Institute, a professor of physics at the University of New Mexico, and the Presidential Professor of Physics and Medicine at the University of Southern California. Gell-Mann spent several periods at CERN, a nuclear research facility in Switzerland, including as a John Simon Guggenheim Memorial Foundation fellow in 1972. Early life and education Gell-Mann was born in Lower Manhattan to a family of Jewish immigrants from the Austro-Hungarian Empire, specifically from Czernowitz in present-day Ukraine. His parents were Pauline (née Reichstein) and Arthur Isidore Gell-Mann, who taught English as a second language ...

Seth Lloyd
Seth Lloyd (born August 2, 1960) is a professor of mechanical engineering and physics at the Massachusetts Institute of Technology. His research area is the interplay of information with complex systems, especially quantum systems. He has performed seminal work in the fields of quantum computation, quantum communication and quantum biology, including proposing the first technologically feasible design for a quantum computer, demonstrating the viability of quantum analog computation, proving quantum analogs of Shannon's noisy channel theorem, and designing novel methods for quantum error correction and noise reduction. Biography Lloyd was born on August 2, 1960. He graduated from Phillips Academy in 1978 and received a bachelor of arts degree from Harvard College in 1982. He earned a certificate of advanced study in mathematics and a master of philosophy degree from Cambridge University in 1983 and 1984, while on a Marshall Scholarship. Lloyd was awarded a doctorate by Rockef ...

Random
In common usage, randomness is the apparent or actual lack of pattern or predictability in events. A random sequence of events, symbols or steps often has no order and does not follow an intelligible pattern or combination. Individual random events are, by definition, unpredictable, but if the probability distribution is known, the frequency of different outcomes over repeated events (or "trials") is predictable. (Strictly speaking, the frequency of an outcome will converge almost surely to a predictable value as the number of trials becomes arbitrarily large; non-convergence or convergence to a different value is possible, but has probability zero.) For example, when throwing two dice, the outcome of any particular roll is unpredictable, but a sum of 7 will tend to occur twice as often as 4. In this view, randomness is not haphazardness; it is a measure of uncertainty of an outcome. Randomness applies to concepts of chance, probability, and information entropy. The fields of ...
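A quick arithmetic check of the dice example, as a minimal sketch of my own: with two fair dice there are 36 equally likely ordered pairs, six of which sum to 7 and three of which sum to 4.

```python
# Exact and simulated frequencies for sums of two fair dice.
import random
from collections import Counter

ways = Counter(a + b for a in range(1, 7) for b in range(1, 7))
print(ways[7] / 36)  # 6/36 ~ 0.1667
print(ways[4] / 36)  # 3/36 ~ 0.0833 -- half as often, as the text says

# Over many trials the observed frequencies converge to these values,
# even though each individual roll remains unpredictable.
random.seed(1)
trials = 100_000
rolls = Counter(random.randint(1, 6) + random.randint(1, 6) for _ in range(trials))
print(rolls[7] / trials, rolls[4] / trials)
```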

Information
Information is an abstract concept that refers to that which has the power to inform. At the most fundamental level, information pertains to the interpretation of that which may be sensed. Any natural process that is not completely random, and any observable pattern in any medium, can be said to convey some amount of information. Whereas digital signals and other data use discrete signs to convey information, other phenomena and artifacts such as analog signals, poems, pictures, music or other sounds, and currents convey information in a more continuous form. Information is not knowledge itself, but the meaning that may be derived from a representation through interpretation. Information is often processed iteratively: Data available at one step are processed into information to be interpreted and processed at the next step. For example, in written text each symbol or letter conveys information relevant to the word it is part of, each word conveys information relev ...

Kolmogorov Complexity
In algorithmic information theory (a subfield of computer science and mathematics), the Kolmogorov complexity of an object, such as a piece of text, is the length of a shortest computer program (in a predetermined programming language) that produces the object as output. It is a measure of the computational resources needed to specify the object, and is also known as algorithmic complexity, Solomonoff–Kolmogorov–Chaitin complexity, program-size complexity, descriptive complexity, or algorithmic entropy. It is named after Andrey Kolmogorov, who first published on the subject in 1963; it is a generalization of classical information theory. The notion of Kolmogorov complexity can be used to state and prove impossibility results akin to Cantor's diagonal argument, Gödel's incompleteness theorem, and Turing's halting problem. In particular, no program P computing a lower bound for each text's Kolmogorov complexity can return a value essentially larger than P's own length ...
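Since Kolmogorov complexity is uncomputable, practical work often settles for an upper bound: any compressor exhibits one particular program ("decompress this blob") that reproduces the string. A minimal sketch using zlib as the compressor, a choice of mine rather than anything from the definition above:

```python
# Compressed size as an upper bound on Kolmogorov complexity (in bytes).
import random
import zlib

def complexity_upper_bound(s: str) -> int:
    """Length of one program that outputs s: 'inflate this zlib blob'."""
    return len(zlib.compress(s.encode("utf-8"), 9))

regular = "ab" * 500  # 1000 characters with an obvious short description
random.seed(0)
patternless = "".join(random.choice("ab") for _ in range(1000))

print(complexity_upper_bound(regular))      # small: a short program suffices
print(complexity_upper_bound(patternless))  # much larger: ~1 bit per character
```

The bound is one-sided: a large compressed size never proves a string is complex, since a smarter compressor (or a shorter program) might still exist.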

Excess Entropy
In information theory, dual total correlation (Han 1978), information rate (Dubnov 2006), excess entropy (Olbrich 2008), or binding information (Abdallah and Plumbley 2010) is one of several known non-negative generalizations of mutual information. While total correlation is bounded by the sum of the entropies of the n elements, the dual total correlation is bounded by the joint entropy of the n elements. Although well behaved, dual total correlation has received much less attention than the total correlation. A measure known as "TSE-complexity" defines a continuum between the total correlation and dual total correlation (Ay 2001).

Definition

For a set of n random variables \{X_1, \ldots, X_n\}, the dual total correlation D(X_1, \ldots, X_n) is given by

    D(X_1, \ldots, X_n) = H(X_1, \ldots, X_n) - \sum_{i=1}^{n} H(X_i \mid X_1, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n),

where H(X_1, \ldots, X_n) is the joint entropy of the variable set \{X_1, \ldots, X_n\} and H(X_i \mid \cdots) is the conditional entropy of X_i given the remaining variables.
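A short numerical sketch (my construction, following the definition above): dual total correlation computed directly from a discrete joint distribution, given as an n-dimensional array of probabilities.

```python
# D(X_1..X_n) = H(X_1..X_n) - sum_i H(X_i | all other variables).
import numpy as np

def entropy_bits(p: np.ndarray) -> float:
    """Shannon entropy in bits of a probability array of any shape."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def dual_total_correlation(p: np.ndarray) -> float:
    joint = entropy_bits(p)
    d = joint
    for axis in range(p.ndim):
        # H(X_i | rest) = H(joint) - H(rest); summing over axis i
        # marginalizes X_i out, leaving the distribution of the rest.
        d -= joint - entropy_bits(p.sum(axis=axis))
    return d

# Two perfectly correlated fair bits: both conditional entropies vanish,
# so D equals the joint entropy, 1 bit (for n = 2, D is exactly the
# mutual information I(X;Y)).
p = np.array([[0.5, 0.0],
              [0.0, 0.5]])
print(dual_total_correlation(p))  # 1.0
```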

Logical Depth
Logical depth is a measure of complexity for individual strings devised by Charles H. Bennett based on the computational complexity of an algorithm that can recreate a given piece of information. It differs from Kolmogorov complexity in that it considers the computation time of the algorithm with nearly minimal length, rather than the length of the minimal algorithm. Formally, in the context of some universal computer U, the logical depth of a string x to significance level s is the running time of the fastest program that produces x and is no more than s longer than the minimal program:

    D_s(x) = \min \{\, T(p) : U(p) = x \text{ and } |p| \le |p^*| + s \,\},

where T(p) is the running time of program p and p^* is a minimal-length program producing x. (A toy sketch of this minimization follows the list below.)

See also
* Effective complexity
* Self-dissimilarity
* Forecasting complexity
* Sophistication (complexity theory)
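A toy sketch of the minimization, under assumptions entirely my own: the "universal computer" is replaced by a fixed substitution system (a seed string w rewritten k times by the Thue-Morse rule), program length is len(w) + k, and running time is the number of characters written. None of this is Bennett's actual formalism, but it shows how near-minimal programs can be slow while longer programs are fast, and how the slack s trades one for the other.

```python
from itertools import product

def sigma(w: str) -> str:
    """Thue-Morse substitution: 0 -> 01, 1 -> 10."""
    return "".join("01" if c == "0" else "10" for c in w)

def run(w: str, k: int):
    """'Program' (w, k): write the seed, then rewrite it k times."""
    out, time = w, len(w)
    for _ in range(k):
        out = sigma(out)
        time += len(out)  # each rewrite writes a fresh string
    return out, time

def logical_depth(x: str, s: int) -> int:
    """Fastest running time among programs within s of minimal length."""
    producers = []  # (program length, running time) of programs producing x
    for k in range(len(x).bit_length()):
        for n in range(1, len(x) + 1):
            if n * 2 ** k != len(x):
                continue  # output length could not match x
            for seed in product("01", repeat=n):
                out, time = run("".join(seed), k)
                if out == x:
                    producers.append((n + k, time))
    m = min(length for length, _ in producers)  # minimal program length
    return min(t for length, t in producers if length <= m + s)

x = "0"
for _ in range(4):
    x = sigma(x)                 # 16-character Thue-Morse prefix
print(logical_depth(x, s=0))     # 30: only slow, near-minimal programs qualify
print(logical_depth(x, s=11))    # 16: slack admits the fast literal program
```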

Renyi Information
Renyi or Rényi may refer to:

People
* Alfréd Rényi (1921–1970), Hungarian mathematician
* Tibor Rényi (born 1973), Hungarian painter
* Tom Renyi (born 1947), American banker and businessman

Locations in China
* Renyi, Rongchang County (仁义镇), town in Rongchang County, Chongqing
* Renyi, Hezhou (仁义镇), town in Babu District, Hezhou, Guangxi
* Renyi, Guiyang County (仁义镇), town in Guiyang County, Hunan
* Renyi, Leiyang (仁义镇), town in Leiyang City, Hunan

Self-dissimilarity
Self-dissimilarity is a measure of complexity defined in a series of papers by David Wolpert and William G. Macready. The degrees of self-dissimilarity between the patterns of a system observed at various scales (e.g. the average matter density of a physical body for volumes at different orders of magnitude) constitute a complexity "signature" of that system. (A toy sketch of such a signature follows the list below.)

See also
* Diversity index
* Index of dissimilarity
* Jensen–Shannon divergence
* Self-similarity
* Similarity measure
* Variance
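A rough sketch of such a signature, with choices entirely my own rather than Wolpert and Macready's formalism: coarse-grain a 1-D signal at several scales, histogram the values at each scale, and record the Jensen-Shannon divergence between adjacent scales.

```python
import numpy as np

def coarse_grain(x: np.ndarray, scale: int) -> np.ndarray:
    """Average non-overlapping blocks of the given length."""
    n = len(x) // scale * scale
    return x[:n].reshape(-1, scale).mean(axis=1)

def hist(x: np.ndarray, bins: int = 16) -> np.ndarray:
    h, _ = np.histogram(x, bins=bins, range=(0.0, 1.0))
    return h / h.sum()

def js_divergence(p: np.ndarray, q: np.ndarray) -> float:
    """Jensen-Shannon divergence in bits between two histograms."""
    def kl(a, b):
        m = a > 0
        return float(np.sum(a[m] * np.log2(a[m] / b[m])))
    mid = (p + q) / 2
    return 0.5 * kl(p, mid) + 0.5 * kl(q, mid)

def signature(x: np.ndarray, scales=(1, 2, 4, 8, 16)):
    """Dissimilarity between the system's patterns at adjacent scales."""
    hs = [hist(coarse_grain(x, s)) for s in scales]
    return [round(js_divergence(hs[i], hs[i + 1]), 3) for i in range(len(hs) - 1)]

rng = np.random.default_rng(0)
flat = np.full(4096, 0.5)   # identical at every scale
noise = rng.random(4096)    # block means concentrate as the scale grows

print(signature(flat))      # all zeros: perfectly self-similar
print(signature(noise))     # strictly positive entries
```

The constant signal looks the same at every scale, so its signature vanishes; the noisy one changes distribution as blocks are averaged, so its signature does not. What to measure at each scale, and how to compare scales, are exactly the modelling choices the definition leaves open.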

Forecasting Complexity
Forecasting complexity is a measure of complexity put forward (under a different original name) by the physicist Peter Grassberger. It was later renamed "statistical complexity" by James P. Crutchfield and Karl Young.

Information Theory
Information theory is the scientific study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and of Claude Shannon in the 1940s. The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include s ...
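The coin/die comparison works out as a one-line entropy computation; a minimal sketch of my own, just restating the example above:

```python
import math

def shannon_entropy(probs) -> float:
    """H = -sum p * log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([1/2] * 2))  # fair coin: 1.0 bit
print(shannon_entropy([1/6] * 6))  # fair die: ~2.585 bits, more uncertainty
```

A uniform distribution over k outcomes has entropy log2(k), so the die's six outcomes carry log2(6) ≈ 2.585 bits against the coin's single bit.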