Quantum Mutual Information
In quantum information theory, quantum mutual information, or von Neumann mutual information, named after John von Neumann, is a measure of correlation between subsystems of a quantum state. It is the quantum-mechanical analog of Shannon mutual information.

Motivation

For simplicity, it is assumed that all objects in this article are finite-dimensional. The definition of quantum mutual information is motivated by the classical case. For a joint probability distribution of two variables ''p''(''x'', ''y''), the two marginal distributions are

:p(x) = \sum_y p(x,y), \qquad p(y) = \sum_x p(x,y).

The classical mutual information ''I''(''X'':''Y'') is defined by

:I(X:Y) = S(p(x)) + S(p(y)) - S(p(x,y)),

where ''S''(''q'') denotes the Shannon entropy of the probability distribution ''q''. One can calculate directly

:S(p(x)) + S(p(y)) = -\left(\sum_x p(x) \log p(x) + \sum_y p(y) \log p(y)\right),

and, substituting the marginals p(x) = \sum_{y'} p(x,y') and p(y) = \sum_{x'} p(x',y), express the right-hand side entirely in terms of the joint distribution ''p''(''x'', ''y'') ...
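The classical and quantum definitions above can be checked numerically. The following is an illustrative NumPy sketch, not from the original article; the function names and the Bell-state example are our own. The quantum case uses I(A:B) = S(ρ_A) + S(ρ_B) - S(ρ_AB), with the reduced states obtained by partial trace.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in nats) of a probability vector; 0 log 0 := 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def classical_mutual_information(pxy):
    """I(X:Y) = S(p(x)) + S(p(y)) - S(p(x,y)) for a joint distribution."""
    pxy = np.asarray(pxy, dtype=float)
    return entropy(pxy.sum(axis=1)) + entropy(pxy.sum(axis=0)) - entropy(pxy)

def von_neumann_entropy(rho):
    """S(rho) = -tr(rho ln rho), via the eigenvalues of rho."""
    eta = np.linalg.eigvalsh(rho)
    eta = eta[eta > 1e-12]
    return float(-np.sum(eta * np.log(eta)))

def quantum_mutual_information(rho_ab, dim_a, dim_b):
    """I(A:B) = S(rho_A) + S(rho_B) - S(rho_AB); partial traces by reshaping."""
    r = rho_ab.reshape(dim_a, dim_b, dim_a, dim_b)
    rho_a = np.trace(r, axis1=1, axis2=3)  # trace out subsystem B
    rho_b = np.trace(r, axis1=0, axis2=2)  # trace out subsystem A
    return (von_neumann_entropy(rho_a) + von_neumann_entropy(rho_b)
            - von_neumann_entropy(rho_ab))

# Two perfectly correlated classical bits: I(X:Y) = ln 2.
print(classical_mutual_information([[0.5, 0.0], [0.0, 0.5]]))

# Bell state (|00> + |11>)/sqrt(2): I(A:B) = 2 ln 2, twice the classical maximum.
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
print(quantum_mutual_information(np.outer(phi, phi), 2, 2))
```

The factor of two for the Bell state is the standard illustration of quantum mutual information exceeding the classical bound for a pair of bits.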


Quantum Information Theory
Quantum information is the information of the state of a quantum system. It is the basic entity of study in quantum information theory, and can be manipulated using quantum information processing techniques. Quantum information refers both to the technical definition in terms of von Neumann entropy and to the general computational term. Quantum information science is an interdisciplinary field that involves quantum mechanics, computer science, information theory, philosophy and cryptography, among other fields. Its study is also relevant to disciplines such as cognitive science, psychology and neuroscience. Its main focus is extracting information from matter at the microscopic scale. Observation in science is one of the most important ways of acquiring information, and measurement is required in order to quantify observations, making measurement crucial to the scientific method. In quantum mechanics, due to the uncertainty principle, non-commuting observables cannot be precisely measured simultaneously, as ...

John Von Neumann
John von Neumann (December 28, 1903 – February 8, 1957) was a Hungarian and American mathematician, physicist, computer scientist and engineer. Von Neumann had perhaps the widest coverage of any mathematician of his time, integrating pure and applied sciences and making major contributions to many fields, including mathematics, physics, economics, computing, and statistics. He was a pioneer in building the mathematical framework of quantum physics, in the development of functional analysis, and in game theory, introducing or codifying concepts including cellular automata, the universal constructor and the digital computer. His analysis of the structure of self-replication preceded the discovery of the structure of DNA. During World War II, von Neumann worked on the Manhattan Project. He developed the mathematical models behind the explosive lenses ...

Claude Shannon
Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, computer scientist, cryptographer and inventor known as the "father of information theory" and the man who laid the foundations of the Information Age. Shannon was the first to describe the use of Boolean algebra, essential to all digital electronic circuits, and helped found artificial intelligence (AI). Roboticist Rodney Brooks declared Shannon the 20th-century engineer who contributed the most to 21st-century technologies, and mathematician Solomon W. Golomb described his intellectual achievement as "one of the greatest of the twentieth century". At the University of Michigan, Shannon earned two bachelor's degrees, graduating with a Bachelor of Science in electrical engineering and another in mathematics, both in 1936. As a 21-year-old master's student in electrical engineering at MIT, he wrote the thesis "A Symbolic Analysis of Relay and Switching Circuits", which demonstrated that electric ...

Mutual Information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual statistical dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of the entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable. Not limited to real-valued random variables and linear dependence like the Pearson correlation coefficient, MI is more general and determines how different the joint distribution of the pair (X,Y) is from the product of the marginal distributions of X and ...
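The point that MI is more general than the correlation coefficient can be illustrated with a small, self-contained sketch (this example is ours, not from the entry): take X uniform on {-1, 0, 1} and Y = X², so that X and Y are uncorrelated yet fully dependent.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits; 0 log 0 is taken as 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Joint distribution of X uniform on {-1, 0, 1} and Y = X**2.
xs = np.array([-1.0, 0.0, 1.0])
ys = np.array([0.0, 1.0])
pxy = np.array([[0.0, 1/3],   # row: X = -1
                [1/3, 0.0],   # row: X =  0
                [0.0, 1/3]])  # row: X = +1

# The covariance E[XY] - E[X]E[Y] vanishes (E[X] = E[X^3] = 0) ...
ex = float(xs @ pxy.sum(axis=1))             # E[X] = 0
ey = float(ys @ pxy.sum(axis=0))             # E[Y] = 2/3
exy = float(np.sum(np.outer(xs, ys) * pxy))  # E[XY] = 0
print(exy - ex * ey)  # 0.0: uncorrelated

# ... yet the mutual information is strictly positive, since Y is a
# deterministic function of X.
mi = entropy(pxy.sum(axis=1)) + entropy(pxy.sum(axis=0)) - entropy(pxy)
print(mi)  # ≈ 0.918 bits
```

Here MI equals H(Y) exactly, because observing X removes all uncertainty about Y.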


Shannon Entropy
In information theory, the Shannon entropy of a discrete probability distribution ''q'' = \{q_j\} quantifies the average uncertainty of, or equivalently the expected amount of information gained by observing, an outcome drawn from that distribution. It is defined as :S(q) = -\sum_j q_j \log q_j, with the logarithm taken base 2 for units of bits or base ''e'' for nats. The entropy vanishes exactly when one outcome is certain and is maximized by the uniform distribution. The von Neumann entropy used throughout this article is its quantum-mechanical counterpart.


Relative Entropy
In information theory, the relative entropy, also known as the Kullback–Leibler divergence, of a probability distribution ''p'' with respect to a probability distribution ''q'' is defined as :D(p \| q) = \sum_j p_j \log \frac{p_j}{q_j}. It measures the penalty, in average additional uncertainty, for assuming the distribution is ''q'' when the true distribution is ''p''; it is non-negative and vanishes if and only if ''p'' = ''q''. Its quantum-mechanical analog is the quantum relative entropy.

Density Matrix
In quantum mechanics, a density matrix (or density operator) is a matrix used in calculating the probabilities of the outcomes of measurements performed on physical systems. It is a generalization of the state vectors or wavefunctions: while those can only represent pure states, density matrices can also represent mixed states. These arise in quantum mechanics in two different situations: (1) when the preparation of a system can randomly produce different pure states, and thus one must deal with the statistics of possible preparations, and (2) when one wants to describe a physical system that is entangled with another, without describing their combined state. This case is typical for a system interacting with some environment (e.g. decoherence). In this case, the density matrix of an entangled system differs from that of an ensemble of pure states that, combined, would give the same statistical results upon measurement. Density matrices are thus crucial tools in areas of quantum ...
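The pure/mixed distinction described above can be made concrete with a short NumPy sketch (our own illustration, not from the entry): the purity tr(ρ²) equals 1 for a pure state and is strictly less than 1 for a mixed state.

```python
import numpy as np

# A pure state |+> = (|0> + |1>)/sqrt(2) and its density matrix |+><+|.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(plus, plus.conj())

# A mixed state: an even classical mixture of |0> and |1>, i.e. I/2.
rho_mixed = 0.5 * np.outer([1.0, 0.0], [1.0, 0.0]) \
          + 0.5 * np.outer([0.0, 1.0], [0.0, 1.0])

def purity(rho):
    """tr(rho^2): 1 for pure states, < 1 for mixed states."""
    return float(np.real(np.trace(rho @ rho)))

print(purity(rho_pure))   # 1.0
print(purity(rho_mixed))  # 0.5
```

Note that ρ_pure and ρ_mixed give the same outcome probabilities when measured in the basis {|+>, |->} only for the pure state; in the computational basis both give 50/50, which is why the density matrix, not a single ket, is needed to distinguish the two preparations.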




Tensor Product Of Hilbert Spaces
In mathematics, and in particular functional analysis, the tensor product of Hilbert spaces is a way to extend the tensor product construction so that the result of taking a tensor product of two Hilbert spaces is another Hilbert space. Roughly speaking, the tensor product is the metric space completion of the ordinary tensor product. This is an example of a topological tensor product. The tensor product allows Hilbert spaces to be collected into a symmetric monoidal category.

Definition

Since Hilbert spaces have inner products, one would like to introduce an inner product, and thereby a topology, on the tensor product that arises naturally from the inner products on the factors. Let H_1 and H_2 be two Hilbert spaces with inner products \langle\cdot, \cdot\rangle_1 and \langle\cdot, \cdot\rangle_2, respectively. Construct the tensor product of H_1 and H_2 as vector spaces as explained in the article on tensor products. We can turn this vector space tensor product into an inner pr ...
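The natural inner product on the tensor product satisfies <u₁ ⊗ u₂, v₁ ⊗ v₂> = <u₁, v₁>·<u₂, v₂> on product vectors. In finite dimensions the tensor product is realized by the Kronecker product, so this identity can be verified directly (an illustrative check of ours, not from the entry):

```python
import numpy as np

# For product vectors, <u1 ⊗ u2, v1 ⊗ v2> = <u1, v1> * <u2, v2>.
rng = np.random.default_rng(0)
u1, v1 = rng.normal(size=3), rng.normal(size=3)
u2, v2 = rng.normal(size=2), rng.normal(size=2)

lhs = np.dot(np.kron(u1, u2), np.kron(v1, v2))  # inner product in H1 ⊗ H2
rhs = np.dot(u1, v1) * np.dot(u2, v2)           # product of factor inner products
print(np.isclose(lhs, rhs))  # True
```

In infinite dimensions this bilinear rule is extended to the whole space by linearity and then completion, which is exactly the construction the Definition section begins.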


Von Neumann Entropy
In physics, the von Neumann entropy, named after John von Neumann, is a measure of the statistical uncertainty within a description of a quantum system. It extends the concept of Gibbs entropy from classical statistical mechanics to quantum statistical mechanics, and it is the quantum counterpart of the Shannon entropy from classical information theory. For a quantum-mechanical system described by a density matrix \rho, the von Neumann entropy is :S = -\operatorname{tr}(\rho \ln \rho), where \operatorname{tr} denotes the trace and \ln denotes the matrix version of the natural logarithm. If the density matrix is written in a basis of its eigenvectors |1\rangle, |2\rangle, |3\rangle, \dots as :\rho = \sum_j \eta_j |j\rangle \langle j|, then the von Neumann entropy is merely :S = -\sum_j \eta_j \ln \eta_j. In this form, ''S'' can be seen as the Shannon entropy of the eigenvalues, reinterpreted as probabilities. The von Neumann entropy and quantities based upon i ...
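Because S depends only on the eigenvalues of ρ, it is invariant under a change of basis ρ → UρU†. A brief NumPy sketch of ours (not from the entry) checks both the eigenvalue formula and this basis independence:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -tr(rho ln rho) = -sum_j eta_j ln eta_j over the eigenvalues eta_j."""
    eta = np.linalg.eigvalsh(rho)
    eta = eta[eta > 1e-12]  # 0 ln 0 is taken as 0
    return float(-np.sum(eta * np.log(eta)))

# rho = sum_j eta_j |j><j| with eigenvalues (0.7, 0.3): S is the Shannon
# entropy of the eigenvalue distribution.
eta = np.array([0.7, 0.3])
rho_diag = np.diag(eta)

# Rotate to a different basis with a (real, orthogonal) unitary.
theta = 0.4
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
rho_rot = U @ rho_diag @ U.T

print(von_neumann_entropy(rho_diag))  # -(0.7 ln 0.7 + 0.3 ln 0.3)
print(von_neumann_entropy(rho_rot))   # same value: basis independence
```

A pure state has a single eigenvalue 1, so S = 0; the maximally mixed state on a d-dimensional space has eigenvalues 1/d and S = ln d.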

Partial Trace
In linear algebra and functional analysis, the partial trace is a generalization of the trace. Whereas the trace is a scalar-valued function on operators, the partial trace is an operator-valued function. The partial trace has applications in quantum information and decoherence, which is relevant for quantum measurement and thereby to the decoherent approaches to interpretations of quantum mechanics, including consistent histories and the relative-state interpretation.

Details

Suppose V, W are finite-dimensional vector spaces over a field, with dimensions m and n, respectively. For any space A, let L(A) denote the space of linear operators on A. The partial trace over W is then written as \operatorname{Tr}_W : L(V \otimes W) \to L(V), where \otimes denotes the tensor product. It is defined as follows: for T \in L(V \otimes W), let e_1, \ldots, e_m and f_1, \ldots, f_n be bases for ''V'' and ''W'' respectively; then ''T'' has a matrix representation :\{a_{k\ell, ij}\}, \quad 1 \leq k, i \leq ...
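In a product basis the partial trace amounts to summing over the matched indices of the traced-out factor, which NumPy expresses as a reshape followed by a trace over the corresponding axes. A minimal sketch of ours (function name and example are assumptions, not from the entry):

```python
import numpy as np

def partial_trace(T, dim_v, dim_w, over="W"):
    """Partial trace of an operator on V ⊗ W with dimensions (dim_v, dim_w)."""
    T4 = T.reshape(dim_v, dim_w, dim_v, dim_w)  # indices [k, l, i, j]
    if over == "W":
        return np.trace(T4, axis1=1, axis2=3)   # sum over l = j; acts on V
    return np.trace(T4, axis1=0, axis2=2)       # sum over k = i; acts on W

# Sanity check on a product operator: Tr_W(A ⊗ B) = tr(B) * A,
# and symmetrically Tr_V(A ⊗ B) = tr(A) * B.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[2.0, 0.0], [0.0, 3.0]])
T = np.kron(A, B)
print(np.allclose(partial_trace(T, 2, 2, over="W"), np.trace(B) * A))  # True
print(np.allclose(partial_trace(T, 2, 2, over="V"), np.trace(A) * B))  # True
```

In quantum information this is exactly how the reduced density matrix of a subsystem is computed from the joint state.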


Quantum Relative Entropy
In quantum information theory, quantum relative entropy is a measure of distinguishability between two quantum states. It is the quantum mechanical analog of relative entropy.

Motivation

For simplicity, it is assumed that all objects in this article are finite-dimensional. We first discuss the classical case. Suppose the probabilities of a finite sequence of events are given by the probability distribution ''P'' = \{p_j\}, but somehow we mistakenly assumed it to be ''Q'' = \{q_j\}. For instance, we can mistake an unfair coin for a fair one. According to this erroneous assumption, our uncertainty about the ''j''-th event, or equivalently, the amount of information provided after observing the ''j''-th event, is :-\log q_j. The (assumed) average uncertainty of all possible events is then :-\sum_j p_j \log q_j. On the other hand, the Shannon entropy of the probability distribution ''p'', defined by :-\sum_j p_j \log p_j, is the real amount of uncertainty before observation ...
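The difference of the two averages above is the classical relative entropy D(p || q) = Σ_j p_j log(p_j / q_j), the extra uncertainty incurred by the wrong assumption. A short illustrative sketch of ours (not from the entry), using the unfair-coin example:

```python
import numpy as np

def relative_entropy(p, q):
    """Classical relative entropy D(p || q) = sum_j p_j log(p_j / q_j), in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_j = 0 contribute 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Mistaking an unfair coin (P) for a fair one (Q): the penalty is the
# average extra surprise per observation under the wrong model.
P = [0.75, 0.25]
Q = [0.5, 0.5]
print(relative_entropy(P, Q))  # > 0
print(relative_entropy(Q, Q))  # 0.0: no penalty when the model is correct
```

Note that D(p || q) is not symmetric in its arguments, and it diverges when q assigns zero probability to an event that p does not; the quantum relative entropy S(ρ || σ) inherits both features.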


Quantum Discord
In quantum information theory, quantum discord is a measure of nonclassical correlations between two subsystems of a quantum system. It includes correlations that are due to quantum-physical effects but do not necessarily involve quantum entanglement. The notion of quantum discord was introduced by Harold Ollivier and Wojciech H. Zurek (Wojciech H. Zurek, ''Einselection and decoherence from an information theory perspective'', Annalen der Physik vol. 9, 855–864 (2000); Harold Ollivier and Wojciech H. Zurek, ''Quantum Discord: A Measure of the Quantumness of Correlations'', Physical Review Letters vol. 88, 017901 (2001)) and, independently, by Leah Henderson and Vlatko Vedral. Ollivier and Zurek also referred to it as a measure of the ''quantumness'' of correlations. From the work of these two research groups it follows that quantum correlations can be present in certain mixed separable states (Paolo Giorda, Matteo G. A. Paris: ''Gaussian q ...