Min-entropy
The min-entropy, in information theory, is the smallest of the Rényi family of entropies, corresponding to the most conservative way of measuring the unpredictability of a set of outcomes, as the negative logarithm of the probability of the ''most likely'' outcome. The various Rényi entropies are all equal for a uniform distribution, but measure the unpredictability of a nonuniform distribution in different ways. The min-entropy is never greater than the ordinary or Shannon entropy (which measures the average unpredictability of the outcomes) and that in turn is never greater than the Hartley or max-entropy, defined as the logarithm of the ''number'' of outcomes with nonzero probability. As with the classical Shannon entropy and its quantum generalization, the von Neumann entropy, one can define a conditional version of min-entropy. The conditional quantum min-entropy is a one-shot, or conservative, analog of conditional quantum entropy. To interpret a conditional informat ...
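As a quick illustration (a sketch added for this text, not part of the article), the following Python snippet compares the three quantities on a small distribution and confirms the ordering min-entropy ≤ Shannon entropy ≤ Hartley entropy:

import math

def min_entropy(p):
    # H_min = -log2(max_i p_i): unpredictability of the most likely outcome
    return -math.log2(max(p))

def shannon_entropy(p):
    # H = -sum_i p_i log2 p_i: average unpredictability of the outcomes
    return -sum(q * math.log2(q) for q in p if q > 0)

def hartley_entropy(p):
    # H_0 = log2 of the number of outcomes with nonzero probability
    return math.log2(sum(1 for q in p if q > 0))

p = [0.5, 0.25, 0.125, 0.125]
print(min_entropy(p), shannon_entropy(p), hartley_entropy(p))  # 1.0 1.75 2.0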


Rényi Entropy
In information theory, the Rényi entropy is a quantity that generalizes various notions of entropy, including Hartley entropy, Shannon entropy, collision entropy, and min-entropy. The Rényi entropy is named after Alfréd Rényi, who looked for the most general way to quantify information while preserving additivity for independent events. In the context of fractal dimension estimation, the Rényi entropy forms the basis of the concept of generalized dimensions. The Rényi entropy is important in ecology and statistics as an index of diversity. It is also important in quantum information, where it can be used as a measure of entanglement. In the Heisenberg XY spin chain model, the Rényi entropy as a function of its order \alpha can be calculated explicitly because it is an automorphic function with respect to a particular subgroup of the modular group. In theoretical computer science, the min-entropy is used in the context of randomness extractors. Definition The Rényi ...
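As a minimal sketch (an illustration added here, not from the source), the Rényi entropy H_\alpha(p) = \frac{1}{1-\alpha} \log_2 \sum_i p_i^\alpha can be evaluated for several orders to show how the named special cases emerge:

import math

def renyi_entropy(p, alpha):
    # Shannon entropy is the alpha -> 1 limit, handled here by continuity
    if alpha == 1:
        return -sum(q * math.log2(q) for q in p if q > 0)
    return math.log2(sum(q ** alpha for q in p if q > 0)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
for alpha in (0, 0.5, 1, 2, 100):
    print(alpha, renyi_entropy(p, alpha))
# alpha = 0 gives the Hartley entropy log2(3), alpha = 2 the collision
# entropy, and large alpha approaches the min-entropy -log2(0.5) = 1.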


Generalized Relative Entropy
Generalized relative entropy (\epsilon-relative entropy) is a measure of dissimilarity between two quantum states. It is a "one-shot" analogue of quantum relative entropy and shares many properties of the latter quantity. In the study of quantum information theory, we typically assume that information processing tasks are repeated multiple times, independently. The corresponding information-theoretic notions are therefore defined in the asymptotic limit. The quintessential entropy measure, von Neumann entropy, is one such notion. In contrast, the study of one-shot quantum information theory is concerned with information processing when a task is conducted only once. New entropic measures emerge in this scenario, as traditional notions cease to give a precise characterization of resource requirements. \epsilon-relative entropy is one such particularly interesting measure. In the asymptotic scenario, relative entropy acts as a parent quantity for other measures besides being an impo ...
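For concreteness, here is a small numerical sketch (added for illustration; it computes the asymptotic quantity, not the \epsilon-relative entropy itself) of the quantum relative entropy S(\rho \Vert \sigma) = \operatorname{tr}[\rho(\ln \rho - \ln \sigma)], the parent quantity whose one-shot analogue is discussed above; scipy is assumed available:

import numpy as np
from scipy.linalg import logm

def relative_entropy(rho, sigma):
    # S(rho||sigma) = tr[rho (ln rho - ln sigma)], in nats; assumes the
    # support of rho lies inside the support of sigma (both full rank here)
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

rho = np.diag([0.75, 0.25])   # a diagonal (classical) qubit state
sigma = np.eye(2) / 2         # the maximally mixed state
print(relative_entropy(rho, sigma))  # about 0.131 nats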


Max-entropy
The Hartley function is a measure of uncertainty, introduced by Ralph Hartley in 1928. If a sample from a finite set ''A'' is picked uniformly at random, the information revealed after the outcome is known is given by the Hartley function : H_0(A) := \log_b \vert A \vert , where \vert A \vert denotes the cardinality of ''A''. If the base of the logarithm is 2, then the unit of uncertainty is the shannon (more commonly known as the bit). If it is the natural logarithm, then the unit is the nat. Hartley used a base-ten logarithm, and with this base, the unit of information is called the hartley (aka ban or dit) in his honor. It is also known as the Hartley entropy. Hartley function, Shannon entropy, and Rényi entropy The Hartley function coincides with the Shannon entropy (as well as with the Rényi entropies of all orders) in the case of a uniform probability distribution. It is a special case of the Rényi entropy since: :H_0(X) = \frac{1}{1-0} \log \sum_{i=1}^{\vert X \vert} p_i^0 = \log \vert X \vert . But it can also ...
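A short sketch (illustrative, not from the source) of the unit conventions just described, evaluating the Hartley function of a ten-element set:

import math

n = 10                   # |A| = 10
print(math.log2(n))      # 3.3219... shannons (bits), base 2
print(math.log(n))       # 2.3025... nats, natural base
print(math.log10(n))     # 1.0 hartley, Hartley's own base-ten convention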



Information Theory
Information theory is the scientific study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley, in the 1920s, and Claude Shannon in the 1940s. The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include sourc ...
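The coin-versus-die example can be checked directly (an illustrative computation, not part of the source): the entropy of a uniform distribution over n outcomes is \log_2 n bits.

import math
print(math.log2(2))  # fair coin: 1.0 bit
print(math.log2(6))  # fair die: about 2.585 bits, so more uncertainty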




Shannon Entropy
Shannon may refer to: People * Shannon (given name) * Shannon (surname) * Shannon (American singer), stage name of singer Shannon Brenda Greene (born 1958) * Shannon (South Korean singer), British-South Korean singer and actress Shannon Arrum Williams (born 1998) * Shannon, intermittent stage name of English singer-songwriter Marty Wilde (born 1939) * Claude Shannon (1916–2001), American mathematician, electrical engineer, and cryptographer known as a "father of information theory" Places Australia * Shannon, Tasmania, a locality * Hundred of Shannon, a cadastral unit in South Australia * Shannon, a former name for the area named Calomba, South Australia since 1916 * Shannon River (Western Australia) Canada * Shannon, New Brunswick, a community * Shannon, Quebec, a city * Shannon Bay, former name of Darrell Bay, British Columbia * Shannon Falls, a waterfall in British Columbia Ireland * River Shannon, the longest river in Ireland ** Shannon Cave, a subterranean section ...


Von Neumann Entropy
In physics, the von Neumann entropy, named after John von Neumann, is an extension of the concept of Gibbs entropy from classical statistical mechanics to quantum statistical mechanics. For a quantum-mechanical system described by a density matrix \rho, the von Neumann entropy is : S = - \operatorname{tr}(\rho \ln \rho), where \operatorname{tr} denotes the trace and \ln denotes the (natural) matrix logarithm. If \rho is written in terms of its eigenvectors \vert 1\rangle, \vert 2\rangle, \vert 3\rangle, \dots as : \rho = \sum_j \eta_j \vert j \rangle \langle j \vert , then the von Neumann entropy is merely : S = -\sum_j \eta_j \ln \eta_j . In this form, ''S'' can be seen as the information theoretic Shannon entropy. The von Neumann entropy is also used in different forms (conditional entropies, relative entropies, etc.) in the framework of quantum information theory to characterize the entropy of entanglement. Background John von Neumann established a rigorous mathematical framework for quantum ...
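The eigenvalue form above translates directly into a short numerical sketch (added for illustration, assuming numpy):

import numpy as np

def von_neumann_entropy(rho):
    # S = -sum_j eta_j ln eta_j over the nonzero eigenvalues of rho
    eigs = np.linalg.eigvalsh(rho)
    eigs = eigs[eigs > 1e-12]        # drop numerical zeros (0 ln 0 = 0)
    return float(-np.sum(eigs * np.log(eigs)))

print(von_neumann_entropy(np.eye(2) / 2))        # maximally mixed qubit: ln 2
print(von_neumann_entropy(np.diag([1.0, 0.0])))  # pure state: 0.0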


Conditional Quantum Entropy
The conditional quantum entropy is an entropy measure used in quantum information theory. It is a generalization of the conditional entropy of classical information theory. For a bipartite state \rho^{AB}, the conditional entropy is written S(A\vert B)_\rho, or H(A\vert B)_\rho, depending on the notation being used for the von Neumann entropy. The quantum conditional entropy was defined in terms of a conditional density operator \rho_{A \vert B} by Nicolas Cerf and Chris Adami, who showed that quantum conditional entropies can be negative, something that is forbidden in classical physics. The negativity of quantum conditional entropy is a sufficient criterion for quantum non-separability. In what follows, we use the notation S(\cdot) for the von Neumann entropy, which will simply be called "entropy". Definition Given a bipartite quantum state \rho^{AB}, the entropy of the joint system AB is S(AB)_\rho \ \stackrel{\mathrm{def}}{=}\ S(\rho^{AB}), and the entropies of the subsystems are S(A)_\rho \ \stackrel{\mathrm{def}}{=}\ S(\rh ...
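Using S(A\vert B)_\rho = S(AB)_\rho - S(B)_\rho, the negativity mentioned above can be checked numerically for a Bell state (a hedged sketch added here, not the source's derivation):

import numpy as np

def entropy(rho):
    # von Neumann entropy in bits, from the nonzero eigenvalues
    eigs = np.linalg.eigvalsh(rho)
    eigs = eigs[eigs > 1e-12]
    return float(-np.sum(eigs * np.log2(eigs)))

psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # |Phi+> = (|00>+|11>)/sqrt(2)
rho_ab = np.outer(psi, psi)

# partial trace over A: reshape to indices (a, b, a', b') and contract a = a'
rho_b = rho_ab.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

print(entropy(rho_ab) - entropy(rho_b))  # S(A|B) = 0 - 1 = -1 bit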


Fidelity Of Quantum States
In quantum mechanics, notably in quantum information theory, fidelity is a measure of the "closeness" of two quantum states. It expresses the probability that one state will pass a test to identify it as the other. The fidelity is not a metric on the space of density matrices, but it can be used to define the Bures metric on this space. Given two density operators \rho and \sigma, the fidelity is generally defined as the quantity F(\rho, \sigma) = \left(\operatorname{tr} \sqrt{\sqrt{\rho}\,\sigma\sqrt{\rho}}\right)^2. In the special case where \rho and \sigma represent pure quantum states, namely, \rho = \vert\psi_\rho\rangle\!\langle\psi_\rho\vert and \sigma = \vert\psi_\sigma\rangle\!\langle\psi_\sigma\vert, the definition reduces to the squared overlap between the states: F(\rho, \sigma) = \vert\langle\psi_\rho\vert\psi_\sigma\rangle\vert^2. While not obvious from the general definition, the fidelity is symmetric: F(\rho,\sigma)=F(\sigma,\rho). Motivation Given two random variables X,Y with values (1, ..., n) (categorical random varia ...
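A minimal numerical sketch (added for illustration) of the general definition, assuming scipy.linalg.sqrtm for the matrix square roots:

import numpy as np
from scipy.linalg import sqrtm

def fidelity(rho, sigma):
    # F = (tr sqrt( sqrt(rho) sigma sqrt(rho) ))^2
    s = sqrtm(rho)
    return float(np.real(np.trace(sqrtm(s @ sigma @ s))) ** 2)

rho = np.diag([0.75, 0.25])
sigma = np.eye(2) / 2
print(fidelity(rho, sigma))   # about 0.933
print(fidelity(sigma, rho))   # the same value: F is symmetric
print(fidelity(rho, rho))     # 1.0 for identical states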




Semidefinite Programming
Semidefinite programming (SDP) is a subfield of convex optimization concerned with the optimization of a linear objective function (a user-specified function that the user wants to minimize or maximize) over the intersection of the cone of positive semidefinite matrices with an affine space, i.e., a spectrahedron. Semidefinite programming is a relatively new field of optimization which is of growing interest for several reasons. Many practical problems in operations research and combinatorial optimization can be modeled or approximated as semidefinite programming problems. In automatic control theory, SDPs are used in the context of linear matrix inequalities. SDPs are in fact a special case of cone programming and can be efficiently solved by interior point methods. All linear programs and (convex) quadratic programs can be expressed as SDPs, and via hierarchies of SDPs the solutions of polynomial optimization problems can be approximated. Semidefinite programming has been u ...
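As an illustration (a sketch added here, assuming the cvxpy package rather than any library named in the source), the canonical form just described, a linear objective over the positive semidefinite cone intersected with an affine constraint, can be written in a few lines:

import cvxpy as cp
import numpy as np

n = 3
C = np.eye(n)                            # coefficients of the linear objective
X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0,                   # X lies in the PSD cone
               cp.trace(X) == 1]         # one affine constraint
problem = cp.Problem(cp.Minimize(cp.trace(C @ X)), constraints)
problem.solve()
print(problem.value)                     # 1.0, since tr(I X) = tr(X) = 1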


Choi–Jamiołkowski Isomorphism
In quantum information theory and operator theory, the Choi–Jamiołkowski isomorphism refers to the correspondence between quantum channels (described by completely positive maps) and quantum states (described by density matrices), introduced by Man-Duen Choi and Andrzej Jamiołkowski. It is also called channel-state duality by some authors in the quantum information area, but mathematically, it is a more general correspondence between positive operators and completely positive superoperators. Definition To study a quantum channel \mathcal{E} from system S to S', which is a trace-preserving completely positive map from operator spaces \mathcal{L}(\mathcal{H}_S) to \mathcal{L}(\mathcal{H}_{S'}), we introduce an auxiliary system A w ...
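A small sketch (added for illustration; choi_matrix is a hypothetical helper, not from the source) of the correspondence: the unnormalized Choi matrix J(\mathcal{E}) = (\mathrm{id} \otimes \mathcal{E})(\vert\Omega\rangle\langle\Omega\vert) with \vert\Omega\rangle = \sum_i \vert i\rangle\vert i\rangle, computed from the Kraus operators of the qubit depolarizing channel:

import numpy as np

def choi_matrix(kraus_ops, d):
    # J = sum_k (I (x) K_k) |Omega><Omega| (I (x) K_k)^dagger
    omega = np.zeros((d * d, 1))
    for i in range(d):
        omega[i * d + i] = 1.0               # |Omega> = sum_i |i>|i>
    J = np.zeros((d * d, d * d), dtype=complex)
    for K in kraus_ops:
        v = np.kron(np.eye(d), K) @ omega
        J += v @ v.conj().T
    return J

p = 0.5
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0])
kraus = [np.sqrt(1 - 3 * p / 4) * np.eye(2)] + [np.sqrt(p / 4) * P for P in (X, Y, Z)]

print(np.trace(choi_matrix(kraus, 2)).real)  # = 2 = d for a trace-preserving channel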