In
physics
, the von Neumann entropy, named after
John von Neumann
, is a measure of the statistical uncertainty within a description of a quantum system. It extends the concept of
Gibbs entropy
from classical
statistical mechanics
to
quantum statistical mechanics
, and it is the quantum counterpart of the
Shannon entropy
from classical
information theory
. For a quantum-mechanical system described by a
density matrix
, the von Neumann entropy is S = -\operatorname{Tr}(\rho \ln \rho), where \operatorname{Tr} denotes the trace and \ln denotes the matrix version of the
natural logarithm
. If the density matrix is written in a basis of its
eigenvectors
|1\rangle, |2\rangle, |3\rangle, \dots as \rho = \sum_j \eta_j \left| j \right\rangle \left\langle j \right|, then the von Neumann entropy is merely S = -\sum_j \eta_j \ln \eta_j. In this form, ''S'' can be seen as the Shannon entropy of the eigenvalues, reinterpreted as probabilities. The von Neumann entropy and quantities based upon it are widely used in the study of
quantum entanglement
.
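As a concrete illustration, the eigenvalue formula can be evaluated numerically. The sketch below (using NumPy; the helper name and the diagonal example state are our own illustrative choices) checks that the von Neumann entropy of a density matrix equals the Shannon entropy of its eigenvalues:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # by convention, 0 ln 0 = 0
    return float(-np.sum(evals * np.log(evals)))

# An illustrative mixed state with eigenvalues 3/4 and 1/4:
rho = np.diag([0.75, 0.25])
S = von_neumann_entropy(rho)

# Shannon entropy of the eigenvalue distribution gives the same number:
S_shannon = -(0.75 * np.log(0.75) + 0.25 * np.log(0.25))
```

Working from the eigenvalues avoids computing a matrix logarithm explicitly, which is also how the formula S = -\sum_j \eta_j \ln \eta_j is used in practice.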


Fundamentals

In quantum mechanics, probabilities for the outcomes of experiments made upon a system are calculated from the
quantum state
describing that system. Each physical system is associated with a
vector space
, or more specifically a
Hilbert space
. The
dimension
of the Hilbert space may be infinite, as it is for the space of
square-integrable function
s on a line, which is used to define the quantum physics of a continuous degree of freedom. Alternatively, the Hilbert space may be finite-dimensional, as occurs for spin degrees of freedom. A density operator, the mathematical representation of a quantum state, is a positive semi-definite,
self-adjoint operator
of trace one acting on the Hilbert space of the system. A density operator that is a rank-1 projection is known as a ''pure'' quantum state, and all quantum states that are not pure are designated ''mixed''. Pure states are also known as ''wavefunctions''. Assigning a pure state to a quantum system implies certainty about the outcome of some measurement on that system (i.e., P(x) = 1 for some outcome x). The
state space
of a quantum system is the set of all states, pure and mixed, that can be assigned to it. For any system, the state space is a
convex set
: Any mixed state can be written as a
convex combination
of pure states, though not in a unique way. The von Neumann entropy quantifies the extent to which a state is mixed. The prototypical example of a finite-dimensional Hilbert space is a
qubit
, a quantum system whose Hilbert space is 2-dimensional. An arbitrary state for a qubit can be written as a linear combination of the
Pauli matrices
, which provide a basis for 2 \times 2 self-adjoint matrices: \rho = \tfrac{1}{2}\left(I + r_x \sigma_x + r_y \sigma_y + r_z \sigma_z\right), where the real numbers (r_x, r_y, r_z) are the coordinates of a point within the
unit ball
and \sigma_x = \begin{pmatrix} 0&1\\ 1&0 \end{pmatrix}, \quad \sigma_y = \begin{pmatrix} 0&-i\\ i&0 \end{pmatrix}, \quad \sigma_z = \begin{pmatrix} 1&0\\ 0&-1 \end{pmatrix}. The von Neumann entropy vanishes when \rho is a pure state, i.e., when the point (r_x, r_y, r_z) lies on the surface of the unit ball, and it attains its maximum value when \rho is the ''maximally mixed'' state, which is given by r_x = r_y = r_z = 0.
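The Bloch-ball picture can be checked directly. A minimal sketch (NumPy; the helper function and chosen Bloch vectors are illustrative) computes the entropy of a state on the surface of the ball and of the state at its centre:

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def bloch_state(rx, ry, rz):
    """Qubit density matrix from Bloch-vector coordinates."""
    return 0.5 * (I2 + rx * sx + ry * sy + rz * sz)

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

S_pure = von_neumann_entropy(bloch_state(0.0, 0.0, 1.0))   # surface of the ball
S_mixed = von_neumann_entropy(bloch_state(0.0, 0.0, 0.0))  # centre: maximally mixed
```

The surface point gives zero entropy, while the centre gives the maximum value \ln 2 for a qubit.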


Properties

Some properties of the von Neumann entropy:
* S(\rho) is zero if and only if \rho represents a pure state.
* S(\rho) is maximal and equal to \ln N for a maximally mixed state, N being the dimension of the
Hilbert space
.
* S(\rho) is invariant under changes in the basis of \rho, that is, S(\rho) = S(U \rho U^\dagger), with U a unitary transformation.
* S(\rho) is concave, that is, given a collection of positive numbers \lambda_i which sum to unity (\sum_i \lambda_i = 1) and density operators \rho_i, we have S\bigg(\sum_{i=1}^k \lambda_i \rho_i \bigg) \geq \sum_{i=1}^k \lambda_i S(\rho_i).
* S(\rho) is additive for independent systems. Given two density matrices \rho_A, \rho_B describing independent systems ''A'' and ''B'', we have S(\rho_A \otimes \rho_B) = S(\rho_A) + S(\rho_B).
* S(\rho) is ''strongly subadditive.'' That is, for any three systems ''A'', ''B'', and ''C'': S(\rho_{ABC}) + S(\rho_B) \leq S(\rho_{AB}) + S(\rho_{BC}). This automatically means that S(\rho) is subadditive: S(\rho_{AB}) \leq S(\rho_A) + S(\rho_B).
Below, the concept of subadditivity is discussed, followed by its generalization to strong subadditivity.
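Additivity and concavity are easy to verify numerically. A minimal sketch using NumPy, with illustrative diagonal states of our choosing:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), via the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

rho_a = np.diag([0.9, 0.1])
rho_b = np.diag([0.6, 0.4])

# Additivity for independent systems: S(rho_A ⊗ rho_B) = S(rho_A) + S(rho_B)
S_joint = von_neumann_entropy(np.kron(rho_a, rho_b))
S_sum = von_neumann_entropy(rho_a) + von_neumann_entropy(rho_b)

# Concavity: S(λ rho_1 + (1-λ) rho_2) >= λ S(rho_1) + (1-λ) S(rho_2)
lam = 0.3
S_mix = von_neumann_entropy(lam * rho_a + (1 - lam) * rho_b)
S_avg = lam * von_neumann_entropy(rho_a) + (1 - lam) * von_neumann_entropy(rho_b)
```

Concavity reflects the fact that mixing states can only add uncertainty: the entropy of the mixture exceeds the average entropy of the ingredients.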


Subadditivity

If \rho_A, \rho_B are the reduced density matrices of the general state \rho_{AB}, then \left| S(\rho_A) - S(\rho_B) \right| \leq S(\rho_{AB}) \leq S(\rho_A) + S(\rho_B). The right-hand inequality is known as ''
subadditivity
,'' and the left is sometimes known as the ''
triangle inequality
''. While in Shannon's theory the entropy of a composite system can never be lower than the entropy of any of its parts, in quantum theory this is not the case; i.e., it is possible that S(\rho_{AB}) = 0, while S(\rho_A) = S(\rho_B) > 0. This is expressed by saying that the Shannon entropy is ''monotonic'' but the von Neumann entropy is not. For example, take the
Bell state
of two
spin-1/2
particles: \left| \psi \right\rangle = \frac{1}{\sqrt{2}}\left( \left| \uparrow \downarrow \right\rangle + \left| \downarrow \uparrow \right\rangle \right). This is a pure state with zero entropy, but each spin has maximum entropy when considered individually, because its reduced density matrix is the maximally mixed state. This indicates that it is an ''entangled'' state; the use of entropy as an entanglement measure is discussed further below.
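The Bell-state example can be reproduced numerically. In the sketch below (NumPy), the partial trace is carried out by reshaping the 4×4 joint matrix into a 2×2×2×2 tensor and tracing over the second subsystem's indices:

```python
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

# Bell state (|01> + |10>)/sqrt(2) as a vector in the 4-dimensional joint space
psi = np.array([0.0, 1.0, 1.0, 0.0]) / np.sqrt(2)
rho_ab = np.outer(psi, psi)

# Partial trace over the second qubit: indices are (a, b, a', b')
rho_a = np.trace(rho_ab.reshape(2, 2, 2, 2), axis1=1, axis2=3)

S_joint = von_neumann_entropy(rho_ab)   # 0: the joint state is pure
S_a = von_neumann_entropy(rho_a)        # ln 2: the marginal is maximally mixed
```

The joint entropy vanishes while the marginal entropy is maximal, exactly the non-monotonic behaviour described above.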


Strong subadditivity

The von Neumann entropy is also '' strongly subadditive''. Given three
Hilbert space
s, ''A'', ''B'', ''C'': S(\rho_{ABC}) + S(\rho_B) \leq S(\rho_{AB}) + S(\rho_{BC}). By using the proof technique that establishes the left side of the triangle inequality above, one can show that the strong subadditivity inequality is equivalent to the following inequality: S(\rho_A) + S(\rho_C) \leq S(\rho_{AB}) + S(\rho_{BC}), where \rho_{AB}, etc. are the reduced density matrices of a density matrix \rho_{ABC}. If we apply ordinary subadditivity to the left side of this inequality, we then find S(\rho_{AC}) \leq S(\rho_{AB}) + S(\rho_{BC}). By symmetry, for any tripartite state \rho_{ABC}, each of the three numbers S(\rho_{AB}), S(\rho_{BC}), S(\rho_{AC}) is less than or equal to the sum of the other two.
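Strong subadditivity can be spot-checked on a random tripartite state. The sketch below (NumPy; the random-state construction and the general partial-trace helper are our own) verifies S(\rho_{ABC}) + S(\rho_B) \leq S(\rho_{AB}) + S(\rho_{BC}) for one randomly drawn three-qubit density matrix:

```python
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

def random_density_matrix(dim, rng):
    """A random full-rank density matrix: G G-dagger, normalized to unit trace."""
    g = rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

def partial_trace(rho, keep, dims):
    """Trace out every subsystem whose index is not in `keep`."""
    n = len(dims)
    t = rho.reshape(dims + dims)
    for ax in sorted((i for i in range(n) if i not in keep), reverse=True):
        k = t.ndim // 2                      # number of subsystems still present
        t = np.trace(t, axis1=ax, axis2=ax + k)
    d = int(np.prod([dims[i] for i in keep]))
    return t.reshape(d, d)

dims = [2, 2, 2]                             # three qubits: A, B, C
rng = np.random.default_rng(7)
rho_abc = random_density_matrix(8, rng)

S = von_neumann_entropy
lhs = S(rho_abc) + S(partial_trace(rho_abc, [1], dims))    # S(ABC) + S(B)
rhs = (S(partial_trace(rho_abc, [0, 1], dims))             # S(AB)
       + S(partial_trace(rho_abc, [1, 2], dims)))          # S(BC)
```

A single random check is of course no proof, but the inequality must hold for every state, so any draw will satisfy it.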


Minimum Shannon entropy

Given a quantum state and a specification of a quantum measurement, we can calculate the probabilities for the different possible results of that measurement, and thus we can find the Shannon entropy of that probability distribution. A quantum measurement can be specified mathematically as a positive operator valued measure, or POVM. In the simplest case, a system with a finite-dimensional Hilbert space and measurement with a finite number of outcomes, a POVM is a set of positive semi-definite
matrices
\{ F_i \} on the Hilbert space that sum to the
identity matrix
, \sum_{i=1}^n F_i = I. The POVM element F_i is associated with the measurement outcome i, such that the probability of obtaining it when making a measurement on the
quantum state
\rho is given by \operatorname{Pr}(i) = \operatorname{Tr}(\rho F_i). A POVM is ''rank-1'' if all of the elements are proportional to rank-1 projection operators. The von Neumann entropy is the minimum achievable Shannon entropy, where the minimization is taken over all rank-1 POVMs.
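This minimization can be illustrated for a single qubit: measuring in the eigenbasis of \rho reproduces the von Neumann entropy, while any other rank-1 measurement yields at least as much Shannon entropy. A minimal sketch (NumPy; the state and the rotated basis are illustrative choices):

```python
import numpy as np

def shannon(p):
    """Shannon entropy of a probability vector, in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

rho = np.diag([0.75, 0.25])           # eigenbasis = computational basis
S_vn = shannon([0.75, 0.25])          # von Neumann entropy of rho

# Rank-1 projective measurement in the eigenbasis of rho:
p_eig = [rho[0, 0], rho[1, 1]]

# Rank-1 projective measurement in the rotated basis {|+>, |->}:
plus = np.array([1.0, 1.0]) / np.sqrt(2)
minus = np.array([1.0, -1.0]) / np.sqrt(2)
p_rot = [plus @ rho @ plus, minus @ rho @ minus]
```

The eigenbasis measurement achieves the minimum; the rotated measurement yields the uniform distribution (1/2, 1/2), whose Shannon entropy \ln 2 is strictly larger here.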


Holevo χ quantity

If \rho_1, \ldots, \rho_k are density operators and \lambda_1, \ldots, \lambda_k is a collection of positive numbers which sum to unity (\sum_i \lambda_i = 1), then \rho = \sum_{i=1}^k \lambda_i \rho_i is a valid density operator, and the difference between its von Neumann entropy and the weighted average of the entropies of the \rho_i is bounded by the ''Shannon'' entropy of the \lambda_i: S\bigg(\sum_{i=1}^k \lambda_i \rho_i \bigg) - \sum_{i=1}^k \lambda_i S(\rho_i) \leq -\sum_{i=1}^k \lambda_i \log \lambda_i. Equality is attained when the ''supports'' of the \rho_i – the spaces spanned by their eigenvectors corresponding to nonzero eigenvalues – are orthogonal. The difference on the left-hand side of this inequality is known as the Holevo χ quantity and also appears in Holevo's theorem, an important result in
quantum information theory
.


Change under time evolution


Unitary

The time evolution of an isolated system is described by a unitary operator: \rho \to U \rho U^\dagger. Unitary evolution takes pure states into pure states, and it leaves the von Neumann entropy unchanged. This follows from the fact that the entropy of \rho is a function of the eigenvalues of \rho.
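A quick numerical check of unitary invariance (NumPy; the example state and the QR-based construction of a random unitary are illustrative):

```python
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

rho = np.diag([0.7, 0.2, 0.1]).astype(complex)

# A random unitary from the QR decomposition of a complex Gaussian matrix
rng = np.random.default_rng(0)
g = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, _ = np.linalg.qr(g)

S_before = von_neumann_entropy(rho)
S_after = von_neumann_entropy(U @ rho @ U.conj().T)
```

The two values agree to numerical precision, since conjugation by U permutes nothing but the eigenbasis and leaves the eigenvalues fixed.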


Measurement

A measurement upon a quantum system will generally bring about a change of the quantum state of that system. Writing a POVM does not provide the complete information necessary to describe this state-change process. To remedy this, further information is specified by decomposing each POVM element into a product: E_i = A_i^\dagger A_i. The Kraus operators A_i, named for Karl Kraus, provide a specification of the state-change process. They are not necessarily self-adjoint, but the products A_i^\dagger A_i are. If upon performing the measurement the outcome E_i is obtained, then the initial state \rho is updated to \rho \to \rho' = \frac{A_i \rho A_i^\dagger}{\operatorname{Tr}(A_i \rho A_i^\dagger)} = \frac{A_i \rho A_i^\dagger}{\operatorname{Tr}(\rho E_i)}. An important special case is the Lüders rule, named for Gerhart Lüders. If the POVM elements are projection operators \Pi_i, then the Kraus operators can be taken to be the projectors themselves: \rho \to \rho' = \frac{\Pi_i \rho \Pi_i}{\operatorname{Tr}(\rho \Pi_i)}. If the initial state \rho is pure, and the projectors \Pi_i have rank 1, they can be written as projectors onto the vectors |\psi\rangle and |i\rangle, respectively. The formula simplifies thus to \rho = |\psi\rangle\langle\psi| \to \rho' = \frac{|i\rangle\langle i|\psi\rangle\langle\psi|i\rangle\langle i|}{\langle\psi|i\rangle\langle i|\psi\rangle} = |i\rangle\langle i|. We can define a linear, trace-preserving, completely positive map, by summing over all the possible post-measurement states of a POVM without the normalisation: \rho \to \sum_i A_i \rho A_i^\dagger. It is an example of a quantum channel, and can be interpreted as expressing how a quantum state changes if a measurement is performed but the result of that measurement is lost. Channels defined by projective measurements can never decrease the von Neumann entropy; they leave the entropy unchanged only if they do not change the density matrix. A quantum channel will increase or leave constant the von Neumann entropy of every input state if and only if the channel is unital, i.e., if it leaves fixed the maximally mixed state. An example of a channel that decreases the von Neumann entropy is the
amplitude damping channel
for a qubit, which sends all mixed states towards a pure state.
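A minimal sketch of the amplitude damping channel acting on a qubit (NumPy; \gamma = 0.5 is an illustrative damping strength, and the Kraus operators are the standard textbook ones for this channel):

```python
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

gamma = 0.5  # damping strength, chosen for illustration
K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1 - gamma)]])
K1 = np.array([[0.0, np.sqrt(gamma)], [0.0, 0.0]])

rho = np.eye(2) / 2                              # maximally mixed input
rho_out = K0 @ rho @ K0.T + K1 @ rho @ K1.T      # Kraus operators are real here

S_in = von_neumann_entropy(rho)      # ln 2
S_out = von_neumann_entropy(rho_out)
```

The channel is trace-preserving but not unital: it does not fix the maximally mixed state, and the output entropy here is strictly below \ln 2.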


Thermodynamic meaning

The quantum version of the canonical distribution, the
Gibbs state
s, are found by maximizing the von Neumann entropy under the constraint that the expected value of the Hamiltonian is fixed. A Gibbs state is a density operator with the same eigenvectors as the Hamiltonian, and its eigenvalues are \lambda_i = \frac{1}{Z} \exp\left(-\frac{E_i}{k_B T}\right), where E_i are the energy eigenvalues of the Hamiltonian, ''T'' is the temperature, k_B is the
Boltzmann constant
, and ''Z'' is the partition function. The von Neumann entropy of a Gibbs state is, up to a factor k_B, the thermodynamic entropy.
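The connection to thermodynamic entropy can be checked against the standard relation S = (\langle E \rangle - F)/T with free energy F = -T \ln Z. A minimal sketch in units where k_B = 1, with an illustrative two-level Hamiltonian of our choosing:

```python
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

# Illustrative two-level Hamiltonian with energies 0 and 1, in units k_B = 1
E = np.array([0.0, 1.0])
T = 0.5
weights = np.exp(-E / T)
Z = weights.sum()                        # partition function
rho_gibbs = np.diag(weights / Z)         # Gibbs state in the energy eigenbasis

S = von_neumann_entropy(rho_gibbs)
E_avg = float((E * weights).sum() / Z)   # expected energy
F = -T * np.log(Z)                       # Helmholtz free energy
```

Substituting \lambda_i = e^{-E_i/T}/Z into -\sum_i \lambda_i \ln \lambda_i reproduces (\langle E \rangle - F)/T, which the assertions below confirm numerically.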


Generalizations and derived quantities


Conditional entropy

Let \rho_{AB} be a joint state for the bipartite quantum system ''AB.'' Then the conditional von Neumann entropy S(A|B) is the difference between the entropy of \rho_{AB} and the entropy of the marginal state for subsystem ''B'' alone: S(A|B) = S(\rho_{AB}) - S(\rho_B). This is bounded above by S(\rho_A). In other words, conditioning the description of subsystem ''A'' upon subsystem ''B'' cannot increase the entropy associated with ''A.'' Quantum mutual information can be defined as the difference between the entropy of the joint state and the total entropy of the marginals: S(A:B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}), which can also be expressed in terms of conditional entropy: S(A:B) = S(A) - S(A|B) = S(B) - S(B|A).
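Unlike its classical counterpart, the conditional von Neumann entropy can be negative for entangled states. A minimal NumPy sketch evaluates S(A|B) and the mutual information for a Bell state:

```python
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

# Bell state (|01> + |10>)/sqrt(2)
psi = np.array([0.0, 1.0, 1.0, 0.0]) / np.sqrt(2)
rho_ab = np.outer(psi, psi)
t = rho_ab.reshape(2, 2, 2, 2)          # indices (a, b, a', b')
rho_a = np.trace(t, axis1=1, axis2=3)   # trace out B
rho_b = np.trace(t, axis1=0, axis2=2)   # trace out A

S_cond = von_neumann_entropy(rho_ab) - von_neumann_entropy(rho_b)   # S(A|B)
mutual = (von_neumann_entropy(rho_a) + von_neumann_entropy(rho_b)
          - von_neumann_entropy(rho_ab))                            # S(A:B)
```

Here S(A|B) = -\ln 2 < 0, and the mutual information reaches 2\ln 2, twice the classical maximum for a pair of bits.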


Relative entropy

Let \rho and \sigma be two density operators in the same state space. The relative entropy is defined to be S(\rho \| \sigma) = \operatorname{Tr}[\rho(\log \rho - \log \sigma)]. The relative entropy is always greater than or equal to zero; it equals zero if and only if \rho = \sigma. Unlike the von Neumann entropy itself, the relative entropy is monotonic, in that it decreases (or remains constant) when part of a system is traced over: S(\rho_A \| \sigma_A) \leq S(\rho_{AB} \| \sigma_{AB}).
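For full-rank states, the matrix logarithms in this definition can be computed by eigendecomposition. The sketch below (NumPy; the two example states are illustrative) checks nonnegativity and the zero case:

```python
import numpy as np

def matrix_log(a):
    """Logarithm of a positive-definite Hermitian matrix via eigendecomposition."""
    evals, vecs = np.linalg.eigh(a)
    return vecs @ np.diag(np.log(evals)) @ vecs.conj().T

def relative_entropy(rho, sigma):
    """S(rho || sigma) = Tr[rho (log rho - log sigma)]; assumes full-rank inputs."""
    return float(np.trace(rho @ (matrix_log(rho) - matrix_log(sigma))).real)

rho = np.array([[0.7, 0.2], [0.2, 0.3]])   # a full-rank mixed qubit state
sigma = np.eye(2) / 2                      # maximally mixed reference state

D = relative_entropy(rho, sigma)
D_self = relative_entropy(rho, rho)
```

When \sigma is the maximally mixed state, the relative entropy reduces to \ln 2 - S(\rho), which is one way to see that it measures distinguishability from the reference state.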


Entanglement measures

Just as
energy
is a resource that facilitates mechanical operations, entanglement is a resource that facilitates performing tasks that involve communication and computation. The mathematical definition of entanglement can be paraphrased as saying that maximal knowledge about the whole of a system does not imply maximal knowledge about the individual parts of that system. If the quantum state that describes a pair of particles is entangled, then the results of measurements upon one half of the pair can be strongly correlated with the results of measurements upon the other. However, entanglement is not the same as "correlation" as understood in classical probability theory and in daily life. Instead, entanglement can be thought of as ''potential'' correlation that can be used to generate actual correlation in an appropriate experiment. The state of a composite system is always expressible as a sum, or
superposition
, of products of states of local constituents; it is entangled if this sum cannot be written as a single product term. Entropy provides one tool that can be used to quantify entanglement. If the overall system is described by a pure state, the entropy of one subsystem can be used to measure its degree of entanglement with the other subsystems. For bipartite pure states, the von Neumann entropy of reduced states is the ''unique'' measure of entanglement in the sense that it is the only function on the family of states that satisfies certain axioms required of an entanglement measure. It is thus known as the ''entanglement entropy.'' It is a classical result that the Shannon entropy achieves its maximum at, and only at, the uniform probability distribution \{1/N, \ldots, 1/N\}. Therefore, a bipartite pure state \rho_{AB} is said to be a ''maximally entangled state'' if the reduced state of each subsystem of \rho_{AB} is the diagonal matrix \begin{pmatrix} \frac{1}{N} & & \\ & \ddots & \\ & & \frac{1}{N} \end{pmatrix}. For mixed states, the reduced von Neumann entropy is not the only reasonable entanglement measure. Some of the other measures are also entropic in character. For example, the ''relative entropy of entanglement'' is given by minimizing the relative entropy between a given state \rho and the set of nonentangled, or ''separable,'' states. The entanglement of formation is defined by minimizing, over all possible ways of writing \rho as a convex combination of pure states, the average entanglement entropy of those pure states. The squashed entanglement is based on the idea of extending a bipartite state \rho_{AB} to a state describing a larger system, \rho_{ABE}, such that the partial trace of \rho_{ABE} over ''E'' yields \rho_{AB}. One then finds the
infimum
of the quantity \frac{1}{2}\left[ S(\rho_{AE}) + S(\rho_{BE}) - S(\rho_E) - S(\rho_{ABE}) \right] over all possible choices of \rho_{ABE}.


Quantum Rényi entropies

Just as the Shannon entropy function is one member of the broader family of classical Rényi entropies, so too can the von Neumann entropy be generalized to the quantum Rényi entropies: S_\alpha(\rho) = \frac{1}{1-\alpha} \ln \operatorname{Tr}\left[ \rho^\alpha \right] = \frac{1}{1-\alpha} \ln \sum_{i=1}^N \lambda_i^\alpha. In the limit that \alpha \to 1, this recovers the von Neumann entropy. The quantum Rényi entropies are all additive for product states, and for any \alpha, the Rényi entropy S_\alpha vanishes for pure states and is maximized by the maximally mixed state. For any given state \rho, S_\alpha(\rho) is a continuous, nonincreasing function of the parameter \alpha. A weak version of subadditivity can be proven: S_\alpha(\rho_A) - S_0(\rho_B) \leq S_\alpha(\rho_{AB}) \leq S_\alpha(\rho_A) + S_0(\rho_B). Here, S_0 is the quantum version of the Hartley entropy, i.e., the logarithm of the
rank
of the density matrix.
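A minimal numerical sketch of the Rényi family (NumPy; the example state is an illustrative choice), showing the \alpha \to 1 limit and the nonincreasing behaviour in \alpha:

```python
import numpy as np

def renyi_entropy(rho, alpha):
    """Quantum Renyi entropy S_alpha(rho), with the alpha -> 1 case handled as the limit."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    if abs(alpha - 1.0) < 1e-12:
        return float(-np.sum(evals * np.log(evals)))   # von Neumann limit
    return float(np.log(np.sum(evals ** alpha)) / (1 - alpha))

rho = np.diag([0.7, 0.2, 0.1])

S1 = renyi_entropy(rho, 1.0)          # von Neumann entropy
S_near_1 = renyi_entropy(rho, 1.0001) # approaches S1 as alpha -> 1
S_half = renyi_entropy(rho, 0.5)
S2 = renyi_entropy(rho, 2.0)          # the "collision" entropy
```

The values satisfy S_{1/2} \geq S_1 \geq S_2, illustrating that S_\alpha is nonincreasing in \alpha, and a pure state gives zero for every \alpha.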


History

The
density matrix
was introduced, with different motivations, by von Neumann and by
Lev Landau
. The motivation that inspired Landau was the impossibility of describing a subsystem of a composite quantum system by a state vector. On the other hand, von Neumann introduced the density matrix in order to develop both quantum statistical mechanics and a theory of quantum measurements. He introduced the expression now known as von Neumann entropy by arguing that a probabilistic combination of pure states is analogous to a mixture of ideal gases. Von Neumann first published on the topic in 1927. His argument was built upon earlier work by
Albert Einstein
and Leo Szilard.
Max Delbrück
and Gert Molière proved the concavity and subadditivity properties of the von Neumann entropy in 1936. Quantum relative entropy was introduced by Hisaharu Umegaki in 1962. The subadditivity and triangle inequalities were proved in 1970 by Huzihiro Araki and
Elliott H. Lieb
. Strong subadditivity is a more difficult theorem. It was conjectured by Oscar Lanford and Derek Robinson in 1968. Lieb and Mary Beth Ruskai proved the theorem in 1973 (invited talk at the Conference in Honour of the 90th Birthday of Freeman Dyson, Institute of Advanced Studies, Nanyang Technological University, Singapore, 26–29 August 2013), using a matrix inequality proved earlier by Lieb.

