Joint quantum entropy

The joint quantum entropy generalizes the classical joint entropy to the context of quantum information theory. Intuitively, given two quantum states \rho and \sigma, represented as density operators that are subparts of a quantum system, the joint quantum entropy is a measure of the total uncertainty or entropy of the joint system. It is written S(\rho,\sigma) or H(\rho,\sigma), depending on the notation being used for the von Neumann entropy. Like other entropies, the joint quantum entropy is measured in bits, i.e. the logarithm is taken in base 2. In this article, we will use S(\rho,\sigma) for the joint quantum entropy.


Background

In information theory, for any classical random variable X, the classical Shannon entropy H(X) is a measure of how uncertain we are about the outcome of X. For example, if X is a probability distribution concentrated at one point, the outcome of X is certain and therefore its entropy H(X) = 0. At the other extreme, if X is the uniform probability distribution with n possible values, intuitively one would expect X to be associated with the most uncertainty. Indeed, such uniform probability distributions have the maximum possible entropy H(X) = \log_2(n).
In quantum information theory, the notion of entropy is extended from probability distributions to quantum states, or density matrices. For a state \rho, the von Neumann entropy is defined by

:S(\rho) = -\operatorname{Tr}(\rho \log \rho).

Applying the spectral theorem, or the Borel functional calculus for infinite dimensional systems, we see that it generalizes the classical entropy. The physical meaning remains the same. A maximally mixed state, the quantum analog of the uniform probability distribution, has maximum von Neumann entropy. On the other hand, a pure state, or a rank one projection, has zero von Neumann entropy. We write the von Neumann entropy as S(\rho) (or sometimes H(\rho)).
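As a concrete illustration, the von Neumann entropy can be computed numerically from the eigenvalues of a density matrix, exactly as the spectral-theorem argument above suggests. The following is a minimal sketch using NumPy; the helper name `von_neumann_entropy` is our own, not from any library:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), in bits, computed via the spectral
    theorem: diagonalize rho and sum -p log2(p) over nonzero eigenvalues."""
    p = np.linalg.eigvalsh(rho)      # rho is Hermitian
    p = p[p > 1e-12]                 # discard numerical zeros
    return float(-np.sum(p * np.log2(p)))

# A pure state (rank-one projection) has zero entropy.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])

# The maximally mixed qubit state I/2 attains the maximum, log2(2) = 1 bit.
mixed = np.eye(2) / 2
```

Evaluating the helper on these two states reproduces the extremes discussed above: zero entropy for the pure state, one bit for the maximally mixed qubit.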


Definition

Given a quantum system with two subsystems ''A'' and ''B'', the term joint quantum entropy simply refers to the von Neumann entropy of the combined system. This is to distinguish it from the entropy of the subsystems. In symbols, if the combined system is in state \rho^{AB}, the joint quantum entropy is then

:S(\rho^A,\rho^B) = S(\rho^{AB}) = -\operatorname{Tr}(\rho^{AB}\log(\rho^{AB})).

Each subsystem has its own entropy. The states of the subsystems are given by the partial trace operation.
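As a sketch of how the reduced states arise in practice, the partial trace over subsystem B can be implemented in NumPy with a reshape-and-trace trick; `partial_trace_B` is an illustrative helper, not a library function:

```python
import numpy as np

def partial_trace_B(rho_AB, dA, dB):
    """Reduced state rho^A = Tr_B(rho^AB): reshape the (dA*dB x dA*dB)
    matrix into a rank-4 tensor (dA, dB, dA, dB) and trace out the
    two B indices."""
    t = rho_AB.reshape(dA, dB, dA, dB)
    return np.trace(t, axis1=1, axis2=3)

# Sanity check: for a product state rho^AB = rho^A (x) rho^B,
# the partial trace over B recovers rho^A exactly.
rho_A = np.array([[0.75, 0.0], [0.0, 0.25]])
rho_B = np.eye(2) / 2
rho_AB = np.kron(rho_A, rho_B)
```

For this product state, `partial_trace_B(rho_AB, 2, 2)` returns `rho_A`, and the result is again a valid density matrix with unit trace.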


Properties

The classical joint entropy is always at least equal to the entropy of each individual system. This is not the case for the joint quantum entropy. If the quantum state \rho^{AB} exhibits quantum entanglement, then the entropy of each subsystem may be larger than the joint entropy. This is equivalent to the fact that the conditional quantum entropy may be negative, while the classical conditional entropy may never be. Consider a maximally entangled state such as a Bell state. If \rho^{AB} is a Bell state, say,

:\left|\Psi\right\rangle = \frac{1}{\sqrt{2}}\left(|00\rangle + |11\rangle\right),

then the total system is a pure state, with entropy 0, while each individual subsystem is a maximally mixed state, with maximum von Neumann entropy \log_2 2 = 1. Thus the joint entropy of the combined system is less than that of its subsystems. This is because for entangled states, definite states cannot be assigned to the subsystems, resulting in positive entropy. Notice that the above phenomenon cannot occur if a state is a separable pure state. In that case, the reduced states of the subsystems are also pure, so all entropies are zero.
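The Bell-state example can be checked numerically. The sketch below (using NumPy, with an illustrative `entropy` helper) builds the Bell state, and one can verify that the joint entropy is 0 while each reduced state is maximally mixed with entropy 1 bit:

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits, from the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

# Bell state |Psi> = (|00> + |11>)/sqrt(2) as a vector in C^4.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = np.outer(psi, psi)    # pure state, so S(rho^AB) = 0

# Reduced state of A: trace out B via reshape-and-trace.
rho_A = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=1, axis2=3)
# rho_A equals I/2, the maximally mixed qubit, so S(rho^A) = 1 bit.
```

This is exactly the inversion of the classical inequality: the whole has less entropy than its parts.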


Relations to other entropy measures

The joint quantum entropy S(\rho^{AB}) can be used to define the conditional quantum entropy:

:S(\rho^A|\rho^B) \ \stackrel{\mathrm{def}}{=}\ S(\rho^A,\rho^B) - S(\rho^B)

and the quantum mutual information:

:I(\rho^A:\rho^B) \ \stackrel{\mathrm{def}}{=}\ S(\rho^A) + S(\rho^B) - S(\rho^A,\rho^B)

These definitions parallel the use of the classical joint entropy to define the conditional entropy and mutual information.
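For the Bell state of the previous section, these definitions give a negative conditional entropy and a mutual information of 2 bits. A small NumPy sketch (helper names are our own) makes this concrete:

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits, via the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

# Bell state |Psi> = (|00> + |11>)/sqrt(2) on two qubits.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = np.outer(psi, psi)
t = rho_AB.reshape(2, 2, 2, 2)
rho_A = np.trace(t, axis1=1, axis2=3)   # trace out B
rho_B = np.trace(t, axis1=0, axis2=2)   # trace out A

S_AB, S_A, S_B = entropy(rho_AB), entropy(rho_A), entropy(rho_B)

cond_A_given_B = S_AB - S_B             # S(A|B) = 0 - 1 = -1: negative!
mutual_info = S_A + S_B - S_AB          # I(A:B) = 1 + 1 - 0 = 2 bits
```

The negative conditional entropy has no classical counterpart and is the signature of entanglement noted in the Properties section.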


See also

* Quantum relative entropy
* Quantum mutual information

