Entropy In Thermodynamics And Information Theory
The mathematical expressions for thermodynamic entropy in the statistical thermodynamics formulation established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s are similar to the information entropy of Claude Shannon and Ralph Hartley, developed in the 1940s.


Equivalence of form of the defining expressions

The defining expression for entropy in the theory of statistical mechanics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s is of the form:

: S = -k_\text{B} \sum_i p_i \ln p_i ,

where p_i is the probability of the microstate ''i'' taken from an equilibrium ensemble, and k_\text{B} is the Boltzmann constant.

The defining expression for entropy in the theory of information established by Claude E. Shannon in 1948 is of the form:

: H = -\sum_i p_i \log_b p_i ,

where p_i is the probability of the message m_i taken from the message space ''M'', and ''b'' is the base of the logarithm used. Common values of ''b'' are 2, Euler's number ''e'', and 10, and the corresponding unit of entropy is the shannon (or bit) for ''b'' = 2, the nat for ''b'' = ''e'', and the hartley for ''b'' = 10. Mathematically ''H'' may also be seen as an average information, taken over the message space, because when a certain message occurs with probability ''p_i'', the information quantity −log(''p_i'') (called the information content or self-information) is obtained.

If all the microstates are equiprobable (a microcanonical ensemble), the statistical thermodynamic entropy reduces to the form given by Boltzmann,

: S = k_\text{B} \ln W ,

where ''W'' is the number of microstates corresponding to the macroscopic thermodynamic state; ''S'' therefore depends on the thermodynamic state of the system, for example on its temperature. If all the messages are equiprobable, the information entropy reduces to the Hartley entropy

: H = \log_b |M| ,

where |M| is the cardinality of the message space ''M''.
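As an illustration of how the two expressions differ only by the constant k_B and the choice of logarithm base, the following minimal sketch (Python; the helper names and example probabilities are illustrative, not from the source) computes the Shannon entropy of a small distribution in shannons and in nats, the corresponding Gibbs entropy, and checks the equiprobable (Hartley/Boltzmann) special case.

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(p, base=2):
    """H = -sum p_i log_b p_i for a discrete distribution (zero terms are skipped)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(base)

def gibbs_entropy(p):
    """S = -k_B sum p_i ln p_i, in J/K."""
    return k_B * shannon_entropy(p, base=np.e)

# An arbitrary example distribution over four microstates/messages
p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p, base=2))      # 1.75 shannons (bits)
print(shannon_entropy(p, base=np.e))   # ~1.213 nats
print(gibbs_entropy(p))                # ~1.67e-23 J/K

# Equiprobable case: H reduces to log_b |M| and S to k_B ln W
W = 4
p_uniform = np.full(W, 1.0 / W)
assert np.isclose(shannon_entropy(p_uniform, base=2), np.log2(W))
assert np.isclose(gibbs_entropy(p_uniform), k_B * np.log(W))
```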
The logarithm in the thermodynamic definition is the natural logarithm. It can be shown that the Gibbs entropy formula, with the natural logarithm, reproduces all of the properties of the macroscopic classical thermodynamics of Rudolf Clausius. (See article: Entropy (statistical views).) The logarithm can also be taken to the natural base in the case of information entropy; this is equivalent to choosing to measure information in nats instead of the usual bits (or more formally, shannons). In practice, information entropy is almost always calculated using base-2 logarithms, but this distinction amounts to nothing other than a change in units. One nat is about 1.44 shannons.

For a simple compressible system that can only perform volume work, the first law of thermodynamics becomes

: dE = -p \, dV + T \, dS .

But one can equally well write this equation in terms of what physicists and chemists sometimes call the 'reduced' or dimensionless entropy, \sigma = S/k_\text{B}, so that

: dE = -p \, dV + k_\text{B} T \, d\sigma .

Just as ''S'' is conjugate to ''T'', so ''σ'' is conjugate to ''k''_B''T'' (the energy that is characteristic of ''T'' on a molecular scale).

Thus the definitions of entropy in statistical mechanics (the Gibbs entropy formula S = -k_\text{B} \sum_i p_i \ln p_i) and in classical thermodynamics (dS = \frac{\delta Q_\text{rev}}{T}, together with the fundamental thermodynamic relation) are equivalent for the microcanonical ensemble, and for statistical ensembles describing a thermodynamic system in equilibrium with a reservoir, such as the canonical ensemble, the grand canonical ensemble, and the isothermal–isobaric ensemble. This equivalence is commonly shown in textbooks. However, the equivalence between the thermodynamic definition of entropy and the Gibbs entropy is not general, but is instead an exclusive property of the generalized Boltzmann distribution. Furthermore, it has been shown that the Gibbs entropy of statistical mechanics is the only entropy that is equivalent to the classical thermodynamic entropy under a particular set of postulates.
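To make the canonical-ensemble equivalence concrete, here is a minimal numerical check (Python; the two-level system and its 0.02 eV gap are hypothetical example values): the Gibbs entropy −k_B Σ p_i ln p_i computed from Boltzmann weights agrees with the thermodynamic expression S = (⟨E⟩ − F)/T obtained from the partition function.

```python
import numpy as np

k_B = 1.380649e-23  # J/K

def canonical_entropies(energies, T):
    """Return (Gibbs entropy, thermodynamic entropy) for a canonical ensemble."""
    energies = np.asarray(energies, dtype=float)
    beta = 1.0 / (k_B * T)
    weights = np.exp(-beta * energies)
    Z = weights.sum()                       # partition function
    p = weights / Z                         # Boltzmann probabilities
    S_gibbs = -k_B * np.sum(p * np.log(p))  # statistical definition
    E_mean = np.sum(p * energies)           # internal energy
    F = -k_B * T * np.log(Z)                # Helmholtz free energy
    S_thermo = (E_mean - F) / T             # classical relation F = E - TS
    return S_gibbs, S_thermo

# Two-level system with a 0.02 eV gap at room temperature (illustrative numbers)
eV = 1.602176634e-19
S1, S2 = canonical_entropies([0.0, 0.02 * eV], T=300.0)
print(S1, S2)          # the two values coincide
assert np.isclose(S1, S2)
```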


Theoretical relationship

Despite the foregoing, there is a difference between the two quantities. The information entropy ''Η'' can be calculated for ''any'' probability distribution (if the "message" is taken to be that the event ''i'' which had probability ''p_i'' occurred, out of the space of the events possible), while the thermodynamic entropy ''S'' refers specifically to thermodynamic probabilities ''p_i''. The difference is more theoretical than actual, however, because any probability distribution can be approximated arbitrarily closely by some thermodynamic system.

Moreover, a direct connection can be made between the two. If the probabilities in question are the thermodynamic probabilities ''p_i'', the (reduced) Gibbs entropy ''σ'' can then be seen as simply the amount of Shannon information needed to define the detailed microscopic state of the system, given its macroscopic description. Or, in the words of G. N. Lewis, writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more". To be more concrete, in the discrete case using base-2 logarithms, the reduced Gibbs entropy is equal to the average of the minimum number of yes–no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.

Furthermore, the prescription to find the equilibrium distributions of statistical mechanics, such as the Boltzmann distribution, by maximising the Gibbs entropy subject to appropriate constraints (the Gibbs algorithm) can be seen as something not unique to thermodynamics, but as a principle of general relevance in statistical inference, if it is desired to find a maximally uninformative probability distribution subject to certain constraints on its averages. (These perspectives are explored further in the article Maximum entropy thermodynamics.)
The Shannon entropy in information theory is sometimes expressed in units of bits per symbol. The physical entropy may be on a "per quantity" basis (''h''), which is called "intensive" entropy, instead of the usual total entropy, which is called "extensive" entropy. The "shannons" of a message (''Η'') are its total "extensive" information entropy, equal to ''h'' times the number of symbols in the message. A direct and physically real relationship between ''h'' and ''S'' can be found by assigning a symbol to each microstate that occurs per mole, kilogram, volume, or particle of a homogeneous substance, then calculating the ''h'' of these symbols. By theory or by observation, the symbols (microstates) will occur with different probabilities, and this will determine ''h''. If there are ''N'' moles, kilograms, volumes, or particles of the unit substance, the relationship between ''h'' (in bits per unit substance) and the physical extensive entropy in nats is:

: S = k_\mathrm{B} \ln(2) \, N h ,

where ln(2) is the conversion factor from the base 2 of Shannon entropy to the natural base ''e'' of physical entropy. ''N h'' is the amount of information in bits needed to describe the state of a physical system with entropy ''S''.
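As a rough numerical illustration of the relation S = k_B ln(2) N h, the sketch below (Python; function names and the 1 J/K example are illustrative, not from the source) converts a physical entropy in J/K into the equivalent number of bits N h, and back.

```python
import math

k_B = 1.380649e-23  # J/K

def entropy_to_bits(S_joules_per_kelvin):
    """Number of bits N*h equivalent to a physical entropy S, via S = k_B ln(2) N h."""
    return S_joules_per_kelvin / (k_B * math.log(2))

def bits_to_entropy(n_bits):
    """Physical entropy in J/K equivalent to n_bits of information."""
    return k_B * math.log(2) * n_bits

# Example: an entropy of 1 J/K corresponds to roughly 1.0e23 bits of
# information needed to specify the microstate given the macrostate.
print(entropy_to_bits(1.0))      # ~1.045e23 bits
print(bits_to_entropy(1.0))      # ~9.57e-24 J/K for a single bit
```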
Landauer's principle demonstrates the reality of this by stating that the minimum energy ''E'' required (and therefore heat ''Q'' generated) by an ideally efficient memory change or logic operation that irreversibly erases or merges ''N h'' bits of information will be ''S'' times the temperature,

: E = Q = T k_\mathrm{B} \ln(2) \, N h ,

where ''h'' is in informational bits and ''E'' and ''Q'' are in physical joules. This has been experimentally confirmed. Temperature is a measure of the average kinetic energy per particle in an ideal gas (kelvins = joules/''k''_B), so the J/K units of ''k''_B are in this sense dimensionless (joule/joule); ''k''_B is the conversion factor from energy expressed in kelvins to energy expressed in joules for an ideal gas. If kinetic energy measurements per particle of an ideal gas were expressed in joules instead of kelvins, ''k''_B in the above equations would be replaced by 3/2. This shows that ''S'' is a true statistical measure of microstates that does not have a fundamental physical unit other than the units of information, in this case nats, which is just a statement of which logarithm base was chosen by convention.
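A quick back-of-the-envelope evaluation of Landauer's bound at room temperature, under the same assumptions (Python; the 300 K temperature and gigabyte example are illustrative):

```python
import math

k_B = 1.380649e-23  # J/K

def landauer_limit(T, n_bits=1):
    """Minimum heat (in joules) dissipated by irreversibly erasing n_bits at temperature T."""
    return k_B * T * math.log(2) * n_bits

# Erasing a single bit at room temperature (300 K)
print(landauer_limit(300.0))          # ~2.87e-21 J per bit
# Erasing a gigabyte (8e9 bits) still dissipates only ~2.3e-11 J at this limit
print(landauer_limit(300.0, 8e9))
```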


Information is physical


Szilard's engine

A physical thought experiment demonstrating how just the possession of information might in principle have thermodynamic consequences was established in 1929 by Leó Szilárd, in a refinement of the famous Maxwell's demon scenario. Consider Maxwell's set-up, but with only a single gas particle in a box. If the supernatural demon knows which half of the box the particle is in (equivalent to a single bit of information), it can close a shutter between the two halves of the box, push a piston unopposed into the empty half of the box, and then extract k_\text{B} T \ln 2 joules of useful work if the shutter is opened again. The particle can then be left to isothermally expand back to its original equilibrium occupied volume. In just the right circumstances, therefore, the possession of a single bit of Shannon information (a single bit of negentropy, in Brillouin's term) really does correspond to a reduction in the entropy of the physical system. The global entropy is not decreased, but information-to-free-energy conversion is possible.

Using a phase-contrast microscope equipped with a high-speed camera connected to a computer as ''demon'', the principle has actually been demonstrated. In this experiment, information-to-energy conversion is performed on a Brownian particle by means of ''feedback control''; that is, by synchronizing the work given to the particle with the information obtained on its position. Computing energy balances for different feedback protocols has confirmed that the Jarzynski equality requires a generalization that accounts for the amount of information involved in the feedback.
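The k_B T ln 2 figure comes from the isothermal expansion of the one-particle "gas" from half the box back to the full volume; a minimal sketch of that calculation (Python, ideal-gas assumption; the 300 K temperature and unit volumes are illustrative):

```python
import math

k_B = 1.380649e-23  # J/K

def isothermal_work(T, V_initial, V_final, n_particles=1):
    """Work extracted from an ideal gas of n particles expanding isothermally: W = n k_B T ln(Vf/Vi)."""
    return n_particles * k_B * T * math.log(V_final / V_initial)

# Szilard engine: one particle, confined (for free, given the demon's bit) to
# half the box, then allowed to expand isothermally back to the full volume.
T = 300.0
print(isothermal_work(T, V_initial=0.5, V_final=1.0))   # ~2.87e-21 J = k_B T ln 2
print(k_B * T * math.log(2))                            # same value
```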


Landauer's principle

In fact one can generalise: any information that has a physical representation must somehow be embedded in the statistical mechanical degrees of freedom of a physical system. Thus, Rolf Landauer argued in 1961, if one were to imagine starting with those degrees of freedom in a thermalised state, there would be a real reduction in thermodynamic entropy if they were then re-set to a known state. This can only be achieved under information-preserving, microscopically deterministic dynamics if the uncertainty is somehow dumped somewhere else, i.e. if the entropy of the environment (or the non-information-bearing degrees of freedom) is increased by at least an equivalent amount, as required by the Second Law, by gaining an appropriate quantity of heat: specifically ''kT'' ln 2 of heat for every bit of randomness erased.

On the other hand, Landauer argued, there is no thermodynamic objection to a logically reversible operation potentially being achieved in a physically reversible way in the system. It is only logically irreversible operations, for example the erasing of a bit to a known state or the merging of two computation paths, that must be accompanied by a corresponding entropy increase. When information is physical, all processing of its representations, i.e. generation, encoding, transmission, decoding and interpretation, are natural processes in which entropy increases through the consumption of free energy.

Applied to the Maxwell's demon/Szilard engine scenario, this suggests that it might be possible to "read" the state of the particle into a computing apparatus with no entropy cost; but ''only'' if the apparatus has already been SET into a known state, rather than being in a thermalised state of uncertainty. To SET (or RESET) the apparatus into this state will cost all the entropy that can be saved by knowing the state of Szilard's particle.
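The distinction between logically reversible and logically irreversible operations can be stated concretely: an operation is logically reversible exactly when its truth table is injective, so the input can be recovered from the output. A small sketch (Python; the gate definitions and helper name are illustrative) checks this for a few familiar gates:

```python
from itertools import product

def is_logically_reversible(gate, n_inputs):
    """A gate is logically reversible iff distinct inputs always give distinct outputs (injective)."""
    inputs = list(product((0, 1), repeat=n_inputs))
    outputs = [gate(*bits) for bits in inputs]
    return len(set(outputs)) == len(outputs)

NOT = lambda a: (1 - a,)
ERASE = lambda a: (0,)                      # reset a bit to a known state
AND = lambda a, b: (a & b,)                 # merges computation paths
CNOT = lambda a, b: (a, a ^ b)              # classical controlled-NOT

print(is_logically_reversible(NOT, 1))      # True  -> no Landauer cost required
print(is_logically_reversible(ERASE, 1))    # False -> costs at least kT ln 2 per bit
print(is_logically_reversible(AND, 2))      # False
print(is_logically_reversible(CNOT, 2))     # True
```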


Negentropy

Shannon entropy has been related by physicist Léon Brillouin to a concept sometimes called negentropy. In 1953, Brillouin derived a general equation stating that changing an information bit value requires at least ''kT'' ln(2) of energy. This is the same energy as the work Leo Szilard's engine produces in the idealistic case, which in turn equals the quantity found by Landauer. In his book, he further explored this problem, concluding that any cause of a bit value change (measurement, decision about a yes/no question, erasure, display, etc.) will require the same amount, ''kT'' ln(2), of energy. Consequently, acquiring information about a system's microstates is associated with an entropy production, while erasure yields entropy production only when the bit value is changing. Setting up a bit of information in a sub-system originally in thermal equilibrium results in a local entropy reduction. However, there is no violation of the second law of thermodynamics, according to Brillouin, since a reduction in any local system's thermodynamic entropy results in an increase in thermodynamic entropy elsewhere. In this way, Brillouin clarified the meaning of negentropy, which had been considered controversial because its earlier understanding could yield a Carnot efficiency higher than one.

Additionally, the relationship between energy and information formulated by Brillouin has been proposed as a connection between the number of bits the brain processes and the energy it consumes: Collell and Fauquet argued that De Castro analytically found the Landauer limit as the thermodynamic lower bound for brain computations. However, even though evolution is supposed to have "selected" the most energetically efficient processes, the physical lower bounds are not realistic quantities in the brain. Firstly, because the minimum processing unit considered in physics is the atom/molecule, which is distant from the actual way that the brain operates; and secondly, because neural networks incorporate important redundancy and noise factors that greatly reduce their efficiency. Laughlin et al. were the first to provide explicit quantities for the energetic cost of processing sensory information. Their findings in blowflies revealed that for visual sensory data, the cost of transmitting one bit of information is around 5 × 10⁻¹⁴ joules, or equivalently 10⁴ ATP molecules. Thus, neural processing efficiency is still far from Landauer's limit of ''kT'' ln(2) J, but as a curious fact, it is still much more efficient than modern computers.

In 2009, Mahulikar & Herwig redefined thermodynamic negentropy as the specific entropy deficit of the dynamically ordered sub-system relative to its surroundings. This definition enabled the formulation of the ''Negentropy Principle'', which is mathematically shown to follow from the second law of thermodynamics while order exists.
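To put the blowfly figure in perspective, the following arithmetic sketch (Python) compares the 5 × 10⁻¹⁴ J per bit reported above with the Landauer limit; the 310 K physiological temperature is an assumption introduced here for the comparison.

```python
import math

k_B = 1.380649e-23   # J/K
T = 310.0            # approximate physiological temperature, K (assumed)

landauer_per_bit = k_B * T * math.log(2)     # minimum cost per bit at this T
blowfly_per_bit = 5e-14                      # J per bit of visual information (Laughlin et al.)

print(landauer_per_bit)                      # ~2.97e-21 J
print(blowfly_per_bit / landauer_per_bit)    # ~1.7e7: millions of times above the limit
```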


Quantum theory

Hirschman showed (cf. Hirschman uncertainty) that Heisenberg's uncertainty principle can be expressed as a particular lower bound on the sum of the classical distribution entropies of the ''quantum observable'' probability distributions of a quantum mechanical state, the square of the wave-function, in coordinate and also in momentum space, when expressed in Planck units. The resulting inequalities provide a tighter bound on the uncertainty relations of Heisenberg.

It is meaningful to assign a "joint entropy" in this sense, even though positions and momenta are quantum conjugate variables and are therefore not jointly observable; mathematically, the two distributions have to be treated as if they formed a joint distribution. Note that this joint entropy is not equivalent to the von Neumann entropy, −Tr ''ρ'' ln ''ρ'' = −⟨ln ''ρ''⟩. Hirschman's entropy is said to account for the ''full information content of a mixture of quantum states''.

(Dissatisfaction with the von Neumann entropy from quantum information points of view has been expressed by Stotland, Pomeransky, Bachmat and Cohen, who have introduced a yet different definition of entropy that reflects the inherent uncertainty of quantum mechanical states. This definition allows a distinction between the minimum uncertainty entropy of pure states and the excess statistical entropy of mixtures.)
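As a concrete check of the entropic form of the uncertainty principle, the sketch below (Python, ℏ = 1 units; the ln(eπ) value is the standard entropic bound in these units, stated here as an assumption, and the chosen wave-packet width is arbitrary) evaluates the position- and momentum-space differential entropies of a Gaussian wave packet, which saturates the bound:

```python
import numpy as np

# Minimum-uncertainty Gaussian wave packet (hbar = 1 units)
sigma_x = 1.3                      # arbitrary position-space width
sigma_p = 1.0 / (2.0 * sigma_x)    # corresponding momentum-space width

def differential_entropy(sigma, half_width=20.0, n=20001):
    """-∫ rho ln(rho) du for a normal density with standard deviation sigma (in nats)."""
    u = np.linspace(-half_width * sigma, half_width * sigma, n)
    rho = np.exp(-u**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    integrand = np.zeros_like(rho)
    mask = rho > 0
    integrand[mask] = -rho[mask] * np.log(rho[mask])
    return np.trapz(integrand, u)

H_x = differential_entropy(sigma_x)
H_p = differential_entropy(sigma_p)
bound = np.log(np.e * np.pi)       # entropic uncertainty bound, ~2.1447 nats

print(H_x + H_p, bound)            # the Gaussian saturates the bound
assert H_x + H_p >= bound - 1e-6
```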


See also


References


Further reading



External links


* Information Processing and Thermodynamic Entropy, Stanford Encyclopedia of Philosophy.
* ''An Intuitive Guide to the Concept of Entropy Arising in Various Sectors of Science'', a wikibook on the interpretation of the concept of entropy.