Negentropy

In information theory and statistics, negentropy is used as a measure of distance to normality. The concept and phrase "negative entropy" was introduced by Erwin Schrödinger in his 1944 popular-science book ''What Is Life?''. Later, Léon Brillouin shortened the phrase to ''negentropy''. In 1974, Albert Szent-Györgyi proposed replacing the term ''negentropy'' with ''syntropy''. That term may have originated in the 1940s with the Italian mathematician Luigi Fantappiè, who tried to construct a unified theory of biology and physics. Buckminster Fuller tried to popularize this usage, but ''negentropy'' remains common. In a note to ''What Is Life?'', Schrödinger explained his use of the phrase.


Information theory

In information theory and statistics, negentropy is used as a measure of distance to normality. Out of all distributions with a given mean and variance, the normal or Gaussian distribution is the one with the highest entropy. Negentropy measures the difference in entropy between a given distribution and the Gaussian distribution with the same mean and variance. Thus, negentropy is always nonnegative, is invariant under any linear invertible change of coordinates, and vanishes if and only if the signal is Gaussian. Negentropy is defined as

:J(p_x) = S(\varphi_x) - S(p_x)\,

where S(\varphi_x) is the differential entropy of the Gaussian density with the same mean and variance as p_x, and S(p_x) is the differential entropy of p_x:

:S(p_x) = - \int p_x(u) \log p_x(u) \, du
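For example, an exponential density p_x(u) = \lambda e^{-\lambda u} (for u \ge 0) has differential entropy 1 - \log \lambda and variance 1/\lambda^2, so its negentropy is

:J(p_x) = \tfrac{1}{2}\log(2 \pi e / \lambda^2) - (1 - \log \lambda) = \tfrac{1}{2}\log(2\pi) - \tfrac{1}{2} \approx 0.419 nats,

independently of the rate \lambda, as invariance under linear changes of coordinates requires.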
Negentropy is used in statistics and signal processing. It is related to network entropy, which is used in independent component analysis. The negentropy of a distribution is equal to the Kullback–Leibler divergence between p_x and a Gaussian distribution with the same mean and variance as p_x; in particular, it is therefore always nonnegative.
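In practice, J can be estimated from data by subtracting a nonparametric entropy estimate from the closed-form Gaussian entropy \tfrac{1}{2}\log(2 \pi e \sigma^2). Below is a minimal sketch in Python, not a definitive implementation; it assumes NumPy and SciPy, with scipy.stats.differential_entropy (available since SciPy 1.6) supplying the entropy estimate.

 import numpy as np
 from scipy.stats import differential_entropy

 def negentropy(samples):
     """Estimate J(p) = S(Gaussian with same variance) - S(p), in nats."""
     var = np.var(samples, ddof=1)
     s_gauss = 0.5 * np.log(2.0 * np.pi * np.e * var)  # Gaussian differential entropy
     s_est = differential_entropy(samples)             # nonparametric estimate of S(p)
     return s_gauss - s_est

 rng = np.random.default_rng(0)
 print(negentropy(rng.normal(size=100_000)))       # near 0: a Gaussian has zero negentropy
 print(negentropy(rng.exponential(size=100_000)))  # near 0.419 nats, matching the example above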


Correlation between statistical negentropy and Gibbs' free energy

There is a physical quantity closely linked to free energy (free enthalpy) that has a unit of entropy and is isomorphic to the negentropy known in statistics and information theory. In 1873, Willard Gibbs created a diagram illustrating the concept of free energy corresponding to free enthalpy. On the diagram one can see the quantity called the capacity for entropy. This quantity is the amount by which the entropy of a system may be increased without changing its internal energy or increasing its volume; in other words, it is the difference between the maximum possible entropy, under the assumed conditions, and the actual entropy. It corresponds exactly to the definition of negentropy adopted in statistics and information theory. A similar physical quantity was introduced in 1869 by Massieu for the isothermal process (the two quantities differ only in sign) and then by Planck for the isothermal-isobaric process. More recently, the Massieu–Planck thermodynamic potential, known also as ''free entropy'', has been shown to play a great role in the so-called entropic formulation of statistical mechanics, applied among other fields in molecular biology and in thermodynamic non-equilibrium processes.

:: J = S_\max - S = -\Phi = -k \ln Z\,

::where:
::S is entropy
::J is negentropy (Gibbs' "capacity for entropy")
::\Phi is the Massieu potential
::Z is the partition function
::k is the Boltzmann constant
In particular, mathematically the negentropy (the negative entropy function, in physics interpreted as free entropy) is the convex conjugate of LogSumExp (in physics interpreted as the free energy).
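This can be verified directly. Restricting the negative entropy f(p) = \sum_i p_i \log p_i to the probability simplex, its convex conjugate is

:f^*(x) = \sup_p \left( \sum_i x_i p_i - \sum_i p_i \log p_i \right) = \log \sum_i e^{x_i},

with the supremum attained at the softmax weights p_i = e^{x_i} / \sum_j e^{x_j}; the conjugate is exactly the LogSumExp function.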


Brillouin's negentropy principle of information

In 1953, Léon Brillouin derived a general equation stating that changing the value of an information bit requires at least kT\ln 2 of energy. This is the same energy as the work produced by Leó Szilárd's engine in the idealized case. In his book ''Science and Information Theory'' (Dover, 1956), he explored this problem further, concluding that any cause of this bit-value change (measurement, decision about a yes/no question, erasure, display, etc.) requires the same amount of energy.
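At room temperature this bound is tiny on everyday scales. A quick numerical check in Python (a sketch; the 300 K temperature is an illustrative choice):

 import math

 k_B = 1.380649e-23  # Boltzmann constant in J/K (exact under the 2019 SI redefinition)
 T = 300.0           # illustrative room temperature in kelvin

 E_bit = k_B * T * math.log(2)  # minimum energy to change one bit: kT ln 2
 print(f"{E_bit:.3e} J")        # about 2.87e-21 J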


See also

* Exergy
* Free entropy
* Entropy in thermodynamics and information theory

