In information theory and statistics, negentropy is used as a measure of distance to normality. It is also known as negative entropy or syntropy.
Etymology
The concept and phrase "''negative entropy''" was introduced by Erwin Schrödinger in his 1944 book ''What is Life?'' Later, French physicist Léon Brillouin shortened the phrase to ''negentropy''. In 1974, Albert Szent-Györgyi proposed replacing the term ''negentropy'' with ''syntropy''. That term may have originated in the 1940s with the Italian mathematician Luigi Fantappiè, who tried to construct a unified theory of biology and physics. Buckminster Fuller tried to popularize this usage, but ''negentropy'' remains common.
In a note to ''What is Life?'', Schrödinger explained his use of this phrase.
Information theory
In information theory and statistics, negentropy is used as a measure of distance to normality. Out of all probability distributions with a given mean and variance, the Gaussian or normal distribution is the one with the highest entropy. Negentropy measures the difference in entropy between a given distribution and the Gaussian distribution with the same mean and variance. Thus, negentropy is always nonnegative, is invariant under any linear invertible change of coordinates, and vanishes if and only if the signal is Gaussian.
Negentropy is defined as
:J(p_x) = S(\varphi_x) - S(p_x)
where S(\varphi_x) is the differential entropy of a normal distribution \varphi_x with the same mean and variance as p_x, and S(p_x) is the differential entropy of the signal, with p_x as its probability density function:
:S(p_x) = -\int p_x(u) \log p_x(u) \, du
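As an illustration (an added sketch, not part of the original article), negentropy can be estimated from samples by comparing the closed-form entropy of a Gaussian with the same variance, ½·ln(2πeσ²), against a nonparametric estimate of the differential entropy of the data. The following minimal Python example assumes NumPy; the histogram-based entropy estimator, bin count, and sample sizes are illustrative choices rather than a prescribed method.
<syntaxhighlight lang="python">
import numpy as np

def gaussian_entropy(var):
    """Closed-form differential entropy of a Gaussian: 0.5 * ln(2*pi*e*var)."""
    return 0.5 * np.log(2 * np.pi * np.e * var)

def entropy_estimate(samples, bins=200):
    """Crude histogram estimate of the differential entropy S(p) = -∫ p ln p."""
    density, edges = np.histogram(samples, bins=bins, density=True)
    widths = np.diff(edges)
    mask = density > 0
    return -np.sum(density[mask] * np.log(density[mask]) * widths[mask])

def negentropy_estimate(samples):
    """J(p) = S(Gaussian with same variance) - S(p); nonnegative up to estimation error."""
    return gaussian_entropy(np.var(samples)) - entropy_estimate(samples)

rng = np.random.default_rng(0)
print(negentropy_estimate(rng.normal(size=100_000)))          # ≈ 0 for Gaussian data
print(negentropy_estimate(rng.uniform(-1, 1, size=100_000)))  # ≈ 0.18 for uniform data
</syntaxhighlight>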
Negentropy is used in statistics and signal processing. It is related to network entropy, which is used in independent component analysis.
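For context, an illustrative aside (not from the original article): practical ICA algorithms such as FastICA separate linearly mixed signals by maximizing approximations of the negentropy of the estimated components, exploiting the fact that negentropy vanishes only for Gaussian signals. The sketch below assumes scikit-learn and NumPy; the source signals and mixing matrix are made-up examples.
<syntaxhighlight lang="python">
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)

# Two clearly non-Gaussian sources: a sinusoid and a square wave.
sources = np.c_[np.sin(2 * np.pi * t), np.sign(np.sin(3 * np.pi * t))]

# Mix the sources with an arbitrary (hypothetical) mixing matrix.
mixing = np.array([[1.0, 0.5],
                   [0.4, 1.2]])
observed = sources @ mixing.T

# FastICA recovers the components by maximizing a negentropy approximation
# (its default 'logcosh' contrast function is such an approximation).
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(observed)  # estimated sources, up to scale and permutation
</syntaxhighlight>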
The negentropy of a distribution is equal to the Kullback–Leibler divergence between p_x and a Gaussian distribution with the same mean and variance as p_x (see the maximum-entropy property of the normal distribution for a proof):
:J(p_x) = D_\text{KL}(p_x \parallel \varphi_x)
In particular, it is always nonnegative (unlike differential entropy, which can be negative).
Correlation between statistical negentropy and Gibbs' free energy

There is a physical quantity closely linked to free energy (free enthalpy), with a unit of entropy and isomorphic to the negentropy known in statistics and information theory. In 1873, Willard Gibbs created a diagram illustrating the concept of free energy corresponding to free enthalpy. On the diagram one can see the quantity called capacity for entropy. This quantity is the amount by which the entropy of a system may be increased without changing its internal energy or increasing its volume. In other words, it is the difference between the maximum possible entropy, under the assumed conditions, and the actual entropy. It corresponds exactly to the definition of negentropy adopted in statistics and information theory. A similar physical quantity was introduced in 1869 by Massieu for the isothermal process (the two quantities differ only in sign) and later by Planck for the isothermal-isobaric process. More recently, the Massieu–Planck thermodynamic potential, known also as ''free entropy'', has been shown to play an important role in the so-called entropic formulation of statistical mechanics, applied among other fields in molecular biology and to non-equilibrium thermodynamic processes.
::J = S_\max - S = -\Phi = -k \ln Z
::where:
::S is entropy
::J is negentropy (Gibbs' "capacity for entropy")
::\Phi is the Massieu potential
::Z is the partition function
::k is the Boltzmann constant
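A discrete illustration (added here, not part of the original article; it assumes NumPy and SciPy, and the four-state distribution is hypothetical): for a system whose N microstates would be equally occupied at maximum entropy, the "capacity for entropy" S_max − S equals k times the Kullback–Leibler divergence between the actual occupation probabilities and the uniform distribution, mirroring the statistical definition of negentropy given above.
<syntaxhighlight lang="python">
import numpy as np
from scipy.constants import Boltzmann as k
from scipy.stats import entropy  # returns the KL divergence when given two distributions

# Hypothetical discrete system with four microstates and a non-uniform occupation.
p = np.array([0.4, 0.3, 0.2, 0.1])
n = len(p)

s_actual = -k * np.sum(p * np.log(p))     # Gibbs entropy of the actual distribution
s_max = k * np.log(n)                     # maximum entropy (uniform occupation)

capacity_for_entropy = s_max - s_actual   # J = S_max - S
kl_form = k * entropy(p, np.ones(n) / n)  # k * D_KL(p || uniform), the same quantity
print(capacity_for_entropy, kl_form)      # both ≈ 1.5e-24 J/K, and always nonnegative
</syntaxhighlight>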
In particular, mathematically the negentropy (the negative entropy function, in physics interpreted as free entropy) is the convex conjugate of LogSumExp (in physics interpreted as the free energy).
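To illustrate this duality (an added numerical check, not part of the original article): for a probability vector p, the convex conjugate of LogSumExp, sup_x [⟨p, x⟩ − log Σ_i exp(x_i)], equals the negative entropy Σ_i p_i log p_i, with the supremum attained at x_i = log p_i (up to an additive constant). The snippet below assumes NumPy and SciPy; the particular probability vector is arbitrary.
<syntaxhighlight lang="python">
import numpy as np
from scipy.special import logsumexp

p = np.array([0.2, 0.3, 0.5])         # an arbitrary probability vector

# Negative entropy of p (the claimed value of the convex conjugate at p).
neg_entropy = np.sum(p * np.log(p))

# Evaluate <p, x> - logsumexp(x) at the maximizer x = log(p);
# logsumexp(log(p)) = log(sum(p)) = 0, so the value reduces to sum(p * log(p)).
x_star = np.log(p)
conjugate_value = p @ x_star - logsumexp(x_star)

print(neg_entropy, conjugate_value)   # both ≈ -1.0297
</syntaxhighlight>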
Brillouin's negentropy principle of information
In 1953, Léon Brillouin derived a general equation stating that changing an information bit value requires at least kT ln 2 of energy. This is the same energy as the work Leó Szilárd's engine produces in the idealized case. In his book ''Science and Information Theory'' (Dover, 1956), he further explored this problem, concluding that any cause of this bit-value change (measurement, decision about a yes/no question, erasure, display, etc.) will require the same amount of energy.
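As a numerical aside (added for illustration, not part of the original article): at room temperature this lower bound is on the order of 10⁻²¹ joules per bit. A short Python check, assuming SciPy's physical constants and an illustrative temperature of 300 K:
<syntaxhighlight lang="python">
from math import log
from scipy.constants import Boltzmann  # k ≈ 1.380649e-23 J/K

T = 300.0                              # room temperature in kelvin (illustrative choice)
minimum_energy = Boltzmann * T * log(2)
print(minimum_energy)                  # ≈ 2.87e-21 J per bit-value change
</syntaxhighlight>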
See also
* Exergy
* Free entropy
* Entropy in thermodynamics and information theory
References