In information theory, the limiting density of discrete points is an adjustment to the formula of Claude Shannon for differential entropy. It was formulated by Edwin Thompson Jaynes to address defects in the initial definition of differential entropy.
Definition
Shannon originally wrote down the following formula for the entropy of a continuous distribution, known as differential entropy:

:h(X) = -\int p(x) \log p(x) \, dx
Unlike Shannon's formula for the discrete entropy, however, this is not the result of any derivation (Shannon simply replaced the summation symbol in the discrete version with an integral), and it lacks many of the properties that make the discrete entropy a useful measure of uncertainty. In particular, it is not invariant under a change of variables and can become negative. In addition, it is not even dimensionally correct: since p(x) dx would be dimensionless, p(x) must have units of 1/dx, which means that the argument to the logarithm is not dimensionless as required.
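Both defects can be checked numerically. The following Python sketch is an illustration rather than part of the original article; the grid resolution and the uniform densities are arbitrary choices made for the example. It approximates h(X) by a Riemann sum, shows that the differential entropy of a uniform distribution on [0, 1/10] is negative, and shows that the simple rescaling y = 2x shifts the value by log 2 even though the two variables carry the same information.

```python
import numpy as np

def differential_entropy(pdf, grid):
    """Riemann-sum approximation of h(X) = -integral of p(x) log p(x) dx."""
    dx = grid[1] - grid[0]
    p = pdf(grid)
    # log(1) = 0 wherever the density vanishes, so those points contribute 0.
    integrand = -p * np.log(np.where(p > 0, p, 1.0))
    return float(np.sum(integrand) * dx)

# Defect 1: h can be negative.  Uniform on [0, 1/10] has p(x) = 10,
# so h = -log 10 ~ -2.3026 nats.
grid_x = np.linspace(0.0, 0.1, 100_000, endpoint=False)
print(differential_entropy(lambda x: np.full_like(x, 10.0), grid_x))

# Defect 2: h is not invariant under a change of variables.  The rescaled
# variable y = 2x is uniform on [0, 1/5] with p(y) = 5, and h picks up
# an extra log 2 ~ 0.6931 nats: the result is -log 10 + log 2 ~ -1.6094.
grid_y = np.linspace(0.0, 0.2, 100_000, endpoint=False)
print(differential_entropy(lambda y: np.full_like(y, 5.0), grid_y))
```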
Jaynes argued that the formula for the continuous entropy should be derived by taking the limit of increasingly dense discrete distributions.
Suppose that we have a set of N discrete points {x_i}, such that in the limit N → ∞ their density approaches a function m(x) called the "invariant measure":

:\lim_{N \to \infty} \frac{1}{N} \, (\text{number of points in } a < x < b) = \int_a^b m(x) \, dx
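As a hedged illustration of this limit (not from the original article; the particular measure m(x) = 2x on [0, 1] and the quantile placement of the points are assumptions made for the example), the Python sketch below places N points so that their density follows m and checks that the fraction falling in (a, b) matches the integral of m over (a, b).

```python
import numpy as np

# Illustrative invariant measure: m(x) = 2x on [0, 1], with CDF M(x) = x^2.
# Placing the points at the quantiles x_i = M^{-1}((i - 1/2)/N), here
# sqrt((i - 1/2)/N), makes their density approach m(x) as N grows.
def quantile_points(n):
    return np.sqrt((np.arange(n) + 0.5) / n)

a, b = 0.3, 0.7
exact = b**2 - a**2   # integral of m(x) = 2x from a to b, i.e. 0.4

for n in (10, 1_000, 100_000):
    x = quantile_points(n)
    fraction = np.mean((a < x) & (x < b))   # (number of points in a < x < b) / N
    print(f"N = {n:>6}: fraction = {fraction:.5f}, integral = {exact:.5f}")
```

Because the quantile placement is deterministic, the fraction already equals 0.4 at every N shown; with randomly sampled points the agreement would instead hold only in the large-N limit, which is the sense of the limit in the formula above.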