In information theory, the Rényi entropy is a quantity that generalizes various notions of entropy, including Hartley entropy, Shannon entropy, collision entropy, and min-entropy. The Rényi entropy is named after Alfréd Rényi, who looked for the most general way to quantify information while preserving additivity for independent events.
In the context of fractal dimension estimation, the Rényi entropy forms the basis of the concept of generalized dimensions.

The Rényi entropy is important in ecology and statistics as an index of diversity. It is also important in quantum information, where it can be used as a measure of entanglement. In the Heisenberg XY spin chain model, the Rényi entropy as a function of α can be calculated explicitly because it is an automorphic function with respect to a particular subgroup of the modular group. In theoretical computer science, the min-entropy is used in the context of randomness extractors.
Definition
The Rényi entropy of order α, where 0 < α < ∞ and α ≠ 1, is defined as

H_\alpha(X) = \frac{1}{1-\alpha} \log \left( \sum_{i=1}^{n} p_i^{\alpha} \right).

It is further defined at α = 0, 1, and ∞ as

H_\alpha(X) = \lim_{\gamma \to \alpha} H_\gamma(X).

Here, X is a discrete random variable with possible outcomes in the set {x_1, x_2, ..., x_n} and corresponding probabilities p_i = Pr(X = x_i) for i = 1, ..., n. The resulting unit of information is determined by the base of the logarithm, e.g. shannon for base 2, or nat for base ''e''.
If the probabilities are p_i = 1/n for all i = 1, ..., n, then all the Rényi entropies of the distribution are equal: H_α(X) = log n.
In general, for all discrete random variables X, H_α(X) is a non-increasing function in α.
Applications often exploit the following relation between the Rényi entropy and the α-norm of the vector of probabilities:

H_\alpha(X) = \frac{\alpha}{1-\alpha} \log \left( \lVert P \rVert_\alpha \right).

Here, the discrete probability distribution P = (p_1, ..., p_n) is interpreted as a vector in ℝ^n with p_i ≥ 0 and \sum_{i=1}^{n} p_i = 1.
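These formulas translate directly into code. The following Python sketch (the function name and structure are illustrative, not from any standard library) computes H_α in bits, handling the limiting orders α = 0, 1 and ∞ separately, and spot-checks the α-norm relation above:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha, in bits (log base 2).

    p     -- sequence of probabilities summing to 1
    alpha -- order, alpha >= 0 (float("inf") gives the min-entropy)
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # zero-probability outcomes do not contribute
    if alpha == 0:                     # Hartley (max-)entropy: log of the support size
        return np.log2(len(p))
    if alpha == 1:                     # Shannon entropy, the alpha -> 1 limit
        return -np.sum(p * np.log2(p))
    if np.isinf(alpha):                # min-entropy, the alpha -> infinity limit
        return -np.log2(np.max(p))
    return np.log2(np.sum(p ** alpha)) / (1 - alpha)

# Uniform distribution: every order gives log2(n), as noted above.
p = [0.25] * 4
print([renyi_entropy(p, a) for a in (0, 0.5, 1, 2, float("inf"))])  # all 2.0

# The alpha-norm relation: H_alpha = alpha/(1-alpha) * log2(||P||_alpha).
alpha = 3.0
norm = np.sum(np.asarray(p) ** alpha) ** (1 / alpha)
assert np.isclose(renyi_entropy(p, alpha), alpha / (1 - alpha) * np.log2(norm))
```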
The Rényi entropy for any α ≥ 0 is Schur concave, as can be proven via the Schur–Ostrowski criterion.
Special cases
As α approaches zero, the Rényi entropy increasingly weighs all events with nonzero probability more equally, regardless of their probabilities. In the limit for α → 0, the Rényi entropy is just the logarithm of the size of the support of X. The limit for α → 1 is the Shannon entropy. As α approaches infinity, the Rényi entropy is increasingly determined by the events of highest probability.
Hartley or max-entropy
The Hartley or max-entropy is

H_0(X) = \log \left| \{ i : p_i > 0 \} \right|,

where the argument of the logarithm is the number of non-zero probabilities. If the probabilities are all nonzero, it is simply the logarithm of the cardinality of the alphabet \mathcal{X} of X, sometimes called the Hartley entropy of X:

H_0(X) = \log |\mathcal{X}|.
Shannon entropy
The limiting value of H_α as α → 1 is the Shannon entropy:

H_1(X) = \lim_{\alpha \to 1} H_\alpha(X) = -\sum_{i=1}^{n} p_i \log p_i.
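Numerically, orders close to 1 approach the Shannon value, which can be checked with the renyi_entropy sketch from the Definition section:

```python
p = [0.5, 0.25, 0.125, 0.125]
print(renyi_entropy(p, 1))         # Shannon entropy: 1.75 bits
for a in (1.1, 1.01, 1.001):
    print(a, renyi_entropy(p, a))  # tends to 1.75 as a -> 1
```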
Collision entropy
Collision entropy, sometimes just called "Rényi entropy", refers to the case α = 2,

H_2(X) = -\log \sum_{i=1}^{n} p_i^2 = -\log \Pr(X = Y),

where X and Y are independent and identically distributed. The collision entropy is related to the index of coincidence. It is the negative logarithm of the Simpson diversity index.
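Because the sum Σ p_i² is exactly the probability that two independent draws agree, H_2 can be estimated from samples via an empirical index of coincidence. A small sketch (the sample size is arbitrary; renyi_entropy is the earlier sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.5, 0.3, 0.2])
x = rng.choice(len(p), size=200_000, p=p)  # first batch of i.i.d. draws
y = rng.choice(len(p), size=200_000, p=p)  # independent second batch
coincidence = np.mean(x == y)              # estimates sum(p**2) = 0.38
print(-np.log2(coincidence))               # empirical collision entropy
print(renyi_entropy(p, 2))                 # exact: -log2(0.38), about 1.396 bits
```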
Min-entropy
In the limit as α → ∞, the Rényi entropy H_α converges to the min-entropy H_∞:

H_\infty(X) = \min_i \left( -\log p_i \right) = -\log \max_i p_i.

Equivalently, the min-entropy H_∞(X) is the largest real number b such that all events occur with probability at most 2^{-b}.
The name ''min-entropy'' stems from the fact that it is the smallest entropy measure in the family of Rényi entropies.
In this sense, it is the strongest way to measure the information content of a discrete random variable.
In particular, the min-entropy is never larger than the Shannon entropy.
The min-entropy has important applications for randomness extractors in theoretical computer science: extractors are able to extract randomness from random sources that have a large min-entropy; merely having a large Shannon entropy does not suffice for this task.
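The gap matters in practice: a source can look rich by the Shannon measure while one outcome remains easy to guess. A hypothetical weak source illustrates this (the numbers are made up for illustration; renyi_entropy as sketched earlier):

```python
import numpy as np

n = 1024
p = np.full(n, 0.5 / (n - 1))     # half the mass spread over 1023 outcomes...
p[0] = 0.5                        # ...and half concentrated on a single outcome
print(renyi_entropy(p, 1))        # Shannon entropy: about 6 bits
print(renyi_entropy(p, np.inf))   # min-entropy: exactly 1 bit (guessable half the time)
```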
Inequalities for different orders ''α''
That H_α is non-increasing in α for any given distribution of probabilities p_1, ..., p_n can be proven by differentiation, as

\frac{d H_\alpha}{d \alpha} = -\frac{1}{(1-\alpha)^2} \sum_{i=1}^{n} z_i \log \frac{z_i}{p_i},

which is proportional to the Kullback–Leibler divergence (which is always non-negative), where z_i = p_i^\alpha / \sum_{j=1}^{n} p_j^\alpha. In particular, the divergence is strictly positive except when the distribution is uniform.
At the α → 1 limit, we have

\left. \frac{d H_\alpha}{d \alpha} \right|_{\alpha \to 1} = -\frac{1}{2} \sum_{i=1}^{n} p_i \left( \log p_i + H_1(X) \right)^2.
In particular cases the inequalities can also be proven by Jensen's inequality:

\log n = H_0(X) \ge H_1(X) \ge H_2(X) \ge H_\infty(X).

For values of α > 1, inequalities in the other direction also hold. In particular, we have

H_2(X) \le 2 H_\infty(X).
On the other hand, the Shannon entropy H_1 can be arbitrarily high for a random variable X that has a given min-entropy. An example of this is given by the sequence of random variables X_n for n ≥ 1 with Pr(X_n = 0) = 1/2 and Pr(X_n = i) = 1/(2n) for i = 1, ..., n, since H_∞(X_n) = \log 2 but H_1(X_n) = \tfrac{1}{2} \left( \log 2 + \log 2n \right), which grows without bound.
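Both the chain of inequalities and the unbounded gap in this example are easy to spot-check numerically (a sketch reusing renyi_entropy; the test distribution is arbitrary):

```python
import numpy as np

# Ordering log2(n) = H_0 >= H_1 >= H_2 >= H_inf, and H_2 <= 2 * H_inf.
p = [0.6, 0.25, 0.1, 0.05]
hs = [renyi_entropy(p, a) for a in (0, 1, 2, np.inf)]
assert all(a >= b for a, b in zip(hs, hs[1:]))
assert hs[2] <= 2 * hs[3]

# The X_n family: min-entropy stays at 1 bit while Shannon entropy grows.
for n in (4, 64, 1024):
    p_n = np.concatenate(([0.5], np.full(n, 0.5 / n)))
    print(n, renyi_entropy(p_n, np.inf), renyi_entropy(p_n, 1))
```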
Rényi divergence
As well as the absolute Rényi entropies, Rényi also defined a spectrum of divergence measures generalising the Kullback–Leibler divergence.
The Rényi divergence of order α, or alpha-divergence, of a distribution P from a distribution Q is defined to be

D_\alpha(P \parallel Q) = \frac{1}{\alpha - 1} \log \left( \sum_{i=1}^{n} \frac{p_i^{\alpha}}{q_i^{\alpha - 1}} \right)

when 0 < α < ∞ and α ≠ 1. We can define the Rényi divergence for the special values α = 0, 1, ∞ by taking a limit; in particular, the limit α → 1 gives the Kullback–Leibler divergence.
Some special cases (each implemented in the sketch that follows the list):
* α = 0: D_0(P \parallel Q) = -\log Q(\{ i : p_i > 0 \}), minus the log probability under Q that p_i > 0;
* α = 1/2: D_{1/2}(P \parallel Q) = -2 \log \sum_{i=1}^{n} \sqrt{p_i q_i}, minus twice the logarithm of the Bhattacharyya coefficient;
* α = 1: D_1(P \parallel Q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}, the Kullback–Leibler divergence;
* α = 2: D_2(P \parallel Q) = \log \sum_{i=1}^{n} \frac{p_i^2}{q_i}, the log of the expected (under P) ratio of the probabilities;
* α = ∞: D_\infty(P \parallel Q) = \log \max_i \frac{p_i}{q_i}, the log of the maximum ratio of the probabilities.
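The divergence and its limiting orders can be implemented along the same lines as the entropy sketch above (the function name is illustrative; base-2 logarithms, and Q is assumed to dominate P, i.e. q_i > 0 wherever p_i > 0):

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) in bits; assumes q_i > 0 wherever p_i > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    support = p > 0
    p, q = p[support], q[support]
    if alpha == 0:                        # -log of Q's mass on P's support
        return -np.log2(np.sum(q))
    if alpha == 1:                        # Kullback-Leibler divergence (alpha -> 1 limit)
        return np.sum(p * np.log2(p / q))
    if np.isinf(alpha):                   # log of the maximum probability ratio
        return np.log2(np.max(p / q))
    return np.log2(np.sum(p ** alpha / q ** (alpha - 1))) / (alpha - 1)

p, q = [0.5, 0.5, 0.0], [0.2, 0.3, 0.5]
for a in (0, 0.5, 1, 2, float("inf")):
    print(a, renyi_divergence(p, q, a))   # nonnegative and nondecreasing in alpha
```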
The Rényi divergence is indeed a divergence, meaning simply that D_α(P ∥ Q) is greater than or equal to zero, and zero only when P = Q. For any fixed distributions P and Q, the Rényi divergence is nondecreasing as a function of its order α, and it is continuous on the set of α for which it is finite. The quantity D_α(P ∥ Q) may be called, for the sake of brevity, the information of order α obtained if the distribution P is replaced by the distribution Q.
Financial interpretation
A pair of probability distributions can be viewed as a game of chance in which one of the distributions defines the official odds and the other contains the actual probabilities. Knowledge of the actual probabilities allows a player to profit from the game. The expected profit rate is connected to the Rényi divergence as follows:

\mathrm{ExpectedRate} = \frac{1}{R}\, D_1(b \parallel m) + \frac{R - 1}{R}\, D_{1/R}(b \parallel m),

where m is the distribution defining the official odds (i.e. the "market") for the game, b is the investor-believed distribution and R is the investor's risk aversion (the Arrow–Pratt relative risk aversion).

If the true distribution is p (not necessarily coinciding with the investor's belief b), the long-term realized rate converges to the true expectation, which has a similar mathematical structure:

\mathrm{RealizedRate} = \frac{1}{R} \left( D_1(p \parallel m) - D_1(p \parallel b) \right) + \frac{R - 1}{R} \left( D_{1/R}(p \parallel m) - D_{1/R}(p \parallel b) \right).
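Under the formula above, the expected rate is a weighted mix of two divergences and is straightforward to evaluate numerically. A minimal sketch, assuming the renyi_divergence function from the previous section and made-up beliefs, odds, and risk aversion:

```python
def expected_rate(b, m, R):
    """Expected profit rate for belief b against market odds m, risk aversion R."""
    return (renyi_divergence(b, m, 1) / R
            + (R - 1) / R * renyi_divergence(b, m, 1 / R))

b = [0.6, 0.4]                   # investor's believed probabilities
m = [0.5, 0.5]                   # probabilities implied by the official odds
print(expected_rate(b, m, 1.0))  # R = 1: reduces to the KL divergence D_1(b||m)
print(expected_rate(b, m, 2.0))  # a more risk-averse investor expects a lower rate
```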
Properties specific to ''α'' = 1
The value α = 1, which gives the Shannon entropy and the Kullback–Leibler divergence, is the only value at which the chain rule of conditional probability holds exactly:

H(A, X) = H(A) + \mathbb{E}_{a \sim A} \big[ H(X \mid A = a) \big].
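A numerical spot-check (the joint distribution is arbitrary illustration data; renyi_entropy is the earlier sketch) confirms the rule at α = 1 and its failure at other orders:

```python
import numpy as np

joint = np.array([[0.3, 0.2],             # Pr(A = a, X = x); rows index a
                  [0.1, 0.4]])
pA = joint.sum(axis=1)                    # marginal distribution of A

def chain_gap(alpha):
    lhs = renyi_entropy(joint.ravel(), alpha)            # H_alpha(A, X)
    rhs = renyi_entropy(pA, alpha) + sum(
        pA[a] * renyi_entropy(joint[a] / pA[a], alpha)   # E_a[ H_alpha(X | A = a) ]
        for a in range(len(pA)))
    return lhs - rhs

print(chain_gap(1))   # ~0: the chain rule holds for Shannon entropy
print(chain_gap(2))   # nonzero: fails for alpha != 1
```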