In information theory, the Rényi entropy is a quantity that generalizes various notions of entropy, including Hartley entropy, Shannon entropy, collision entropy, and min-entropy. The Rényi entropy is named after Alfréd Rényi, who looked for the most general way to quantify information while preserving additivity for independent events. In the context of fractal dimension estimation, the Rényi entropy forms the basis of the concept of generalized dimensions.

The Rényi entropy is important in ecology and statistics as an index of diversity. The Rényi entropy is also important in quantum information, where it can be used as a measure of entanglement. In the Heisenberg XY spin chain model, the Rényi entropy as a function of ''α'' can be calculated explicitly because it is an automorphic function with respect to a particular subgroup of the modular group. In theoretical computer science, the min-entropy is used in the context of randomness extractors.
Definition
The Rényi entropy of order ''α'', where 0 < ''α'' < ∞ and ''α'' ≠ 1, is defined as
: H_\alpha(X) = \frac{1}{1-\alpha} \log \left( \sum_{i=1}^n p_i^\alpha \right) .
Here, ''X'' is a discrete random variable with possible outcomes in the set ''A'' = \{x_1, x_2, \ldots, x_n\} and corresponding probabilities p_i = \Pr(X = x_i) for i = 1, \ldots, n. The logarithm is conventionally taken to be base 2, especially in the context of information theory where bits are used.
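The definition translates directly into a short computation. The sketch below is illustrative rather than part of the article; it assumes the NumPy library, uses base-2 logarithms as described above, and treats the orders 0, 1 and ∞ as the limiting cases discussed in the sections that follow.

<syntaxhighlight lang="python">
# Minimal sketch: Rényi entropy of order alpha for a discrete distribution.
import numpy as np

def renyi_entropy(p, alpha):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                          # outcomes with zero probability do not contribute
    if alpha == 0:                        # Hartley entropy: log of the support size
        return np.log2(len(p))
    if alpha == 1:                        # Shannon entropy (limit alpha -> 1)
        return -np.sum(p * np.log2(p))
    if np.isinf(alpha):                   # min-entropy (limit alpha -> infinity)
        return -np.log2(p.max())
    return np.log2(np.sum(p ** alpha)) / (1 - alpha)

# Example with a made-up distribution: the values decrease as alpha grows.
p = [0.5, 0.25, 0.125, 0.125]
print([round(renyi_entropy(p, a), 4) for a in (0, 0.5, 1, 2, float("inf"))])
</syntaxhighlight>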
If the probabilities are p_i = 1/n for all i = 1, \ldots, n, then all the Rényi entropies of the distribution are equal:
: H_\alpha(X) = \log n .
In general, for all discrete random variables ''X'', H_\alpha(X) is a non-increasing function in ''α''.
Applications often exploit the following relation between the Rényi entropy and the ''p''-norm of the vector of probabilities:
: H_\alpha(X) = \frac{\alpha}{1-\alpha} \log \left( \|P\|_\alpha \right) .
Here, the discrete probability distribution P = (p_1, \ldots, p_n) is interpreted as a vector in \R^n with p_i \ge 0 and \sum_{i=1}^n p_i = 1.
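This identity can be checked numerically. The snippet below is a quick illustration with made-up numbers; it assumes NumPy and reuses the renyi_entropy sketch from above.

<syntaxhighlight lang="python">
import numpy as np

P = np.array([0.5, 0.25, 0.125, 0.125])
alpha = 3.0
lhs = renyi_entropy(P, alpha)                                   # Rényi entropy of order alpha
rhs = alpha / (1 - alpha) * np.log2(np.linalg.norm(P, alpha))   # alpha/(1-alpha) * log2 of the alpha-norm
print(np.isclose(lhs, rhs))                                     # True
</syntaxhighlight>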
The Rényi entropy for any ''α'' ≥ 0 is Schur concave.
Special cases
As ''α'' approaches zero, the Rényi entropy increasingly weighs all events with nonzero probability more equally, regardless of their probabilities. In the limit for ''α'' → 0, the Rényi entropy is just the logarithm of the size of the support of ''X''. The limit for ''α'' → 1 is the Shannon entropy. As ''α'' approaches infinity, the Rényi entropy is increasingly determined by the events of highest probability.
Hartley or max-entropy
Provided the probabilities are nonzero, H_0 is the logarithm of the cardinality of the alphabet (''A'') of ''X'', sometimes called the Hartley entropy of ''X'',
: H_0(X) = \log n = \log |A| .
Shannon entropy
The limiting value of H_\alpha as ''α'' → 1 is the Shannon entropy:
: H_1(X) = -\sum_{i=1}^n p_i \log p_i .
Collision entropy
Collision entropy, sometimes just called "Rényi entropy", refers to the case ''α'' = 2,
: H_2(X) = -\log \sum_{i=1}^n p_i^2 = -\log P(X = Y) ,
where ''X'' and ''Y'' are independent and identically distributed. The collision entropy is related to the index of coincidence.
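The identity H_2(X) = −log P(X = Y) can be illustrated with a small simulation. The sketch below is not from the article; it assumes NumPy and an arbitrary three-outcome distribution.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.5, 0.3, 0.2])

h2 = -np.log2(np.sum(p ** 2))                 # collision entropy H_2
x = rng.choice(len(p), size=200_000, p=p)     # samples of X
y = rng.choice(len(p), size=200_000, p=p)     # independent samples of Y with the same distribution
print(h2, -np.log2(np.mean(x == y)))          # the two values should be close
</syntaxhighlight>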
Min-entropy
In the limit as ''α'' → ∞, the Rényi entropy H_\alpha converges to the min-entropy H_\infty:
: H_\infty(X) = \min_i \left( -\log p_i \right) = -\log \max_i p_i .
Equivalently, the min-entropy H_\infty(X) is the largest real number ''b'' such that all events occur with probability at most 2^{-b}.
The name ''min-entropy'' stems from the fact that it is the smallest entropy measure in the family of Rényi entropies.
In this sense, it is the strongest way to measure the information content of a discrete random variable.
In particular, the min-entropy is never larger than the Shannon entropy.
The min-entropy has important applications for randomness extractors in theoretical computer science: extractors are able to extract randomness from random sources that have a large min-entropy; merely having a large Shannon entropy does not suffice for this task.
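As a small illustration of the definition above (made-up numbers, assuming NumPy), the min-entropy depends only on the most probable outcome, and it is the largest ''b'' for which every probability is at most 2^{-b}:

<syntaxhighlight lang="python">
import numpy as np

p = np.array([0.4, 0.3, 0.2, 0.1])
h_min = -np.log2(p.max())                     # min-entropy: -log2 of the most probable outcome
print(h_min)                                  # about 1.3219 bits
print(np.all(p <= 2.0 ** (-h_min) + 1e-12))   # True: no event exceeds probability 2**(-h_min)
</syntaxhighlight>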
Inequalities for different orders ''α''
The fact that H_\alpha is non-increasing in ''α'' for any given distribution of probabilities p_1, \ldots, p_n can be proven by differentiation, since
: \frac{\mathrm{d} H_\alpha}{\mathrm{d}\alpha} = -\frac{1}{(1-\alpha)^2} \sum_{i=1}^n z_i \log\!\left(\frac{z_i}{p_i}\right) ,
which is proportional to a Kullback–Leibler divergence (which is always non-negative), where z_i = p_i^\alpha / \sum_{j=1}^n p_j^\alpha .
In particular cases inequalities can also be proven by Jensen's inequality:
: \log n = H_0 \ge H_1 \ge H_2 \ge H_\infty .
For values of ''α'' > 1, inequalities in the other direction also hold. In particular, we have
: H_2 \le 2 H_\infty .
On the other hand, the Shannon entropy H_1 can be arbitrarily high for a random variable ''X'' that has a given min-entropy. An example of this is given by the sequence of random variables X_n \sim \{0, 1, \ldots, n\} for n \ge 1 such that P(X_n = 0) = 1/2 and P(X_n = i) = 1/(2n) for i = 1, \ldots, n, since H_\infty(X_n) = 1 bit but H_1(X_n) = \tfrac{1}{2}\left(1 + \log_2 (2n)\right) bits grows without bound with ''n''.
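Both facts can be seen numerically. The following sketch is illustrative only; it reuses the renyi_entropy function from the Definition section, assumes NumPy, and uses the family X_n described above.

<syntaxhighlight lang="python">
import numpy as np

# Ordering log n = H_0 >= H_1 >= H_2 >= H_inf for one arbitrary distribution.
p = np.array([0.5, 0.25, 0.15, 0.1])
values = [renyi_entropy(p, a) for a in (0, 1, 2, float("inf"))]
print(all(values[i] >= values[i + 1] - 1e-12 for i in range(3)))    # True

# X_n: the min-entropy stays at 1 bit while the Shannon entropy grows with n.
for n in (10, 1000, 100_000):
    x_n = np.concatenate(([0.5], np.full(n, 0.5 / n)))
    print(renyi_entropy(x_n, float("inf")), renyi_entropy(x_n, 1))
</syntaxhighlight>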
Rényi divergence
As well as the absolute Rényi entropies, Rényi also defined a spectrum of divergence measures generalising the Kullback–Leibler divergence.
The Rényi divergence of order ''α'' or alpha-divergence of a distribution ''P'' from a distribution ''Q'' is defined to be
: D_\alpha(P \parallel Q) = \frac{1}{\alpha - 1} \log\!\left( \sum_{i=1}^n \frac{p_i^\alpha}{q_i^{\alpha-1}} \right)
when 0 < ''α'' < ∞ and ''α'' ≠ 1. We can define the Rényi divergence for the special values ''α'' = 0, 1, ∞ by taking a limit, and in particular the limit ''α'' → 1 gives the Kullback–Leibler divergence.
Some special cases:
: D_0(P \parallel Q) = -\log Q(\{i : p_i > 0\})
: minus the log probability under ''Q'' that p_i > 0;
: D_{1/2}(P \parallel Q) = -2 \log \sum_i \sqrt{p_i q_i}
: minus twice the logarithm of the Bhattacharyya coefficient;
: D_1(P \parallel Q) = \sum_i p_i \log \frac{p_i}{q_i}
: the Kullback–Leibler divergence;
: D_2(P \parallel Q) = \log \left\langle \frac{p_i}{q_i} \right\rangle
: the log of the expected ratio of the probabilities;
: D_\infty(P \parallel Q) = \log \sup_i \frac{p_i}{q_i}
: the log of the maximum ratio of the probabilities.
The Rényi divergence is indeed a divergence, meaning simply that D_\alpha(P \parallel Q) is greater than or equal to zero, and zero only when ''P'' = ''Q''. For any fixed distributions ''P'' and ''Q'', the Rényi divergence is nondecreasing as a function of its order ''α'', and it is continuous on the set of ''α'' for which it is finite. It is sometimes described, for the sake of brevity, as the information of order ''α'' obtained if one distribution is replaced by the other.
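A minimal computational sketch of the divergence (not the article's code; it assumes NumPy, uses base-2 logarithms, and handles the orders 1 and ∞ as limits) is shown below. For two made-up distributions the printed values are non-decreasing in ''α'', in line with the property just stated.

<syntaxhighlight lang="python">
import numpy as np

def renyi_divergence(p, q, alpha):
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    p, q = p[p > 0], q[p > 0]                    # terms with p_i = 0 contribute nothing
    if alpha == 1:                               # Kullback-Leibler divergence (limit alpha -> 1)
        return np.sum(p * np.log2(p / q))
    if np.isinf(alpha):                          # log of the maximum probability ratio
        return np.log2(np.max(p / q))
    return np.log2(np.sum(p ** alpha / q ** (alpha - 1))) / (alpha - 1)

P = [0.6, 0.3, 0.1]
Q = [0.4, 0.4, 0.2]
print([round(renyi_divergence(P, Q, a), 4) for a in (0.5, 1, 2, float("inf"))])
</syntaxhighlight>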
Financial interpretation
A pair of probability distributions can be viewed as a game of chance in which one of the distributions defines official odds and the other contains the actual probabilities. Knowledge of the actual probabilities allows a player to profit from the game. The expected profit rate is connected to the Rényi divergence as follows
:
where ''m'' is the distribution defining the official odds (i.e. the "market") for the game, ''b'' is the investor-believed distribution and ''R'' is the investor's risk aversion (the Arrow–Pratt relative risk aversion).
If the true distribution is ''p'' (not necessarily coinciding with the investor's belief ''b''), the long-term realized rate converges to the true expectation, which has a similar mathematical structure
:
Why ''α'' = 1 is special
The value ''α'' = 1, which gives the Shannon entropy and the Kullback–Leibler divergence, is special because it is only at ''α'' = 1 that the chain rule of conditional probability holds exactly:
: H(A, X) = H(A) + \mathbb{E}_{a \sim A} \big[ H(X \mid A = a) \big] .
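A numerical illustration of this point (an assumed joint distribution, NumPy, and the renyi_entropy sketch from the Definition section) shows the chain rule holding at ''α'' = 1 but generally failing at other orders.

<syntaxhighlight lang="python">
import numpy as np

joint = np.array([[0.3, 0.1],     # hypothetical joint probabilities P(A = a, X = x)
                  [0.1, 0.5]])
p_a = joint.sum(axis=1)           # marginal distribution of A

def chain_rule_gap(alpha):
    lhs = renyi_entropy(joint.ravel(), alpha)                       # H_alpha(A, X)
    cond = sum(p_a[a] * renyi_entropy(joint[a] / p_a[a], alpha)     # E_{a~A}[ H_alpha(X | A = a) ]
               for a in range(len(p_a)))
    return lhs - (renyi_entropy(p_a, alpha) + cond)

print(round(chain_rule_gap(1.0), 10))   # approximately 0: exact for the Shannon entropy
print(round(chain_rule_gap(2.0), 4))    # nonzero: the chain rule fails for alpha = 2
</syntaxhighlight>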