In mathematics, an information source is a sequence of random variables ranging over a finite alphabet Γ, having a stationary distribution.
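Under the usual reading of this condition, the source is a stationary process: its joint distributions are invariant under shifts in time,

:<math>\Pr(X_{t_1} = a_1, \dots, X_{t_k} = a_k) = \Pr(X_{t_1+\tau} = a_1, \dots, X_{t_k+\tau} = a_k)</math>

for every <math>k</math>, all indices <math>t_1, \dots, t_k</math>, every shift <math>\tau</math>, and all symbols <math>a_1, \dots, a_k \in \Gamma</math>.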
The uncertainty, or entropy rate, of an information source is defined as

:<math>H\{\mathbf{X}\} = \lim_{n\to\infty} H(X_n \mid X_0, X_1, \dots, X_{n-1})</math>

where

:<math>X_0, X_1, \dots, X_n, \dots</math>

is the sequence of random variables defining the information source, and

:<math>H(X_n \mid X_0, X_1, \dots, X_{n-1})</math>

is the conditional information entropy of the sequence of random variables. Equivalently, one has

:<math>H\{\mathbf{X}\} = \lim_{n\to\infty} \frac{H(X_0, X_1, \dots, X_{n-1})}{n}.</math>
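The two limits can be checked numerically for a simple source. The sketch below assumes a hypothetical two-state stationary Markov source over the binary alphabet (the transition matrix and stationary distribution are chosen purely for illustration); it computes block entropies by enumerating all length-<math>n</math> words and compares <math>H(X_n \mid X_0, \dots, X_{n-1})</math> and <math>H(X_0, \dots, X_{n-1})/n</math> with the closed-form entropy rate of a stationary Markov chain.

<syntaxhighlight lang="python">
import itertools
import math

# Hypothetical two-state Markov source, used purely for illustration.
P = [[0.9, 0.1],          # P[i][j] = Pr(X_{t+1} = j | X_t = i)
     [0.4, 0.6]]
pi = [0.8, 0.2]           # its stationary distribution (pi P = pi)


def block_entropy(n):
    """Return H(X_0, ..., X_{n-1}) in bits by summing over all length-n words."""
    h = 0.0
    for word in itertools.product(range(2), repeat=n):
        p = pi[word[0]]
        for a, b in zip(word, word[1:]):
            p *= P[a][b]
        if p > 0.0:
            h -= p * math.log2(p)
    return h


# Compare the two limits from the definition for growing n.
for n in range(1, 12):
    conditional = block_entropy(n + 1) - block_entropy(n)   # H(X_n | X_0, ..., X_{n-1})
    per_symbol = block_entropy(n) / n                        # H(X_0, ..., X_{n-1}) / n
    print(f"n={n:2d}  H(X_n | past) = {conditional:.6f}  H(block)/n = {per_symbol:.6f}")

# Closed-form entropy rate of a stationary Markov chain, for comparison.
rate = -sum(pi[i] * P[i][j] * math.log2(P[i][j])
            for i in range(2) for j in range(2) if P[i][j] > 0.0)
print(f"entropy rate = {rate:.6f} bits per symbol")
</syntaxhighlight>

Enumerating all <math>2^n</math> words keeps the computation exact but grows exponentially, so the sketch is limited to small <math>n</math>; for a Markov source the conditional entropy already equals the entropy rate at <math>n = 1</math>, while the per-symbol block entropy approaches it from above.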
==See also==
* Markov information source
* Asymptotic equipartition property
==References==
* Robert B. Ash, ''Information Theory'', (1965) Dover Publications.
[[Category:Information theory]]
[[Category:Stochastic processes]]

{{statistics-stub}}