In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable $Y$ given that the value of another random variable $X$ is known. Here, information is measured in shannons, nats, or hartleys. The ''entropy of $Y$ conditioned on $X$'' is written as $\mathrm{H}(Y\mid X)$.
Definition
The conditional entropy of $Y$ given $X$ is defined as
$$\mathrm{H}(Y\mid X) \;=\; -\sum_{x\in\mathcal{X},\, y\in\mathcal{Y}} p(x,y)\,\log p(y\mid x),$$
where $\mathcal{X}$ and $\mathcal{Y}$ denote the support sets of $X$ and $Y$.
''Note:'' Here, the convention is that the expression $0 \log 0$ should be treated as being equal to zero. This is because $\lim_{\theta\to 0^{+}} \theta\,\log \theta = 0$.
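As a concrete, non-canonical sketch of this definition, the following Python snippet computes $\mathrm{H}(Y\mid X)$ from a joint distribution supplied as a dictionary; the function name, the example distribution, and the choice of base-2 logarithm (shannons) are illustrative assumptions, not part of the article.

```python
import math

def conditional_entropy(joint, base=2.0):
    """Compute H(Y|X) = -sum_{x,y} p(x,y) * log p(y|x) from a joint
    distribution given as {(x, y): p(x, y)}.  Terms with p(x, y) == 0
    are skipped, which implements the convention 0 * log 0 = 0."""
    # Marginal p(x), obtained by summing the joint over y.
    p_x = {}
    for (x, _y), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p

    h = 0.0
    for (x, _y), p_xy in joint.items():
        if p_xy > 0.0:
            # p(y|x) = p(x, y) / p(x)
            h -= p_xy * math.log(p_xy / p_x[x], base)
    return h

# Illustrative example: X is a fair bit, and Y agrees with X with probability 3/4.
joint = {(0, 0): 0.375, (0, 1): 0.125, (1, 0): 0.125, (1, 1): 0.375}
print(conditional_entropy(joint))  # ~0.811 shannons (bits)
```

Skipping the zero-probability terms in the loop is exactly the $0 \log 0 = 0$ convention noted above.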
Intuitively, notice that by definition of expected value and of conditional probability, $\mathrm{H}(Y\mid X)$ can be written as $\mathrm{H}(Y\mid X) = \mathbb{E}[f(X,Y)]$, where $f$ is defined as $f(x,y) := -\log p(y\mid x)$.
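Expanding that expectation over the joint distribution (a standard rearrangement, sketched here for completeness) recovers the definition above as a $p(x)$-weighted average of the entropies of the conditional distributions:
$$\mathrm{H}(Y\mid X) \;=\; \mathbb{E}\!\left[-\log p(Y\mid X)\right] \;=\; \sum_{x\in\mathcal{X}} p(x)\left(-\sum_{y\in\mathcal{Y}} p(y\mid x)\,\log p(y\mid x)\right) \;=\; \sum_{x\in\mathcal{X}} p(x)\,\mathrm{H}(Y\mid X{=}x).$$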