In information theory, the cross-entropy between two probability distributions p and q over the same underlying set of events measures the average number of bits needed to identify an event drawn from the set if a coding scheme used for the set is optimized for an estimated probability distribution q, rather than the true distribution p.
Definition
The cross-entropy of the distribution q relative to a distribution p over a given set is defined as follows:

H(p, q) = −E_p[log q],

where E_p[·] denotes the expected value with respect to the distribution p. For discrete probability distributions p and q with the same support X, this is

H(p, q) = −∑_{x ∈ X} p(x) log q(x).
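As a minimal sketch, the discrete cross-entropy can be computed directly from two probability vectors over the same outcomes; the function name `cross_entropy` and the example distributions here are illustrative, not from the original article:

```python
import math

def cross_entropy(p, q):
    """Cross-entropy H(p, q) in bits for discrete distributions,
    given as sequences of probabilities over the same outcomes.
    Terms with p(x) = 0 contribute nothing and are skipped."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]    # true distribution (a fair coin)
q = [0.25, 0.75]  # estimated distribution the code is optimized for

# Coding for the wrong distribution q costs more bits on average
# than coding for p itself: H(p, q) >= H(p, p).
print(cross_entropy(p, q))  # about 1.2075 bits
print(cross_entropy(p, p))  # exactly 1.0 bit
```

Using base-2 logarithms gives the result in bits; natural logarithms would give nats instead.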