In probability theory, the Chernoff bound gives exponentially decreasing bounds on tail distributions of sums of independent random variables. Despite being named after Herman Chernoff, the author of the paper it first appeared in, the result is due to Herman Rubin. It is a sharper bound than the first- or second-moment-based tail bounds such as Markov's inequality or Chebyshev's inequality, which only yield power-law bounds on tail decay. However, the Chernoff bound requires the variates to be independent, a condition that is not required by either Markov's inequality or Chebyshev's inequality (although Chebyshev's inequality does require the variates to be pairwise independent). The Chernoff bound is related to the Bernstein inequalities, which were developed earlier, and to Hoeffding's inequality.
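As a rough numerical illustration of the difference in decay rates (not from the article; the Bernoulli(1/2) summands, the threshold $3n/4$, and the helper functions below are choices made only for this sketch), the following Python snippet evaluates the three bounds side by side: the Markov and Chebyshev bounds shrink only polynomially in $n$, while the Chernoff bound shrinks exponentially.

```python
# Illustrative comparison (not from the article): tail bounds for
# X = X_1 + ... + X_n with independent X_i ~ Bernoulli(1/2), evaluated at
# the threshold a = 3n/4.  Markov and Chebyshev decay polynomially in n,
# while the Chernoff bound decays exponentially.
import math

def markov_bound(n, a):
    # Markov: Pr(X >= a) <= E[X] / a, with E[X] = n/2.
    return (n / 2) / a

def chebyshev_bound(n, a):
    # Chebyshev: Pr(X - E[X] >= a - E[X]) <= Var(X) / (a - E[X])^2,
    # with Var(X) = n/4.
    return (n / 4) / (a - n / 2) ** 2

def chernoff_bound(n, a):
    # Generic Chernoff bound: Pr(X >= a) <= inf_{t>0} E[e^{tX}] / e^{ta},
    # with E[e^{tX}] = ((1 + e^t) / 2)^n.  Minimized by a grid search over t,
    # working in log space to avoid overflow.
    exponents = (n * math.log((1 + math.exp(t)) / 2) - t * a
                 for t in (k / 1000 for k in range(1, 5000)))
    return math.exp(min(exponents))

for n in (20, 100, 500):
    a = 0.75 * n
    print(n, markov_bound(n, a), chebyshev_bound(n, a), chernoff_bound(n, a))
```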
The generic bound
The generic Chernoff bound for a random variable $X$ is attained by applying Markov's inequality to $e^{tX}$. This gives a bound in terms of the moment-generating function of $X$. For every $t > 0$:

$$\Pr(X \ge a) = \Pr\left(e^{tX} \ge e^{ta}\right) \le \frac{\operatorname{E}\left[e^{tX}\right]}{e^{ta}}.$$

Since this bound holds for every positive $t$, we have:

$$\Pr(X \ge a) \le \inf_{t > 0} \frac{\operatorname{E}\left[e^{tX}\right]}{e^{ta}}.$$
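As a minimal sketch of how the infimum over $t$ can be taken in practice (an illustration, not part of the article; the standard-normal choice and the grid search are assumptions made for the example), the following snippet minimizes $\operatorname{E}[e^{tX}]/e^{ta} = e^{t^2/2 - ta}$ over a grid of $t$ values and compares the result with the closed-form infimum $e^{-a^2/2}$, attained at $t = a$.

```python
# A minimal numeric sketch (illustration, not from the article) of the
# generic bound Pr(X >= a) <= inf_{t>0} E[e^{tX}] / e^{ta} for a standard
# normal X, whose moment-generating function is E[e^{tX}] = e^{t^2 / 2}.
import math

def chernoff_normal(a, t_max=10.0, steps=100000):
    # Grid search over t in (0, t_max]; the exponent t^2/2 - t*a is minimized
    # at t = a, so the infimum equals exp(-a^2 / 2).
    exponents = (t * t / 2 - t * a
                 for t in (t_max * k / steps for k in range(1, steps + 1)))
    return math.exp(min(exponents))

for a in (1.0, 2.0, 3.0):
    # Numeric minimum vs. the closed-form value exp(-a^2 / 2).
    print(a, chernoff_normal(a), math.exp(-a * a / 2))
```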
The Chernoff bound sometimes refers to the above inequality, which was first applied by Sergei Bernstein to prove the related Bernstein inequalities. It is also used to prove Hoeffding's inequality, Bennett's inequality, and McDiarmid's inequality.
This inequality can be applied generally to various classes of distributions, including sub-Gaussian distributions, sub-gamma distributions, and sums of independent random variables.
Chernoff bounds commonly refer to the case where $X$ is the sum of independent Bernoulli random variables.

When $X$ is the sum of $n$ independent random variables $X_1, \dots, X_n$, the moment-generating function of $X$ is the product of the individual moment-generating functions, giving that, for any $t > 0$,

$$\Pr(X \ge a) \le e^{-ta}\operatorname{E}\!\left[\prod_i e^{tX_i}\right] = e^{-ta}\prod_i \operatorname{E}\left[e^{tX_i}\right].$$

By performing the same analysis on the random variable $-X$, one can get the same bound in the other direction:

$$\Pr(X \le a) \le e^{ta}\prod_i \operatorname{E}\left[e^{-tX_i}\right].$$
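As an illustrative sketch of the product form (the unequal Bernoulli parameters, the thresholds, and the grid search over $t$ below are arbitrary choices for the example, not from the article), the following snippet evaluates both the upper-tail and the lower-tail bound for a sum of independent Bernoulli variables by summing the logarithms of the individual moment-generating functions.

```python
# A sketch (the summand probabilities and thresholds below are arbitrary
# example choices) of the two-sided bound obtained from the product of the
# individual moment-generating functions:
#   Pr(X >= a) <= inf_{t>0} e^{-ta} * prod_i E[e^{ t X_i}]
#   Pr(X <= a) <= inf_{t>0} e^{ ta} * prod_i E[e^{-t X_i}]
import math

def log_mgf_bernoulli(p, t):
    # log E[e^{t X_i}] for X_i ~ Bernoulli(p): log(1 - p + p e^t).
    return math.log(1 - p + p * math.exp(t))

def upper_tail(ps, a, t_grid):
    # min over t of exp(-t*a + sum_i log E[e^{t X_i}]).
    return math.exp(min(-t * a + sum(log_mgf_bernoulli(p, t) for p in ps)
                        for t in t_grid))

def lower_tail(ps, a, t_grid):
    # Same analysis applied to -X: min over t of exp(t*a + sum_i log E[e^{-t X_i}]).
    return math.exp(min(t * a + sum(log_mgf_bernoulli(p, -t) for p in ps)
                        for t in t_grid))

ps = [0.1, 0.3, 0.5, 0.7] * 25             # 100 independent Bernoulli summands
t_grid = [k / 1000 for k in range(1, 5000)]
print(upper_tail(ps, 55, t_grid))           # E[X] = 40, so this bounds Pr(X >= 55)
print(lower_tail(ps, 25, t_grid))           # ... and this bounds Pr(X <= 25)
```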