In probability theory, Gauss's inequality (or the Gauss inequality) gives an upper bound on the probability that a unimodal random variable lies more than any given distance from its mode.
Let ''X'' be a unimodal random variable with mode ''m'', and let ''τ''<sup>2</sup> be the expected value of (''X'' − ''m'')<sup>2</sup>. (''τ''<sup>2</sup> can also be expressed as (''μ'' − ''m'')<sup>2</sup> + ''σ''<sup>2</sup>, where ''μ'' and ''σ'' are the mean and standard deviation of ''X''.) Then for any positive value of ''k'',
:<math>P(|X - m| > k) \leq \begin{cases} \left( \dfrac{2\tau}{3k} \right)^2 & \text{if } k \geq \dfrac{2\tau}{\sqrt{3}} \\[8pt] 1 - \dfrac{k}{\tau\sqrt{3}} & \text{if } 0 \leq k \leq \dfrac{2\tau}{\sqrt{3}}. \end{cases}</math>
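As an illustrative sketch (the function name and the normal-distribution check below are our own choices, not part of the original result), the two branches of the bound — (2''τ''/3''k'')<sup>2</sup> when ''k'' ≥ 2''τ''/√3, and 1 − ''k''/(''τ''√3) otherwise — can be implemented and checked by simulation:

```python
import math
import random

def gauss_bound(k, tau):
    """Gauss's upper bound on P(|X - m| > k) for a unimodal X
    with mode m, where tau^2 = E[(X - m)^2]."""
    if k <= 0:
        raise ValueError("k must be positive")
    if k >= 2 * tau / math.sqrt(3):
        return (2 * tau / (3 * k)) ** 2
    return 1 - k / (tau * math.sqrt(3))

# Monte Carlo sanity check with a standard normal distribution
# (unimodal, mode m = 0, tau = sigma = 1).
random.seed(0)
samples = [random.gauss(0, 1) for _ in range(100_000)]
for k in (0.5, 1.0, 2.0, 3.0):
    tail = sum(abs(x) > k for x in samples) / len(samples)
    assert tail <= gauss_bound(k, 1.0)
```

The two branches agree at the crossover point ''k'' = 2''τ''/√3, where both give 1/3.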
The theorem was first proved by Carl Friedrich Gauss in 1823.
Extensions to higher-order moments
Winkler in 1866 extended Gauss' inequality to ''r''th moments [Winkler A. (1886) Math-Natur theorie Kl. Akad. Wiss Wien Zweite Abt 53, 6–41] where ''r'' > 0 and the distribution is unimodal with a mode of zero. This is sometimes called Camp–Meidell's inequality.
:<math>P(|X| \geq k) \leq \left( \frac{r}{r+1} \right)^r \frac{\operatorname{E}\left(|X|^r\right)}{k^r} \quad \text{if} \quad k^r \geq \frac{r^r}{r+1} \operatorname{E}\left(|X|^r\right)</math>
:<math>P(|X| \geq k) \leq 1 - \left[ \frac{k^r}{(r+1) \operatorname{E}\left(|X|^r\right)} \right]^{1/r} \quad \text{if} \quad k^r \leq \frac{r^r}{r+1} \operatorname{E}\left(|X|^r\right)</math>
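The ''r''th-moment bound — (''r''/(''r''+1))<sup>''r''</sup> E(|''X''|<sup>''r''</sup>)/''k''<sup>''r''</sup> when ''k''<sup>''r''</sup> is large enough, and 1 − [''k''<sup>''r''</sup>/((''r''+1) E(|''X''|<sup>''r''</sup>))]<sup>1/''r''</sup> otherwise — can be sketched as follows (the function name is an illustrative choice):

```python
def camp_meidell_bound(k, r, m_r):
    """Winkler / Camp-Meidell upper bound on P(|X| >= k) for a
    unimodal X with mode 0, where m_r = E[|X|^r] and r > 0."""
    if k <= 0 or r <= 0 or m_r <= 0:
        raise ValueError("k, r and m_r must be positive")
    cutoff = (r ** r / (r + 1)) * m_r  # compared against k**r
    if k ** r >= cutoff:
        return (r / (r + 1)) ** r * m_r / k ** r
    return 1 - (k ** r / ((r + 1) * m_r)) ** (1 / r)

# Setting r = 2 with m_2 = tau^2 recovers Gauss's inequality:
# (2/3)^2 * tau^2 / k^2 = 4 tau^2 / (9 k^2).
print(camp_meidell_bound(2.0, 2, 1.0))  # 1/9
```

Taking ''r'' = 2 and E(|''X''|<sup>2</sup>) = ''τ''<sup>2</sup> reproduces both branches of Gauss's bound above.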
Gauss' bound has been subsequently sharpened and extended to apply to departures from the mean rather than the mode due to the Vysochanskiï–Petunin inequality. The latter has been extended by Dharmadhikari and Joag-Dev to
:<math>P(|X| > k) \leq \max\left( \left[ \frac{r}{(r+1)k} \right]^r \operatorname{E}|X|^r ,\; \frac{s}{(s-1)k^r} \operatorname{E}|X|^r - \frac{1}{s-1} \right)</math>
where ''s'' is a constant satisfying both ''s'' > ''r'' + 1 and ''s''(''s'' − ''r'' − 1) = ''r''<sup>''r''</sup>, and ''r'' > 0.
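Since ''s''(''s'' − ''r'' − 1) = ''r''<sup>''r''</sup> is a quadratic in ''s'', the admissible root ''s'' > ''r'' + 1 has a closed form, and the Dharmadhikari–Joag-Dev bound can be sketched directly (the function name is an illustrative choice):

```python
import math

def dharmadhikari_joag_dev_bound(k, r, m_r):
    """Upper bound on P(|X| > k) for a unimodal X with mode 0,
    where m_r = E[|X|^r] and r > 0.  The constant s solves
    s*(s - r - 1) = r**r; taking the positive root of the
    quadratic s^2 - (r+1)s - r^r = 0 guarantees s > r + 1."""
    s = ((r + 1) + math.sqrt((r + 1) ** 2 + 4 * r ** r)) / 2
    assert s > r + 1
    first = (r / ((r + 1) * k)) ** r * m_r
    second = s / ((s - 1) * k ** r) * m_r - 1 / (s - 1)
    return max(first, second)
```

For ''r'' = 2 the quadratic gives ''s'' = 4, and the first term reduces to Gauss's 4''τ''<sup>2</sup>/(9''k''<sup>2</sup>) bound.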
It can be shown that these inequalities are the best possible and that further sharpening of the bounds requires that additional restrictions be placed on the distributions.
See also
* Vysochanskiï–Petunin inequality, a similar result for the distance from the mean rather than the mode
* Chebyshev's inequality, which concerns distance from the mean without requiring unimodality
* Concentration inequality, a summary of tail bounds on random variables
References
*{{Cite journal
, doi = 10.2307/2684253
, title = The Three Sigma Rule
, year = 1994
, author = Pukelsheim, F.
, journal = American Statistician
, volume = 48
, issue = 2
, pages = 88–91
, publisher = American Statistical Association
, jstor = 2684253
}}