Gauss's Inequality

In probability theory, Gauss's inequality (or the Gauss inequality) gives an upper bound on the probability that a unimodal random variable lies more than any given distance from its mode.

Let ''X'' be a unimodal random variable with mode ''m'', and let ''τ''² be the expected value of (''X'' − ''m'')². (''τ''² can also be expressed as (''μ'' − ''m'')² + ''σ''², where ''μ'' and ''σ'' are the mean and standard deviation of ''X''.) Then for any positive value of ''k'',

: \Pr(|X - m| > k) \leq \begin{cases} \left( \dfrac{2\tau}{3k} \right)^2 & \text{if } k \geq \dfrac{2\tau}{\sqrt{3}} \\[4pt] 1 - \dfrac{k}{\tau\sqrt{3}} & \text{if } 0 \leq k \leq \dfrac{2\tau}{\sqrt{3}}. \end{cases}

The theorem was first proved by Carl Friedrich Gauss in 1823.
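
As a small numerical illustration (not part of the original article), the sketch below evaluates both branches of Gauss's bound and compares them with the empirical tail probability of a standard normal variable, which is unimodal with mode ''m'' = 0 and ''τ''² = E(''X''²) = 1. The function name ``gauss_bound`` is chosen here for illustration.

```python
import numpy as np

def gauss_bound(k, tau):
    """Gauss's upper bound on Pr(|X - m| > k) for a unimodal X
    with mode m and tau^2 = E[(X - m)^2]."""
    threshold = 2 * tau / np.sqrt(3)
    if k >= threshold:
        return (2 * tau / (3 * k)) ** 2
    return 1 - k / (tau * np.sqrt(3))

# Standard normal: mode m = 0 and tau^2 = E[X^2] = 1.
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
for k in (0.5, 1.0, 2.0):
    empirical = np.mean(np.abs(x) > k)
    print(f"k={k}: empirical tail {empirical:.4f} <= bound {gauss_bound(k, 1.0):.4f}")
```

For the normal distribution the bound is loose (for example, at ''k'' = 2 it gives 1/9 ≈ 0.111 against a true tail of about 0.046), as expected for a bound that must hold over all unimodal distributions.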


Extensions to higher-order moments

Winkler in 1866 extended Gauss's inequality to ''r''th moments (Winkler A. (1866) Math.-Natur. Kl. Akad. Wiss. Wien, Zweite Abt. 53, 6–41), where ''r'' > 0 and the distribution is unimodal with a mode of zero. This is sometimes called the Camp–Meidell inequality:

: P(|X| \ge k) \le \left( \frac{r}{r+1} \right)^r \frac{\operatorname{E}|X|^r}{k^r} \quad \text{if} \quad k^r \ge \frac{r^r}{(r+1)^{r-1}} \operatorname{E}|X|^r,

: P(|X| \ge k) \le 1 - \left[ \frac{k^r}{(r+1)\operatorname{E}|X|^r} \right]^{1/r} \quad \text{if} \quad k^r \le \frac{r^r}{(r+1)^{r-1}} \operatorname{E}|X|^r.

Taking ''r'' = 2 (so that ''τ''² = E(''X''²) for a mode of zero) recovers both branches of Gauss's inequality above.

Gauss's bound was subsequently sharpened and extended to apply to departures from the mean rather than the mode; this refinement is the Vysochanskiï–Petunin inequality. The latter has in turn been extended by Dharmadhikari and Joag-Dev:

: P(|X| > k) \le \max\left( \left[ \frac{r}{(r+1)k} \right]^r \operatorname{E}|X|^r, \; \frac{s}{(s-1)k^r} \operatorname{E}|X|^r - \frac{1}{s-1} \right)

where ''s'' is a constant satisfying both ''s'' > ''r'' + 1 and ''s''(''s'' − ''r'' − 1) = ''r''<sup>''r''</sup>, and ''r'' > 0. It can be shown that these inequalities are the best possible and that further sharpening of the bounds requires placing additional restrictions on the distributions.
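
As a hedged numerical check (again not from the original article), the sketch below evaluates the Camp–Meidell bound for a standard normal variable, which is unimodal with mode 0 and fourth absolute moment E(''X''⁴) = 3. The function name ``camp_meidell_bound`` and the choice ''r'' = 4 are illustrative assumptions.

```python
import numpy as np

def camp_meidell_bound(k, r, m_r):
    """Winkler / Camp-Meidell upper bound on Pr(|X| >= k) for a
    unimodal X with mode 0, where m_r = E(|X|^r)."""
    threshold = r**r / (r + 1) ** (r - 1) * m_r  # compared against k^r
    if k**r >= threshold:
        return (r / (r + 1)) ** r * m_r / k**r
    return 1 - (k**r / ((r + 1) * m_r)) ** (1 / r)

# Standard normal: unimodal with mode 0 and E(X^4) = 3.
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
for k in (1.0, 2.0, 3.0):
    empirical = np.mean(np.abs(x) >= k)
    bound = camp_meidell_bound(k, r=4, m_r=3.0)
    print(f"k={k}: empirical tail {empirical:.4f} <= bound {bound:.4f}")
```

With ``r=2`` and ``m_r`` set to ''τ''², the same function reproduces the two branches of Gauss's inequality, since the moment threshold ''r''^''r''/(''r''+1)^(''r''−1) equals 4/3 at ''r'' = 2.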


See also

* Vysochanskiï–Petunin inequality, a similar result for the distance from the mean rather than the mode
* Chebyshev's inequality, which bounds the distance from the mean without requiring unimodality
* Concentration inequality, a summary of tail bounds on random variables


References

* Pukelsheim, F. (1994). "The Three Sigma Rule". ''The American Statistician'' 48 (2): 88–91. doi:10.2307/2684253. JSTOR 2684253.