Central moment

In probability theory and statistics, a central moment is a moment of a probability distribution of a random variable about the random variable's mean; that is, it is the expected value of a specified integer power of the deviation of the random variable from the mean. The various moments form one set of values by which the properties of a probability distribution can be usefully characterized. Central moments are used in preference to ordinary moments, computed in terms of deviations from the mean instead of from zero, because the higher-order central moments relate only to the spread and shape of the distribution, rather than also to its location.

Sets of central moments can be defined for both univariate and multivariate distributions.


Univariate moments

The n-th moment about the mean (or n-th central moment) of a real-valued random variable X is the quantity \mu_n := \operatorname{E}\left[(X - \operatorname{E}[X])^n\right], where E is the expectation operator. For a continuous univariate probability distribution with probability density function f(x), the n-th moment about the mean \mu is

\mu_n = \operatorname{E}\left[(X - \mu)^n\right] = \int_{-\infty}^{\infty} (x - \mu)^n f(x)\,\mathrm{d}x.

For random variables that have no mean, such as the Cauchy distribution, central moments are not defined.

The first few central moments have intuitive interpretations:
* The "zeroth" central moment \mu_0 is 1.
* The first central moment \mu_1 is 0 (not to be confused with the first raw moment or the expected value \mu).
* The second central moment \mu_2 is called the variance, and is usually denoted \sigma^2, where \sigma represents the standard deviation.
* The third and fourth central moments are used to define the standardized moments, which in turn are used to define skewness and kurtosis, respectively.
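These interpretations can be checked numerically. The sketch below (plain Python, with a hypothetical discrete distribution) computes the first few central moments directly from the definition:

```python
def central_moment(dist, n):
    """n-th moment about the mean of a discrete distribution {value: probability}."""
    mu = sum(x * p for x, p in dist.items())  # the mean, E[X]
    return sum((x - mu) ** n * p for x, p in dist.items())

# hypothetical example distribution
X = {0: 0.2, 1: 0.5, 3: 0.3}

mu0 = central_moment(X, 0)  # zeroth central moment: always 1
mu1 = central_moment(X, 1)  # first central moment: always 0
mu2 = central_moment(X, 2)  # second central moment: the variance
```

For this distribution the mean is 1.4 and the variance works out to 1.24; mu0 and mu1 come out as 1 and 0 up to floating-point noise.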


Properties

For all n, the n-th central moment is homogeneous of degree n:

\mu_n(cX) = c^n \mu_n(X).

Only for n such that n equals 1, 2, or 3 do we have an additivity property for random variables X and Y that are independent:

\mu_n(X + Y) = \mu_n(X) + \mu_n(Y), provided n ∈ {1, 2, 3}.

A related functional that shares the translation-invariance and homogeneity properties with the n-th central moment, but continues to have this additivity property even when n ≥ 4, is the n-th cumulant \kappa_n(X). For n = 1, the n-th cumulant is just the expected value; for n = either 2 or 3, the n-th cumulant is just the n-th central moment; for n ≥ 4, the n-th cumulant is an n-th-degree monic polynomial in the first n moments (about zero), and is also a (simpler) n-th-degree polynomial in the first n central moments.
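Both properties can be verified exactly on small discrete distributions. A minimal sketch (plain Python, hypothetical distributions), which also shows additivity failing at n = 4:

```python
from itertools import product

def central_moment(dist, n):
    # n-th moment about the mean of a discrete distribution {value: probability}
    mu = sum(x * p for x, p in dist.items())
    return sum((x - mu) ** n * p for x, p in dist.items())

def scale_dist(dist, c):
    # distribution of cX
    return {c * x: p for x, p in dist.items()}

def sum_dist(dx, dy):
    # distribution of X + Y for independent X and Y
    out = {}
    for (x, px), (y, py) in product(dx.items(), dy.items()):
        out[x + y] = out.get(x + y, 0.0) + px * py
    return out

X = {0: 0.2, 1: 0.5, 3: 0.3}
Y = {-1: 0.4, 2: 0.6}

# homogeneity: mu_n(cX) = c^n mu_n(X)
c = 2.0
for n in range(5):
    assert abs(central_moment(scale_dist(X, c), n) - c ** n * central_moment(X, n)) < 1e-9

# additivity holds for n = 1, 2, 3 ...
S = sum_dist(X, Y)
for n in (1, 2, 3):
    assert abs(central_moment(S, n) - (central_moment(X, n) + central_moment(Y, n))) < 1e-9

# ... but fails for n = 4, where the cross term 6*mu_2(X)*mu_2(Y) appears
assert abs(central_moment(S, 4) - (central_moment(X, 4) + central_moment(Y, 4))) > 1.0
```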


Relation to moments about the origin

Sometimes it is convenient to convert moments about the origin to moments about the mean. The general equation for converting the n-th-order moment about the origin to the moment about the mean is

\mu_n = \operatorname{E}\left[\left(X - \operatorname{E}[X]\right)^n\right] = \sum_{j=0}^n \binom{n}{j} (-1)^{n-j} \mu'_j \mu^{n-j},

where \mu is the mean of the distribution, and the moment about the origin is given by

\mu'_m = \int_{-\infty}^{\infty} x^m f(x)\,dx = \operatorname{E}[X^m] = \sum_{j=0}^m \binom{m}{j} \mu_j \mu^{m-j}.

For the cases n = 2, 3, 4 (which are of most interest because of the relations to variance, skewness, and kurtosis, respectively) this formula becomes (noting that \mu = \mu'_1 and \mu'_0 = 1):

\mu_2 = \mu'_2 - \mu^2,

which is commonly referred to as \operatorname{Var}(X) = \operatorname{E}[X^2] - \left(\operatorname{E}[X]\right)^2,

\begin{align} \mu_3 &= \mu'_3 - 3 \mu \mu'_2 + 2 \mu^3 \\ \mu_4 &= \mu'_4 - 4 \mu \mu'_3 + 6 \mu^2 \mu'_2 - 3 \mu^4, \end{align}

and so on, following Pascal's triangle, i.e.

\mu_5 = \mu'_5 - 5 \mu \mu'_4 + 10 \mu^2 \mu'_3 - 10 \mu^3 \mu'_2 + 4 \mu^5

(the last coefficient is 4 rather than 5 because the final two terms of the binomial expansion, 5 \mu'_1 \mu^4 and -\mu'_0 \mu^5, combine into 4 \mu^5).

The following sum is a stochastic variable having a compound distribution

W = \sum_{i=1}^M Y_i,

where the Y_i are mutually independent random variables sharing the same common distribution and M is a random integer variable independent of the Y_k with its own distribution. The moments of W are obtained as

\operatorname{E}[W^n] = \sum_{i=0}^n \operatorname{E}\left[\binom{M}{i}\right] \sum_{j=0}^i \binom{i}{j} (-1)^{i-j} \operatorname{E}\left[\left(\sum_{k=1}^j Y_k\right)^n\right],

where \operatorname{E}\left[\left(\sum_{k=1}^j Y_k\right)^n\right] is defined as zero for j = 0.
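The conversion between raw and central moments can be checked mechanically. A short sketch (plain Python, hypothetical discrete distribution) compares the binomial-sum formula against the direct definition:

```python
from math import comb

def raw_moment(dist, m):
    # m-th moment about the origin, E[X^m], for {value: probability}
    return sum(x ** m * p for x, p in dist.items())

def central_from_raw(dist, n):
    # mu_n = sum_{j=0}^n C(n,j) (-1)^{n-j} mu'_j mu^{n-j}
    mu = raw_moment(dist, 1)
    return sum(comb(n, j) * (-1) ** (n - j) * raw_moment(dist, j) * mu ** (n - j)
               for j in range(n + 1))

def central_direct(dist, n):
    mu = raw_moment(dist, 1)
    return sum((x - mu) ** n * p for x, p in dist.items())

X = {0: 0.1, 1: 0.3, 2: 0.4, 5: 0.2}
for n in range(6):
    assert abs(central_from_raw(X, n) - central_direct(X, n)) < 1e-9
```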


Symmetric distributions

In distributions that are symmetric about their means (unaffected by being reflected about the mean), all odd central moments equal zero whenever they exist, because in the formula for the n-th moment, each term involving a value of X less than the mean by a certain amount exactly cancels out the term involving a value of X greater than the mean by the same amount.
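For example, the odd central moments of a distribution symmetric about zero can be checked directly (plain Python, hypothetical distribution):

```python
def central_moment(dist, n):
    # n-th moment about the mean of a discrete distribution {value: probability}
    mu = sum(x * p for x, p in dist.items())
    return sum((x - mu) ** n * p for x, p in dist.items())

# symmetric about its mean 0: P(X = -x) = P(X = x)
X = {-2: 0.1, -1: 0.25, 0: 0.3, 1: 0.25, 2: 0.1}

odd = [central_moment(X, n) for n in (1, 3, 5)]
# each entry vanishes, up to floating-point noise
```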


Multivariate moments

For a continuous bivariate probability distribution with probability density function f(x, y), the (j, k) moment about the mean \mu = (\mu_X, \mu_Y) is

\begin{align} \mu_{j,k} &= \operatorname{E}\left[(X - \operatorname{E}[X])^j (Y - \operatorname{E}[Y])^k\right] \\ &= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - \mu_X)^j (y - \mu_Y)^k f(x, y)\,dx\,dy. \end{align}
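The same definition carries over to discrete joint distributions by replacing the integral with a sum; the sketch below (plain Python, hypothetical joint distribution) computes \mu_{1,1}, which is the covariance of X and Y:

```python
def bivariate_central_moment(joint, j, k):
    # (j, k) moment about the mean for a joint distribution {(x, y): probability}
    mx = sum(x * p for (x, _), p in joint.items())  # marginal mean of X
    my = sum(y * p for (_, y), p in joint.items())  # marginal mean of Y
    return sum((x - mx) ** j * (y - my) ** k * p for (x, y), p in joint.items())

# hypothetical joint distribution
joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.3, (1, 2): 0.4}
cov = bivariate_central_moment(joint, 1, 1)  # mu_{1,1} = Cov(X, Y)
```

Here the marginal means are 0.7 and 0.9, and the covariance works out to 0.17.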


Central moment of complex random variables

The n-th central moment for a complex random variable Z is defined as

\alpha_n = \operatorname{E}\left[(Z - \operatorname{E}[Z])^n\right].

The absolute n-th central moment of Z is defined as

\beta_n = \operatorname{E}\left[|Z - \operatorname{E}[Z]|^n\right].

The 2nd-order central moment \beta_2 is called the variance of Z, whereas the 2nd-order central moment \alpha_2 is the pseudo-variance of Z.
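The distinction between variance and pseudo-variance can be seen with a small discrete complex distribution. A minimal sketch (plain Python, hypothetical values; the names var and pvar are illustrative):

```python
def complex_central_moments(values, probs):
    # variance E[|Z - E[Z]|^2] (real, non-negative) and
    # pseudo-variance E[(Z - E[Z])^2] (complex in general)
    mean = sum(z * p for z, p in zip(values, probs))
    var = sum(abs(z - mean) ** 2 * p for z, p in zip(values, probs))
    pvar = sum((z - mean) ** 2 * p for z, p in zip(values, probs))
    return var, pvar

zs = [1 + 1j, -1 + 0j, 0 - 2j]
ps = [0.5, 0.25, 0.25]
var, pvar = complex_central_moments(zs, ps)
# var is a real number; pvar generally has a nonzero imaginary part
```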


See also

* Standardized moment
* Image moment
* Complex random variable

