The multivariate stable distribution is a multivariate probability distribution that is a multivariate generalisation of the univariate stable distribution. The multivariate stable distribution defines linear relations between stable distribution marginals. In the same way as for the univariate case, the distribution is defined in terms of its characteristic function.
The multivariate stable distribution can also be thought of as an extension of the multivariate normal distribution. It has a parameter, ''α'', defined over the range 0 < ''α'' ≤ 2, where the case ''α'' = 2 is equivalent to the multivariate normal distribution. It has an additional skew parameter that allows for non-symmetric distributions, whereas the multivariate normal distribution is symmetric.
Definition
Let 𝕊 = {u ∈ R^d : |u| = 1} be the unit sphere in R^d. A random vector, X, has a multivariate stable distribution - denoted as X ~ S(α, Λ, δ) - if the joint characteristic function of X is
: E\exp(iu^T X) = \exp\left\{ -\int_{s \in \mathbb{S}} \left( |u^T s|^\alpha + i\,\nu(u^T s, \alpha) \right) \Lambda(ds) + i\,u^T \delta \right\}
where 0 < ''α'' < 2, and for y ∈ R
: \nu(y,\alpha) = \begin{cases} -\operatorname{sign}(y)\, \tan(\pi\alpha/2)\, |y|^\alpha & \alpha \ne 1 \\ \tfrac{2}{\pi}\, y \ln|y| & \alpha = 1 \end{cases}
This is essentially the result of Feldheim, that any stable random vector can be characterized by a spectral measure Λ (a finite measure on 𝕊) and a shift vector δ ∈ R^d.
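As a quick check of the definition (a worked special case, not taken from the sources cited here): let d = 1 and let Λ place masses λ₁ and λ₂ at the two points s = +1 and s = −1 of 𝕊 = {−1, +1}. For α ≠ 1 the integral reduces to a two-term sum, and the characteristic function becomes
: E\exp(iuX) = \exp\left\{ -(\lambda_1+\lambda_2)\,|u|^\alpha \left[ 1 - i\,\frac{\lambda_1-\lambda_2}{\lambda_1+\lambda_2}\, \tan(\pi\alpha/2)\, \operatorname{sign}(u) \right] + iu\delta \right\}
which is the univariate stable characteristic function with scale γ^α = λ₁ + λ₂ and skewness β = (λ₁ − λ₂)/(λ₁ + λ₂).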
Parametrization using projections
Another way to describe a stable random vector is in terms of projections. For any vector u, the projection u^T X is univariate α-stable with some skewness β(u), scale γ(u) and some shift δ(u). The notation X ~ S(α, β(·), γ(·), δ(·)) is used if X is stable with u^T X ~ s(α, β(u), γ(u), δ(u)) for every u ∈ R^d. This is called the projection parameterization.
The spectral measure determines the projection parameter functions by:
: \gamma(u) = \left( \int_{s \in \mathbb{S}} |u^T s|^\alpha \, \Lambda(ds) \right)^{1/\alpha}
: \beta(u) = \gamma(u)^{-\alpha} \int_{s \in \mathbb{S}} |u^T s|^\alpha \operatorname{sign}(u^T s)\, \Lambda(ds)
: \delta(u) = \begin{cases} u^T \delta & \alpha \ne 1 \\ u^T \delta - \tfrac{2}{\pi} \int_{s \in \mathbb{S}} u^T s \,\ln|u^T s| \, \Lambda(ds) & \alpha = 1 \end{cases}
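For a discrete spectral measure (see the discrete case below) these integrals become finite sums, and the projection parameters can be computed directly. A minimal sketch in Python follows; the function and variable names are illustrative rather than taken from the packages listed under Resources, and α ≠ 1 is assumed.
<syntaxhighlight lang="python">
import numpy as np

def projection_parameters(u, s, lam, alpha, delta):
    """Projection parameters gamma(u), beta(u), delta(u) for a discrete
    spectral measure with masses lam[j] at unit vectors s[j] (rows of s),
    following the formulas above.  Assumes alpha != 1."""
    t = s @ u                                       # projections u^T s_j
    gamma_alpha = np.sum(np.abs(t) ** alpha * lam)  # gamma(u)^alpha
    gamma = gamma_alpha ** (1.0 / alpha)
    beta = np.sum(np.abs(t) ** alpha * np.sign(t) * lam) / gamma_alpha
    shift = u @ delta                               # delta(u) when alpha != 1
    return gamma, beta, shift
</syntaxhighlight>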
Special cases
There are special cases where the multivariate characteristic function takes a simpler form. Define the characteristic function of a stable marginal as
: \omega(y \,|\, \alpha, \beta) = \begin{cases} |y|^\alpha \left[ 1 - i\beta\, \tan(\pi\alpha/2)\, \operatorname{sign}(y) \right] & \alpha \ne 1 \\ |y| \left[ 1 + i\beta\, \tfrac{2}{\pi}\, \operatorname{sign}(y)\, \ln|y| \right] & \alpha = 1 \end{cases}
Isotropic multivariate stable distribution
The characteristic function is
: E\exp(iu^T X) = \exp\left\{ -\gamma_0^\alpha |u|^\alpha + i\,u^T \delta \right\}
The spectral measure is continuous and uniform, leading to radial/isotropic symmetry.
For the multinormal case ''α'' = 2, this corresponds to independent components, but this is not the case when ''α'' < 2. Isotropy is a special case of ellipticity (see the next paragraph) – just take Σ to be a multiple of the identity matrix.
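To see why (a short argument, not spelled out above): for ''α'' = 2 the isotropic characteristic function factors over the coordinates,
: \exp\left\{ -\gamma_0^2 |u|^2 \right\} = \prod_{j=1}^{d} \exp\left\{ -\gamma_0^2 u_j^2 \right\},
a product of univariate Gaussian characteristic functions, so the components are independent. For ''α'' < 2, the term |u|^\alpha = \left( \sum_j u_j^2 \right)^{\alpha/2} does not separate into a sum over coordinates, so the joint characteristic function does not factor and the components are dependent.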
Elliptically contoured multivariate stable distribution
The
elliptically contoured multivariate stable distribution is a special symmetric case of the multivariate stable distribution.
If ''X'' is ''α''-stable and elliptically contoured, then it has joint characteristic function
: E\exp(iu^T X) = \exp\left\{ -\left( u^T \Sigma u \right)^{\alpha/2} + i\,u^T \delta \right\}
for some shift vector δ ∈ R^d (equal to the mean when it exists) and some positive definite matrix Σ (akin to a correlation matrix, although the usual definition of correlation fails to be meaningful).
Note the relation to the characteristic function of the multivariate normal distribution,
: E\exp(iu^T X) = \exp\left\{ -u^T \Sigma u + i\,u^T \delta \right\},
obtained when ''α'' = 2.
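Elliptically contoured stable vectors can be simulated through the sub-Gaussian construction of Samorodnitsky and Taqqu, X = δ + √W·G, where W is a totally skewed positive (α/2)-stable variable and G is a centred Gaussian vector. The sketch below is not taken from the packages listed under Resources; the scale constant and the factor 2 in the Gaussian covariance follow the Samorodnitsky–Taqqu convention and SciPy's default S1 parameterization, and should be checked against whatever stable sampler is actually used.
<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import levy_stable

def sample_elliptical_stable(alpha, Sigma, delta, n, seed=None):
    """Draw n elliptically contoured alpha-stable vectors (0 < alpha < 2) as
    X = delta + sqrt(W) * G, with W totally skewed positive (alpha/2)-stable
    and G ~ N(0, 2*Sigma).  Normalization constants are convention-dependent;
    verify before relying on the exact scale."""
    rng = np.random.default_rng(seed)
    delta = np.asarray(delta, dtype=float)
    Sigma = np.asarray(Sigma, dtype=float)
    # totally skewed positive (alpha/2)-stable subordinator, W >= 0
    w = levy_stable.rvs(alpha / 2, 1.0, loc=0.0,
                        scale=np.cos(np.pi * alpha / 4) ** (2.0 / alpha),
                        size=n, random_state=rng)
    g = rng.multivariate_normal(np.zeros(len(delta)), 2.0 * Sigma, size=n)
    return delta + np.sqrt(w)[:, None] * g
</syntaxhighlight>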
Independent components
If the marginals are independent with X_j ~ S(α, β_j, γ_j, δ_j), then the characteristic function is
: E\exp(iu^T X) = \exp\left\{ -\sum_{j=1}^{d} \omega(u_j \,|\, \alpha, \beta_j)\, \gamma_j^\alpha + i\,u^T \delta \right\}
Observe that when ''α'' = 2 this reduces again to the multivariate normal; note that the iid case and the isotropic case do not coincide when ''α'' < 2.
Independent components is a special case of discrete spectral measure (see next paragraph), with the spectral measure supported by the standard unit vectors.
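A minimal sketch of sampling from the independent-component case with SciPy's levy_stable (parameter values below are illustrative; SciPy's default S1 parameterization is assumed):
<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import levy_stable

def sample_independent_stable(alpha, beta, gamma, delta, n, seed=None):
    """Draw n vectors whose j-th component is univariate stable with
    parameters (alpha, beta[j], gamma[j], delta[j]), components independent."""
    beta, gamma, delta = map(np.atleast_1d, (beta, gamma, delta))
    rng = np.random.default_rng(seed)
    cols = [levy_stable.rvs(alpha, b, loc=d, scale=g, size=n, random_state=rng)
            for b, g, d in zip(beta, gamma, delta)]
    return np.column_stack(cols)

# e.g. a 2-dimensional vector with alpha = 1.5 and a skewed first component
X = sample_independent_stable(1.5, beta=[0.5, 0.0], gamma=[1.0, 2.0],
                              delta=[0.0, 0.0], n=1000)
</syntaxhighlight>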
Discrete
If the spectral measure is discrete with mass λ_j at s_j ∈ 𝕊, j = 1, …, m, the characteristic function is
: E\exp(iu^T X) = \exp\left\{ -\sum_{j=1}^{m} \omega(u^T s_j \,|\, \alpha, 1)\, \lambda_j + i\,u^T \delta \right\}
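This sum is straightforward to evaluate numerically. A minimal sketch (illustrative names, ''α'' ≠ 1), using ω(t | α, 1) = |t|^α[1 - i tan(πα/2) sign(t)]:
<syntaxhighlight lang="python">
import numpy as np

def joint_cf(u, s, lam, alpha, delta):
    """Joint characteristic function at u for a discrete spectral measure
    with masses lam[j] at unit vectors s[j] (rows of s) and shift delta.
    Assumes alpha != 1."""
    t = s @ u                                         # u^T s_j
    omega = np.abs(t) ** alpha * (1 - 1j * np.tan(np.pi * alpha / 2) * np.sign(t))
    return np.exp(-np.sum(omega * lam) + 1j * (u @ delta))
</syntaxhighlight>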
Linear properties
If X ~ S(α, β(·), γ(·), δ(·)) is ''d''-dimensional, ''A'' is an ''m'' × ''d'' matrix, and b ∈ R^m, then ''AX'' + ''b'' is ''m''-dimensional ''α''-stable with scale function γ(A^T u), skewness function β(A^T u), and location function δ(A^T u) + u^T b.
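This follows from the projection parameterization (a one-line check, not spelled out in the sources above): for any u ∈ R^m,
: u^T (AX + b) = (A^T u)^T X + u^T b,
so the projection of ''AX'' + ''b'' along u is the projection of ''X'' along A^T u shifted by u^T b, which gives exactly the parameter functions stated above.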
Inference in the independent component model
Recently
[D. Bickson and C. Guestrin. Inference in linear models with multivariate heavy-tails. In Neural Information Processing Systems (NIPS) 2010, Vancouver, Canada, Dec. 2010. https://www.cs.cmu.edu/~bickson/stable/] it was shown how to compute inference in closed-form in a linear model (or equivalently a
factor analysis
model), involving independent component models.
More specifically, let X = (x_1, …, x_n) be a set of i.i.d. unobserved univariate random variables drawn from a stable distribution. Given a known linear relation matrix A of size n × n, the observations Y_i are assumed to be distributed as a convolution of the hidden factors X_j:
: Y_i = \sum_{j=1}^{n} A_{ij} X_j.
The inference task is to compute the most probable X_j, given the linear relation matrix A and the observations Y_i. This task can be computed in closed form in O(''n''<sup>3</sup>).
An application for this construction is
multiuser detection with stable, non-Gaussian noise.
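A sketch of the generative side of this model only (the closed-form inference procedure itself is described in the Bickson–Guestrin reference and is not reproduced here); the values of n, α and A below are purely illustrative:
<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)
n, alpha = 50, 1.5
A = rng.normal(size=(n, n))                     # known linear relation matrix
X = levy_stable.rvs(alpha, 0.0, size=n,
                    random_state=rng)           # hidden i.i.d. stable factors
Y = A @ X                                       # observations: convolution of the factors
</syntaxhighlight>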
See also
* Multivariate Cauchy distribution
* Multivariate normal distribution
Resources
* Mark Veillette's stable distribution matlab package http://www.mathworks.com/matlabcentral/fileexchange/37514
* The plots on this page were plotted using Danny Bickson's inference in linear-stable model Matlab package: https://www.cs.cmu.edu/~bickson/stable
Notes