In probability theory and statistics, the generalized chi-squared distribution (or generalized chi-square distribution) is the distribution of a quadratic function of a multinormal variable (normal vector), or a linear combination of different normal variables and squares of normal variables. Equivalently, it is also a linear sum of independent noncentral chi-square variables and a normal variable. There are several other such generalizations for which the same term is sometimes used; some of them are special cases of the family discussed here, for example the gamma distribution.
Definition
The generalized chi-squared variable may be described in multiple ways. One is to write it as a weighted sum of independent noncentral chi-square variables \chi'^2_{k_i}(\lambda_i) and a standard normal variable z:
:\tilde{\chi}(\boldsymbol{w}, \boldsymbol{k}, \boldsymbol{\lambda}, s, m) = \sum_i w_i \, \chi'^2_{k_i}(\lambda_i) + s z + m.
Here the parameters are the weights w_i, the degrees of freedom k_i and non-centralities \lambda_i of the constituent non-central chi-squares, and the coefficients s and m of the normal. Some important special cases of this have all weights w_i of the same sign, or have central chi-squared components (all \lambda_i = 0), or omit the normal term (s = 0).
Since a non-central chi-squared variable is a sum of squares of normal variables with different means, the generalized chi-square variable is also defined as a sum of squares of independent normal variables, plus an independent normal variable: that is, a quadratic in normal variables.
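As an illustration, the following minimal Python sketch (using NumPy; the parameter values w, k, lam, s, m and the helper name sample_gx2 are arbitrary choices for this example, not taken from any published code) draws samples from the first parameterization as a weighted sum of non-central chi-squares plus a normal term, and checks the sample mean and variance against the values implied by the component distributions.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Example parameters (arbitrary values, for illustration only)
w = np.array([1.0, -2.0, 0.5])    # weights of the non-central chi-square terms
k = np.array([1, 2, 3])           # their degrees of freedom
lam = np.array([0.3, 1.0, 0.2])   # their non-centralities
s, m = 2.0, -1.0                  # scale of the normal term and constant offset

def sample_gx2(n):
    """Draw n samples of sum_i w_i * chi'^2_{k_i}(lam_i) + s*z + m."""
    chi2_terms = rng.noncentral_chisquare(df=k, nonc=lam, size=(n, len(w)))
    return chi2_terms @ w + s * rng.standard_normal(n) + m

samples = sample_gx2(200_000)
# Compare with the mean and variance implied by the components:
# E = sum_i w_i (k_i + lam_i) + m,  Var = sum_i 2 w_i^2 (k_i + 2 lam_i) + s^2
print(samples.mean(), w @ (k + lam) + m)
print(samples.var(), 2 * (w**2) @ (k + 2 * lam) + s**2)
</syntaxhighlight>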
Another equivalent way is to formulate it as a quadratic form of a normal vector \boldsymbol{x}:
:\tilde{\chi} = q(\boldsymbol{x}) = \boldsymbol{x}' \mathbf{Q}_2 \boldsymbol{x} + \boldsymbol{q}_1' \boldsymbol{x} + q_0 .
Here \mathbf{Q}_2 is a matrix, \boldsymbol{q}_1 is a vector, and q_0 is a scalar. These, together with the mean \boldsymbol{\mu} and covariance matrix \mathbf{\Sigma} of the normal vector \boldsymbol{x}, parameterize the distribution.
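To make the quadratic-form parameterization concrete, here is a short sketch under the same conventions (Python/NumPy; the values of Q2, q1, q0, mu and Sigma are arbitrary examples) that evaluates q(x) on draws of the normal vector; by the above, the resulting samples follow a generalized chi-squared distribution.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)

# Example parameters of the quadratic form and of the normal vector (illustrative only)
Q2 = np.array([[2.0, 0.5], [0.5, -1.0]])    # matrix of the quadratic term
q1 = np.array([1.0, -2.0])                  # vector of the linear term
q0 = 3.0                                    # scalar offset
mu = np.array([0.5, 1.5])                   # mean of the normal vector
Sigma = np.array([[1.0, 0.3], [0.3, 2.0]])  # covariance of the normal vector

x = rng.multivariate_normal(mu, Sigma, size=200_000)   # each row is a draw of the normal vector
q = np.einsum('ni,ij,nj->n', x, Q2, x) + x @ q1 + q0   # q(x) = x'Q2 x + q1'x + q0 per draw

# Sanity check: E[q(x)] = tr(Q2 Sigma) + mu'Q2 mu + q1'mu + q0
print(q.mean(), np.trace(Q2 @ Sigma) + mu @ Q2 @ mu + q1 @ mu + q0)
</syntaxhighlight>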
For the most general case, a reduction towards a common standard form can be made by using a representation of the following form:
:X = (x+b)^\mathrm{T} D (x+b) + d^\mathrm{T} x + e,
where ''D'' is a diagonal matrix and where ''x'' represents a vector of uncorrelated standard normal random variables.
Parameter conversions
A generalized chi-square variable or distribution can be parameterized in two ways. The first is in terms of the weights w_i, the degrees of freedom k_i and non-centralities \lambda_i of the constituent non-central chi-squares, and the coefficients s and m of the added normal term. The second parameterization is using the quadratic form of a normal vector, where the parameters are the matrix \mathbf{Q}_2, the vector \boldsymbol{q}_1, and the scalar q_0, together with the mean \boldsymbol{\mu} and covariance matrix \mathbf{\Sigma} of the normal vector.
The parameters of the first expression (in terms of non-central chi-squares, a normal and a constant) can be calculated in terms of the parameters of the second expression (quadratic form of a normal vector).
The parameters of the second expression (quadratic form of a normal vector) can also be calculated in terms of the parameters of the first expression (in terms of non-central chi-squares, a normal and a constant).
There exists Matlab code to convert from one set of parameters to the other.
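As a sketch of one direction of this conversion (from the quadratic form q(x) = x'Q2 x + q1'x + q0 with x ~ N(mu, Sigma) to weights, degrees of freedom, non-centralities and the normal term), one can standardize the normal vector and diagonalize the resulting quadratic. The Python function below is a minimal illustration of that idea, with a hypothetical function name; it is not a reproduction of the published code.

<syntaxhighlight lang="python">
import numpy as np

def gx2_params_from_quadratic(Q2, q1, q0, mu, Sigma, tol=1e-12):
    """Sketch: convert q(x) = x'Q2 x + q1'x + q0, x ~ N(mu, Sigma),
    into parameters (w, k, lam, s, m) of sum_j w_j chi'^2_{k_j}(lam_j) + s z + m.
    Assumes Sigma is positive definite."""
    # Standardize: write x = mu + S y with S S' = Sigma, so y ~ N(0, I).
    S = np.linalg.cholesky(Sigma)
    Qs = (Q2 + Q2.T) / 2                       # symmetrized quadratic coefficient
    A = S.T @ Qs @ S                           # quadratic coefficient in y
    b = S.T @ (2 * Qs @ mu + q1)               # linear coefficient in y
    c = mu @ Qs @ mu + q1 @ mu + q0            # constant term

    d, R = np.linalg.eigh(A)                   # rotate y -> R z so the quadratic is diagonal
    b = R.T @ b
    nz = np.abs(d) > tol                       # directions with a genuine quadratic part

    # Complete the square in each such direction:
    # d_i z_i^2 + b_i z_i = d_i (z_i + b_i/(2 d_i))^2 - b_i^2/(4 d_i)
    w = d[nz]                                  # one chi-square term per nonzero eigenvalue
    k = np.ones(nz.sum(), dtype=int)           # each with one degree of freedom
    lam = (b[nz] / (2 * d[nz])) ** 2           # non-centralities
    s = float(np.linalg.norm(b[~nz]))          # remaining linear part collapses to one normal
    m = float(c - np.sum(b[nz] ** 2 / (4 * d[nz])))
    return w, k, lam, s, m
</syntaxhighlight>

Terms with equal weights could be merged by summing their degrees of freedom and non-centralities; keeping them separate, as above, gives an equivalent parameterization.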
Computing the PDF/CDF/inverse CDF/random numbers
The probability density, cumulative distribution, and inverse cumulative distribution functions of a generalized chi-squared variable do not have simple closed-form expressions. But there exist several methods to compute them numerically: Ruben's method, Imhof's method, the IFFT method, the ray method, the ellipse approximation, the infinite-tail approximation, the Pearson approximation and the Liu-Tang-Zhang approximation.
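As one illustration of the characteristic-function approach behind Imhof-type methods, the sketch below (Python/SciPy; a rough numerical example under the parameterization given above, not a tuned implementation) evaluates the characteristic function of the weighted sum of non-central chi-squares plus a normal term and inverts it with the Gil-Pelaez formula to obtain the CDF.

<syntaxhighlight lang="python">
import numpy as np
from scipy.integrate import quad
from scipy.stats import chi2

def gx2_cf(t, w, k, lam, s, m):
    """Characteristic function of sum_j w_j chi'^2_{k_j}(lam_j) + s z + m."""
    phi = np.exp(1j * m * t - 0.5 * s**2 * t**2)
    for wj, kj, lj in zip(w, k, lam):
        denom = 1 - 2j * wj * t
        phi *= np.exp(1j * wj * lj * t / denom) / denom ** (kj / 2)
    return phi

def gx2_cdf(x, w, k, lam, s, m, t_max=100.0):
    """CDF via Gil-Pelaez inversion: F(x) = 1/2 - (1/pi) int_0^inf Im[e^{-itx} phi(t)]/t dt.
    The integral is truncated at t_max, which is crude but adequate for this example."""
    integrand = lambda t: np.imag(np.exp(-1j * t * x) * gx2_cf(t, w, k, lam, s, m)) / t
    val, _ = quad(integrand, 0.0, t_max, limit=500)
    return 0.5 - val / np.pi

# Sanity check against an ordinary chi-squared: w=[1], k=[3], lam=[0], s=0, m=0
print(gx2_cdf(2.5, [1.0], [3], [0.0], 0.0, 0.0), chi2.cdf(2.5, df=3))
</syntaxhighlight>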
Numerical algorithms and computer code (Matlab, Python and Julia) have been published that implement some of these methods to compute the PDF, CDF, and inverse CDF, and to generate random numbers.
Which of these methods is best for computing the CDF and PDF depends on the part of the generalized chi-square distribution being evaluated and on the particular case; the infinite-tail approximation, for example, is suited to the far tails.
Applications
The generalized chi-squared is the distribution of statistical estimates in cases where the usual statistical theory does not hold, as in the examples below.
In model fitting and selection
If a predictive model is fitted by least squares, but the residuals have either autocorrelation or heteroscedasticity, then alternative models can be compared (in model selection) by relating changes in the sum of squares to an asymptotically valid generalized chi-squared distribution.
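As a minimal illustration of this setting (Python/NumPy; the design matrix, noise covariance and sample sizes are arbitrary example choices), the residual sum of squares of a least-squares fit with correlated Gaussian noise is a quadratic form of a normal vector, and hence follows a generalized chi-squared distribution:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)

n, p = 50, 3
X = rng.standard_normal((n, p))                    # example design matrix
idx = np.arange(n)
Sigma = 0.6 ** np.abs(np.subtract.outer(idx, idx)) # AR(1)-style autocorrelated noise covariance

H = X @ np.linalg.solve(X.T @ X, X.T)              # hat matrix of the least-squares fit
M = np.eye(n) - H                                  # residual-maker matrix (idempotent)

# Residuals are e = M y; with y = X beta + noise, e = M noise, so the residual
# sum of squares e'e = noise' M noise is a quadratic form of a normal vector.
noise = rng.multivariate_normal(np.zeros(n), Sigma, size=100_000)
rss = np.einsum('ni,ij,nj->n', noise, M, noise)

print(rss.mean(), np.trace(M @ Sigma))             # E[RSS] = tr(M Sigma)
</syntaxhighlight>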
Classifying normal vectors using Gaussian discriminant analysis
If \boldsymbol{x} is a normal vector, its log likelihood is a quadratic form of \boldsymbol{x}, and is hence distributed as a generalized chi-squared. The log likelihood ratio that \boldsymbol{x} arises from one normal distribution versus another is also a quadratic form, so it is also distributed as a generalized chi-squared.
In Gaussian discriminant analysis, samples from multinormal distributions are optimally separated by using a quadratic classifier, a boundary that is a quadratic function (e.g. the curve defined by setting the likelihood ratio between two Gaussians to 1). The classification error rates of different types (false positives and false negatives) are integrals of the normal distributions within the quadratic regions defined by this classifier. Since this is mathematically equivalent to integrating a quadratic form of a normal vector, the result is an integral of a generalized chi-squared variable.
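For instance, the error rate of such a quadratic classifier can be written as the CDF of a generalized chi-squared variable (the log likelihood ratio) evaluated at the decision threshold; the sketch below (Python/SciPy, with two arbitrary example Gaussians) estimates it by Monte Carlo.

<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(3)

# Two example Gaussian classes (illustrative parameters)
mu_a, Sigma_a = np.array([0.0, 0.0]), np.array([[1.0, 0.2], [0.2, 1.0]])
mu_b, Sigma_b = np.array([1.5, 1.0]), np.array([[2.0, -0.3], [-0.3, 0.5]])

# The log likelihood ratio log p_a(x) - log p_b(x) is a quadratic form in x.
def llr(x):
    return (multivariate_normal.logpdf(x, mu_a, Sigma_a)
            - multivariate_normal.logpdf(x, mu_b, Sigma_b))

# Error rate for samples truly from class a: P(llr(x) < 0 | x ~ N(mu_a, Sigma_a)),
# i.e. the CDF of a generalized chi-squared variable at 0, estimated by Monte Carlo.
x_a = rng.multivariate_normal(mu_a, Sigma_a, size=200_000)
print("misclassification rate for class a:", np.mean(llr(x_a) < 0))
</syntaxhighlight>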
In signal processing
The following application arises in the context of Fourier analysis in signal processing, renewal theory in probability theory, and multi-antenna systems in wireless communication. The common factor of these areas is that the sum of exponentially distributed variables is of importance (or identically, the sum of squared magnitudes of circularly-symmetric centered complex Gaussian variables).
If z_1, \ldots, z_k are ''k'' independent, circularly-symmetric centered complex Gaussian random variables with mean 0 and variance \sigma_i^2, then the random variable
:\tilde{Q} = \sum_{i=1}^k |z_i|^2
has a generalized chi-squared distribution of a particular form. The difference from the standard chi-squared distribution is that the z_i are complex and can have different variances, and the difference from the more general generalized chi-squared distribution is that the relevant scaling matrix ''A'' is diagonal. If \sigma_i^2 = \sigma^2 for all ''i'', then \tilde{Q}, scaled down by \sigma^2/2 (i.e. multiplied by 2/\sigma^2), has a chi-squared distribution \chi^2(2k), also known as an Erlang distribution.
If the \sigma_i^2 have distinct values for all ''i'', then \tilde{Q} has the pdf
:f(x; k, \sigma_1^2, \ldots, \sigma_k^2) = \sum_{i=1}^{k} \frac{e^{-\frac{x}{\sigma_i^2}}}{\sigma_i^2 \prod_{j=1, j \neq i}^{k} \left(1 - \frac{\sigma_j^2}{\sigma_i^2}\right)}, \qquad x \geq 0.
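A quick numerical check of this density (a Python/NumPy sketch; the variances are arbitrary example values): since |z_i|^2 for a circularly-symmetric complex Gaussian of variance \sigma_i^2 is exponential with mean \sigma_i^2, \tilde{Q} can be simulated as a sum of independent exponentials and compared with the formula above.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(4)
sig2 = np.array([0.5, 1.0, 2.0])   # distinct variances (example values)
k = len(sig2)

# |z_i|^2 is exponential with mean sig2[i], so Q is a sum of independent exponentials.
Q = rng.exponential(scale=sig2, size=(500_000, k)).sum(axis=1)

def pdf(x):
    """The pdf given above, for distinct variances."""
    total = np.zeros_like(x, dtype=float)
    for i in range(k):
        prod = np.prod([1 - sig2[j] / sig2[i] for j in range(k) if j != i])
        total += np.exp(-x / sig2[i]) / (sig2[i] * prod)
    return total

x0, h = 2.0, 0.05
empirical = np.mean(np.abs(Q - x0) < h) / (2 * h)   # histogram-style density estimate at x0
print(empirical, pdf(np.array([x0]))[0])
</syntaxhighlight>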
If there are sets of repeated variances among the \sigma_i^2, assume that they are divided into ''M'' sets, each representing a certain variance value. Denote \mathbf{r} = (r_1, r_2, \ldots, r_M) to be the number of repetitions in each group. That is, the ''m''th set contains r_m variables that have variance \sigma_m^2. It represents an arbitrary linear combination of independent \chi^2-distributed random variables with different degrees of freedom:
:\tilde{Q} = \sum_{m=1}^{M} \frac{\sigma_m^2}{2} Q_m, \qquad Q_m \sim \chi^2(2 r_m).
The pdf of \tilde{Q} is[E. Björnson, D. Hammarwall, B. Ottersten (2009), "Exploiting Quantized Channel Norm Feedback through Conditional Statistics in Arbitrarily Correlated MIMO Systems", ''IEEE Transactions on Signal Processing'', 57, 4027–4041]
:f(x; \mathbf{r}, \sigma_1^2, \ldots, \sigma_M^2) = \prod_{m=1}^{M} \frac{1}{\sigma_m^{2 r_m}} \sum_{k=1}^{M} \sum_{l=1}^{r_k} \frac{\Psi_{k,l,\mathbf{r}}}{(r_k - l)!} (-x)^{r_k - l} e^{-\frac{x}{\sigma_k^2}}, \qquad x \geq 0,
where
:\Psi_{k,l,\mathbf{r}} = (-1)^{r_k - 1} \sum_{\mathbf{i} \in \Omega_{k,l}} \prod_{j \neq k} \binom{i_j + r_j - 1}{i_j} \left(\frac{1}{\sigma_j^2} - \frac{1}{\sigma_k^2}\right)^{-(r_j + i_j)},
with \mathbf{i} = [i_1, \ldots, i_M]^\mathrm{T} from the set \Omega_{k,l} of all partitions of l - 1 (with i_k = 0) defined as
:\Omega_{k,l} = \left\{ [i_1, \ldots, i_M] \in \mathbb{Z}^M : \sum_{j=1}^{M} i_j = l - 1,\; i_k = 0,\; i_j \geq 0 \text{ for all } j \right\}.
See also
* Noncentral chi-squared distribution
* Chi-squared distribution
References
External links
* Davies, R.B.: Fortran and C source code for "Linear combination of chi-squared random variables"
* Das, A: MATLAB code to compute the statistics, pdf, cdf, inverse cdf and random numbers of the generalized chi-square distribution.