In probability theory and statistics, the generalized inverse Gaussian distribution (GIG) is a three-parameter family of continuous probability distributions with probability density function

:f(x) = \frac{(a/b)^{p/2}}{2 K_p(\sqrt{ab})} x^{p-1} e^{-(ax + b/x)/2},\qquad x>0,

where ''K_p'' is a modified Bessel function of the second kind, ''a'' > 0, ''b'' > 0 and ''p'' a real parameter. It is used extensively in geostatistics, statistical linguistics, finance, etc. This distribution was first proposed by Étienne Halphen. It was rediscovered and popularised by Ole Barndorff-Nielsen, who called it the generalized inverse Gaussian distribution. Its statistical properties are discussed in Bent Jørgensen's lecture notes.
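
As a quick numerical illustration of this density (a minimal sketch, not part of the original article; the function name gig_pdf and the parameter values are ours), one can evaluate f(x) directly with SciPy's modified Bessel function and check that it integrates to one:

<syntaxhighlight lang="python">
import numpy as np
from scipy.integrate import quad
from scipy.special import kv  # modified Bessel function of the second kind, K_p

def gig_pdf(x, p, a, b):
    """Density of GIG(p, a, b) for x > 0, following the formula above."""
    norm = (a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b)))
    return norm * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

# Arbitrary illustrative parameters
p, a, b = 0.7, 2.0, 3.0
total, _ = quad(gig_pdf, 0, np.inf, args=(p, a, b))
print(total)  # should be very close to 1.0
</syntaxhighlight>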


Properties


Alternative parametrization

By setting \theta = \sqrt{ab} and \eta = \sqrt{b/a}, we can alternatively express the GIG distribution as

:f(x) = \frac{1}{2\eta K_p(\theta)} \left(\frac{x}{\eta}\right)^{p-1} e^{-\theta\left(x/\eta + \eta/x\right)/2},

where \theta is the concentration parameter while \eta is the scaling parameter.
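
For reference, SciPy's scipy.stats.geninvgauss uses essentially this concentration/scale form (shape parameters p and \theta, with scale playing the role of \eta). The sketch below shows the mapping we assume between the two parametrizations; it is worth checking against the installed SciPy version:

<syntaxhighlight lang="python">
import numpy as np
from scipy.special import kv
from scipy.stats import geninvgauss

def gig_pdf(x, p, a, b):
    """GIG density in the (a, b, p) parametrization used above."""
    norm = (a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b)))
    return norm * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

p, a, b = 0.7, 2.0, 3.0
theta, eta = np.sqrt(a * b), np.sqrt(b / a)  # concentration and scale

x = np.linspace(0.05, 10.0, 200)
# geninvgauss takes (p, theta) as shape parameters, with scale = eta
assert np.allclose(gig_pdf(x, p, a, b), geninvgauss.pdf(x, p, theta, scale=eta))
</syntaxhighlight>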


Summation

Barndorff-Nielsen and Halgreen proved that the GIG distribution is infinitely divisible.


Entropy

The entropy of the generalized inverse Gaussian distribution is given as

: \begin{align} H = \frac{1}{2} \log \left( \frac b a \right) & + \log \left(2 K_p\left(\sqrt{ab}\right)\right) - (p-1) \frac{\left[\frac{d}{d\nu}K_\nu\left(\sqrt{ab}\right)\right]_{\nu=p}}{K_p\left(\sqrt{ab}\right)} \\ & + \frac{\sqrt{ab}}{2 K_p\left(\sqrt{ab}\right)}\left( K_{p+1}\left(\sqrt{ab}\right) + K_{p-1}\left(\sqrt{ab}\right)\right) \end{align}

where \left[\frac{d}{d\nu}K_\nu\left(\sqrt{ab}\right)\right]_{\nu=p} is a derivative of the modified Bessel function of the second kind with respect to the order \nu evaluated at \nu=p.
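
The expression can be cross-checked numerically; in the sketch below (our own, with the order derivative of K_\nu approximated by a central difference) the closed form is compared against the entropy integral -\int_0^\infty f(x)\log f(x)\,dx:

<syntaxhighlight lang="python">
import numpy as np
from scipy.integrate import quad
from scipy.special import kv

def gig_pdf(x, p, a, b):
    norm = (a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b)))
    return norm * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

def gig_entropy(p, a, b, h=1e-5):
    """Closed-form GIG entropy; d/dnu K_nu(sqrt(ab)) is approximated by a central difference."""
    w = np.sqrt(a * b)
    dK_dnu = (kv(p + h, w) - kv(p - h, w)) / (2 * h)
    return (0.5 * np.log(b / a)
            + np.log(2 * kv(p, w))
            - (p - 1) * dK_dnu / kv(p, w)
            + w * (kv(p + 1, w) + kv(p - 1, w)) / (2 * kv(p, w)))

p, a, b = 0.7, 2.0, 3.0

def integrand(x):
    f = gig_pdf(x, p, a, b)
    return -f * np.log(f) if f > 0.0 else 0.0  # guard against log(0) where f underflows

numeric, _ = quad(integrand, 0, np.inf)
print(gig_entropy(p, a, b), numeric)  # the two values should agree to several digits
</syntaxhighlight>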


Characteristic function

The characteristic function of a random variable X\sim GIG(p, a, b) is given as

: E\left(e^{itX}\right) = \left(\frac{a}{a - 2it}\right)^{\frac{p}{2}} \frac{K_p\left(\sqrt{(a - 2it)b}\right)}{K_p\left(\sqrt{ab}\right)}

for t \in \mathbb{R}, where i denotes the imaginary number.
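
A sketch that cross-checks the closed form against direct numerical integration of E\left(e^{itX}\right) (our own helper names; it assumes the installed SciPy's kv accepts complex arguments, which the AMOS-backed implementation does in our experience):

<syntaxhighlight lang="python">
import numpy as np
from scipy.integrate import quad
from scipy.special import kv

def gig_pdf(x, p, a, b):
    norm = (a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b)))
    return norm * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

def gig_cf(t, p, a, b):
    """Closed-form characteristic function; note the complex Bessel argument."""
    z = np.sqrt((a - 2j * t) * b)
    return (a / (a - 2j * t)) ** (p / 2) * kv(p, z) / kv(p, np.sqrt(a * b))

p, a, b, t = 0.7, 2.0, 3.0, 1.3
# Direct numerical E[exp(itX)], integrating real and imaginary parts separately
re, _ = quad(lambda x: np.cos(t * x) * gig_pdf(x, p, a, b), 0, np.inf)
im, _ = quad(lambda x: np.sin(t * x) * gig_pdf(x, p, a, b), 0, np.inf)
print(gig_cf(t, p, a, b), complex(re, im))  # the two should agree closely
</syntaxhighlight>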


Related distributions


Special cases

The inverse Gaussian and gamma distributions are special cases of the generalized inverse Gaussian distribution for ''p'' = −1/2 and ''b'' = 0, respectively. Specifically, an inverse Gaussian distribution of the form

: f(x;\mu,\lambda) = \left[\frac{\lambda}{2 \pi x^3}\right]^{1/2} \exp\left(-\frac{\lambda (x-\mu)^2}{2 \mu^2 x}\right)

is a GIG with a = \lambda/\mu^2, b = \lambda, and p=-1/2. A Gamma distribution of the form

: g(x;\alpha,\beta) = \beta^\alpha \frac{1}{\Gamma(\alpha)} x^{\alpha-1} e^{-\beta x}

is a GIG with a = 2 \beta, b = 0, and p = \alpha. Other special cases include the inverse-gamma distribution, for ''a'' = 0.
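
These identities can be verified numerically; the sketch below (our own helper names, with the gamma case taken as a small-''b'' limit, since ''b'' = 0 itself degenerates the Bessel normalization) compares the densities directly:

<syntaxhighlight lang="python">
import numpy as np
from scipy.special import gammaln, kv

def gig_pdf(x, p, a, b):
    norm = (a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b)))
    return norm * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

def invgauss_pdf(x, mu, lam):
    return np.sqrt(lam / (2 * np.pi * x ** 3)) * np.exp(-lam * (x - mu) ** 2 / (2 * mu ** 2 * x))

def gamma_pdf(x, alpha, beta):
    return np.exp(alpha * np.log(beta) - gammaln(alpha) + (alpha - 1) * np.log(x) - beta * x)

x = np.linspace(0.1, 8.0, 100)

# Inverse Gaussian(mu, lambda) equals GIG(p = -1/2, a = lambda/mu^2, b = lambda)
mu, lam = 1.5, 2.0
assert np.allclose(invgauss_pdf(x, mu, lam), gig_pdf(x, -0.5, lam / mu ** 2, lam))

# Gamma(shape alpha, rate beta) is recovered as b -> 0 with a = 2*beta, p = alpha
alpha, beta = 2.3, 1.7
assert np.allclose(gamma_pdf(x, alpha, beta), gig_pdf(x, alpha, 2 * beta, 1e-12), rtol=1e-4)
</syntaxhighlight>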


Conjugate prior for Gaussian

The GIG distribution is conjugate to the normal distribution when serving as the mixing distribution in a normal variance-mean mixture. Let the prior distribution for some hidden variable, say z, be GIG:

: P(z\mid a,b,p) = \operatorname{GIG}(z\mid a,b,p)

and let there be T observed data points, X=x_1,\ldots,x_T, with normal likelihood function, conditioned on z:

: P(X\mid z,\alpha,\beta) = \prod_{i=1}^T N(x_i\mid\alpha+\beta z,z)

where N(x\mid\mu,v) is the normal distribution, with mean \mu and variance v. Then the posterior for z, given the data, is also GIG:

: P(z\mid X,a,b,p,\alpha,\beta) = \operatorname{GIG}\left(z\mid a+T\beta^2,\, b+S,\, p-\frac T 2 \right)

where \textstyle S = \sum_{i=1}^T (x_i-\alpha)^2. Due to the conjugacy, these details can be derived without solving integrals, by noting that

:P(z\mid X,a,b,p,\alpha,\beta)\propto P(z\mid a,b,p)\,P(X\mid z,\alpha,\beta).

Omitting all factors independent of z, the right-hand side can be simplified to give an ''un-normalized'' GIG distribution, from which the posterior parameters can be identified.
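
A small sketch of this update (our own function and variable names; the data are simulated only for illustration) computes the posterior hyperparameters and verifies the proportionality argument numerically:

<syntaxhighlight lang="python">
import numpy as np
from scipy.special import kv

def gig_pdf(z, p, a, b):
    norm = (a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b)))
    return norm * z ** (p - 1) * np.exp(-(a * z + b / z) / 2)

def norm_pdf(x, m, v):
    return np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

def gig_posterior(a, b, p, data, alpha, beta):
    """Posterior GIG hyperparameters for z given the data, per the update above."""
    data = np.asarray(data, dtype=float)
    T = data.size
    S = np.sum((data - alpha) ** 2)
    return a + T * beta ** 2, b + S, p - T / 2

# Toy data from the normal variance-mean mixture with a fixed "true" z
rng = np.random.default_rng(0)
a, b, p, alpha, beta = 1.0, 2.0, 0.5, 0.3, 0.8
z_true = 1.4
data = rng.normal(alpha + beta * z_true, np.sqrt(z_true), size=20)

a_post, b_post, p_post = gig_posterior(a, b, p, data, alpha, beta)

# Sanity check: prior * likelihood has the same shape in z as the posterior GIG density
z = np.linspace(0.2, 5.0, 50)
lik = np.ones_like(z)
for xi in data:
    lik *= norm_pdf(xi, alpha + beta * z, z)
log_ratio = np.log(gig_pdf(z, p, a, b) * lik) - np.log(gig_pdf(z, p_post, a_post, b_post))
assert np.allclose(log_ratio, log_ratio[0])  # constant up to normalization
</syntaxhighlight>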


Sichel distribution

The Sichel distribution results when the GIG is used as the mixing distribution for the Poisson parameter \lambda (Stein, Gillian Z., Walter Zucchini, and June M. Juritz, 1987. "Parameter estimation for the Sichel distribution and its multivariate extension." ''Journal of the American Statistical Association'' 82(399): 938–944).
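
A short sketch of this mixture construction (assuming the SciPy geninvgauss parametrization noted earlier; parameter values are arbitrary):

<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import geninvgauss

# Sichel samples as a Poisson-GIG mixture: lambda ~ GIG(p, a, b), then N ~ Poisson(lambda)
p, a, b = 0.7, 2.0, 3.0
theta, eta = np.sqrt(a * b), np.sqrt(b / a)  # concentration/scale form used by geninvgauss

lam = geninvgauss.rvs(p, theta, scale=eta, size=100_000, random_state=123)
counts = np.random.default_rng(0).poisson(lam)
print(counts.mean(), lam.mean())  # by the law of total expectation these should be close
</syntaxhighlight>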


Notes


References


See also

* Inverse Gaussian distribution
* Gamma distribution