Probability generating function
In probability theory, the probability generating function of a discrete random variable is a power series representation (the generating function) of the probability mass function of the random variable. Probability generating functions are often employed for their succinct description of the sequence of probabilities Pr(''X'' = ''i'') in the probability mass function for a random variable ''X'', and to make available the well-developed theory of power series with non-negative coefficients.


Definition


Univariate case

If ''X'' is a discrete random variable taking values ''x'' in the non-negative integers \{0, 1, 2, \ldots\}, then the ''probability generating function'' of ''X'' is defined as

G(z) = \operatorname{E}(z^X) = \sum_{x=0}^{\infty} p(x) z^x,

where p is the probability mass function of X. Note that the subscripted notations G_X and p_X are often used to emphasize that these pertain to a particular random variable X and to its distribution. The power series converges absolutely at least for all complex numbers z with |z| \le 1; the radius of convergence is often larger.
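To make the definition concrete, the following minimal Python sketch (not part of the original article; the parameter choices are illustrative, and the closed form e^{\lambda(z-1)} is the Poisson PGF listed under Examples below) evaluates the truncated defining series and compares it with the closed form.

```python
# Illustrative sketch (not from the article): evaluate the defining series
# G(z) = sum_x p(x) z^x for X ~ Poisson(2), truncated at x = 100, and compare
# it with the closed-form Poisson PGF e^{lambda (z - 1)} listed under Examples.
from math import exp, factorial

lam = 2.0  # illustrative rate parameter

def pmf(x: int) -> float:
    """Poisson(lam) probability mass function p(x)."""
    return exp(-lam) * lam**x / factorial(x)

def pgf(z: float, cutoff: int = 100) -> float:
    """Truncated evaluation of G(z) = E[z^X] = sum_x p(x) z^x."""
    return sum(pmf(x) * z**x for x in range(cutoff))

z = 0.7
print(pgf(z))              # series value
print(exp(lam * (z - 1)))  # closed form; agrees to machine precision
```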


Multivariate case

If X = (X_1, \ldots, X_d) is a discrete random variable taking values (x_1, \ldots, x_d) in the d-dimensional non-negative integer lattice \{0, 1, 2, \ldots\}^d, then the ''probability generating function'' of X is defined as

G(z) = G(z_1, \ldots, z_d) = \operatorname{E}\bigl(z_1^{X_1} \cdots z_d^{X_d}\bigr) = \sum_{x_1, \ldots, x_d = 0}^{\infty} p(x_1, \ldots, x_d)\, z_1^{x_1} \cdots z_d^{x_d},

where p is the probability mass function of X. The power series converges absolutely at least for all complex vectors z = (z_1, \ldots, z_d) \in \mathbb{C}^d with \max\{|z_1|, \ldots, |z_d|\} \le 1.
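The multivariate definition can be checked the same way. The sketch below is an illustrative example, not from the article: it takes two independent Bernoulli coordinates (so the joint pmf is explicit) and evaluates the joint PGF directly from the defining sum, confirming that it factors into the marginal PGFs.

```python
# Illustrative sketch (not from the article): a bivariate PGF for two
# independent Bernoulli coordinates, evaluated directly from the defining sum
# G(z1, z2) = sum_{x1, x2} p(x1, x2) z1^x1 z2^x2.
from itertools import product

p1, p2 = 0.3, 0.6  # illustrative success probabilities

def joint_pmf(x1: int, x2: int) -> float:
    """Joint pmf of independent Bernoulli(p1) and Bernoulli(p2) variables."""
    return (p1 if x1 else 1 - p1) * (p2 if x2 else 1 - p2)

def joint_pgf(z1: float, z2: float) -> float:
    return sum(joint_pmf(x1, x2) * z1**x1 * z2**x2
               for x1, x2 in product(range(2), repeat=2))

z1, z2 = 0.4, 0.9
print(joint_pgf(z1, z2))                      # from the defining sum
print((1 - p1 + p1*z1) * (1 - p2 + p2*z2))    # factors by independence
```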


Properties


Power series

Probability generating functions obey all the rules of power series with non-negative coefficients. In particular, G(1^-) = 1, where G(1^-) = \lim_{x \to 1^-} G(x), with x approaching 1 from below, since the probabilities must sum to one. So the radius of convergence of any probability generating function must be at least 1, by Abel's theorem for power series with non-negative coefficients.
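Assuming SymPy is available, the following sketch verifies G(1^-) = 1 symbolically for the geometric PGF listed under Examples below; its radius of convergence, 1/(1-p), exceeds 1, consistent with Abel's theorem.

```python
# Sketch assuming SymPy is available: check G(1^-) = 1 for the geometric PGF
# p / (1 - (1 - p) z) (listed under Examples), whose radius of convergence
# 1 / (1 - p) is strictly greater than 1.
import sympy as sp

z = sp.symbols('z')
p = sp.Rational(1, 3)              # illustrative success probability
G = p / (1 - (1 - p) * z)          # geometric PGF on {0, 1, 2, ...}

print(sp.limit(G, z, 1, dir='-'))  # 1: the probabilities sum to one
```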


Probabilities and expectations

The following properties allow the derivation of various basic quantities related to X:

# The probability mass function of X is recovered by taking derivatives of G: p(k) = \operatorname{Pr}(X = k) = \frac{G^{(k)}(0)}{k!}.
# It follows from Property 1 that if random variables X and Y have probability generating functions that are equal, G_X = G_Y, then p_X = p_Y. That is, if X and Y have identical probability generating functions, then they have identical distributions.
# The normalization of the probability mass function can be expressed in terms of the generating function by \operatorname{E}[1] = G(1^-) = \sum_{i=0}^\infty p(i) = 1. The expectation of X is given by \operatorname{E}[X] = G'(1^-). More generally, the k-th factorial moment \operatorname{E}[X(X - 1) \cdots (X - k + 1)] of X is given by \operatorname{E}\left[\frac{X!}{(X-k)!}\right] = G^{(k)}(1^-), \quad k \geq 0. So the variance of X is given by \operatorname{Var}(X) = G''(1^-) + G'(1^-) - \left[G'(1^-)\right]^2. Finally, the k-th raw moment of X is given by \operatorname{E}[X^k] = \left(z\frac{d}{dz}\right)^k G(z)\,\Big|_{z=1^-}. (These formulas are illustrated in the sketch after this list.)
# G_X(e^t) = M_X(t), where X is a random variable, G_X(t) is the probability generating function (of X) and M_X(t) is the moment-generating function (of X).
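As a hedged illustration of Properties 1 and 3 (assuming SymPy is available; the binomial PGF and the values n = 5, p = 1/4 are chosen only for concreteness), the sketch below recovers a pmf value from derivatives at 0 and checks the mean and variance formulas at 1.

```python
# Sketch assuming SymPy is available; the binomial PGF and n = 5, p = 1/4 are
# illustrative. Property 1 recovers p(k) from derivatives at 0; the moment
# formulas are checked at z = 1 (G is a polynomial, so G(1^-) = G(1)).
import sympy as sp

z = sp.symbols('z')
n, p = 5, sp.Rational(1, 4)
G = ((1 - p) + p * z) ** n                    # binomial PGF

# Property 1: p(k) = G^{(k)}(0) / k!
k = 2
pk = sp.diff(G, z, k).subs(z, 0) / sp.factorial(k)
print(sp.simplify(pk - sp.binomial(n, k) * p**k * (1 - p)**(n - k)))  # 0

# E[X] = G'(1) and Var(X) = G''(1) + G'(1) - G'(1)^2
G1 = sp.diff(G, z).subs(z, 1)
G2 = sp.diff(G, z, 2).subs(z, 1)
print(sp.simplify(G1 - n * p))                       # 0: mean is np
print(sp.simplify(G2 + G1 - G1**2 - n*p*(1 - p)))    # 0: variance is np(1-p)
```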


Functions of independent random variables

Probability generating functions are particularly useful for dealing with functions of independent random variables. For example:

* If X_i, i = 1, 2, \ldots, N, is a sequence of independent (and not necessarily identically distributed) random variables that take on natural-number values, and S_N = \sum_{i=1}^N a_i X_i, where the a_i are constant natural numbers, then the probability generating function is given by G_{S_N}(z) = \operatorname{E}\left(z^{S_N}\right) = \operatorname{E}\left(z^{\sum_{i=1}^N a_i X_i}\right) = G_{X_1}(z^{a_1})\, G_{X_2}(z^{a_2}) \cdots G_{X_N}(z^{a_N}).
* In particular, if X and Y are independent random variables, G_{X+Y}(z) = G_X(z) \cdot G_Y(z) and G_{X-Y}(z) = G_X(z) \cdot G_Y(1/z).
* In the above, the number N of independent random variables in the sequence is fixed. Assume N is a discrete random variable taking values on the non-negative integers, which is independent of the X_i, and consider the probability generating function G_N. If the X_i are not only independent but also identically distributed with common probability generating function G_X = G_{X_i}, then G_{S_N}(z) = G_N(G_X(z)). This can be seen, using the law of total expectation, as follows: \begin{align} G_{S_N}(z) &= \operatorname{E}\left(z^{S_N}\right) = \operatorname{E}\left(z^{\sum_{i=1}^N X_i}\right) \\ &= \operatorname{E}\Big(\operatorname{E}\big(z^{\sum_{i=1}^N X_i} \mid N\big)\Big) = \operatorname{E}\big((G_X(z))^N\big) = G_N(G_X(z)). \end{align} This last fact is useful in the study of Galton–Watson processes and compound Poisson processes (see the sketch after this list).
* When the X_i are not assumed identically distributed (but still independent and independent of N), we have G_{S_N}(z) = \sum_{n \ge 0} f_n \prod_{i=1}^n G_{X_i}(z), where f_n = \Pr(N = n). For identically distributed X_i, this simplifies to the identity stated above, but the general case is sometimes useful to obtain a decomposition of S_N by means of generating functions.
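The composition rule G_{S_N}(z) = G_N(G_X(z)) is easy to check symbolically. The sketch below (assuming SymPy is available) takes N Poisson and the X_i Bernoulli, a standard compound Poisson example: the composed PGF is that of a Poisson(\lambda p) variable, i.e. Poisson thinning.

```python
# Sketch assuming SymPy is available: the composition rule with N ~ Poisson(lam)
# and i.i.d. X_i ~ Bernoulli(p), a standard compound Poisson example. The
# composed PGF is that of Poisson(lam * p) ("thinning").
import sympy as sp

z = sp.symbols('z')
lam, p = sp.symbols('lambda p', positive=True)

G_N = sp.exp(lam * (z - 1))    # PGF of N ~ Poisson(lam)
G_X = (1 - p) + p * z          # PGF of X ~ Bernoulli(p)

G_SN = G_N.subs(z, G_X)        # G_{S_N}(z) = G_N(G_X(z))
print(sp.simplify(G_SN - sp.exp(lam * p * (z - 1))))  # 0: S_N ~ Poisson(lam*p)
```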


Examples

* The probability generating function of an almost surely constant random variable, i.e. one with \Pr(X = c) = 1 and \Pr(X \neq c) = 0, is G(z) = z^c.
* The probability generating function of a binomial random variable, the number of successes in n trials, with probability p of success in each trial, is G(z) = \left[(1 - p) + pz\right]^n. Note: it is the n-fold product of the probability generating function of a Bernoulli random variable with parameter p. So the probability generating function of a fair coin is G(z) = 1/2 + z/2.
* The probability generating function of a negative binomial random variable on \{0, 1, 2, \ldots\}, the number of failures until the r-th success with probability of success in each trial p, is G(z) = \left(\frac{p}{1 - (1-p)z}\right)^r, which converges for |z| < \frac{1}{1-p}. Note that this is the r-fold product of the probability generating function of a geometric random variable with parameter 1-p on \{0, 1, 2, \ldots\} (see the sketch after this list).
* The probability generating function of a Poisson random variable with rate parameter \lambda is G(z) = e^{\lambda(z-1)}.
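As a check on the negative binomial entry above (assuming SymPy is available; r = 3 and p = 1/2 are illustrative), the series coefficients of the stated PGF can be compared with the pmf \Pr(X = k) = \binom{k+r-1}{k} p^r (1-p)^k.

```python
# Sketch assuming SymPy is available; r = 3 and p = 1/2 are illustrative.
# The series coefficients of the negative binomial PGF above match the pmf
# Pr(X = k) = C(k + r - 1, k) p^r (1 - p)^k on {0, 1, 2, ...}.
import sympy as sp

z = sp.symbols('z')
r, p = 3, sp.Rational(1, 2)
G = (p / (1 - (1 - p) * z)) ** r

poly = sp.series(G, z, 0, 6).removeO()   # expand the PGF up to order z^5
for k in range(6):
    lhs = poly.coeff(z, k)
    rhs = sp.binomial(k + r - 1, k) * p**r * (1 - p)**k
    print(k, lhs == rhs)                 # True for each k
```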


Related concepts

The probability generating function is an example of a generating function of a sequence: see also formal power series. It is equivalent to, and sometimes called, the z-transform of the probability mass function. Other generating functions of random variables include the moment-generating function, the characteristic function and the cumulant generating function. The probability generating function is also equivalent to the factorial moment generating function, which as \operatorname{E}\left[t^X\right] can also be considered for continuous and other random variables.

