In probability theory, the probability generating function of a discrete random variable is a power series representation (the generating function) of the probability mass function of the random variable. Probability generating functions are often employed for their succinct description of the sequence of probabilities Pr(''X'' = ''i'') in the probability mass function for a random variable ''X'', and to make available the well-developed theory of power series with non-negative coefficients.


Definition


Univariate case

If ''X'' is a discrete random variable taking values in the non-negative integers {0, 1, ...}, then the ''probability generating function'' of ''X'' is defined as (see http://www.am.qub.ac.uk/users/g.gribakin/sor/Chap3.pdf)
:G(z) = \operatorname{E}(z^X) = \sum_{x=0}^{\infty} p(x) z^x,
where ''p'' is the probability mass function of ''X''. Note that the subscripted notations ''G''''X'' and ''p''''X'' are often used to emphasize that these pertain to a particular random variable ''X'' and to its distribution. The power series converges absolutely at least for all complex numbers ''z'' with |''z''| ≤ 1; in many examples the radius of convergence is larger.
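As a concrete illustration (a minimal numerical sketch, not part of the original text), the defining series can be truncated and evaluated in code; the Poisson probability mass function, the rate λ = 3, the evaluation point and the number of terms below are assumptions chosen only so that the result can be compared with the closed-form Poisson generating function listed under Examples.

import math

def poisson_pmf(x, lam):
    # p(x) = exp(-lam) * lam**x / x!, computed in log space to avoid overflow
    return math.exp(-lam + x * math.log(lam) - math.lgamma(x + 1))

def pgf(pmf, z, terms=200):
    # truncated power series G(z) ≈ sum_{x=0}^{terms-1} p(x) * z**x
    return sum(pmf(x) * z**x for x in range(terms))

lam, z = 3.0, 0.7
approx = pgf(lambda x: poisson_pmf(x, lam), z)
exact = math.exp(lam * (z - 1))   # closed-form Poisson PGF (see Examples below)
print(approx, exact)              # the two values agree to many decimal places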


Multivariate case

If ''X'' = (''X''1, ..., ''X''''d'') is a discrete random variable taking values in the ''d''-dimensional non-negative integer lattice {0, 1, ...}''d'', then the ''probability generating function'' of ''X'' is defined as
:G(z) = G(z_1,\ldots,z_d) = \operatorname{E}\bigl(z_1^{X_1}\cdots z_d^{X_d}\bigr) = \sum_{x_1,\ldots,x_d=0}^{\infty} p(x_1,\ldots,x_d)\, z_1^{x_1}\cdots z_d^{x_d},
where ''p'' is the probability mass function of ''X''. The power series converges absolutely at least for all complex vectors ''z'' = (''z''1, ..., ''z''''d'') with max{|''z''1|, ..., |''z''''d''|} ≤ 1.
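For ''d'' = 2, the definition can be spelled out directly (a small sketch, not from the original text; the independent Bernoulli marginals and the evaluation point are illustrative assumptions). Independence makes the joint generating function factor into the product of the marginal ones.

def joint_pgf(z1, z2, p1=0.3, p2=0.6):
    # G(z1, z2) = E[z1**X1 * z2**X2] for independent X1 ~ Bernoulli(p1), X2 ~ Bernoulli(p2),
    # computed by summing the joint pmf over the lattice {0, 1}^2
    total = 0.0
    for x1 in (0, 1):
        for x2 in (0, 1):
            prob = (p1 if x1 else 1 - p1) * (p2 if x2 else 1 - p2)  # joint pmf by independence
            total += prob * z1**x1 * z2**x2
    return total

z1, z2 = 0.4, 0.9
# the joint PGF factors into the two marginal (Bernoulli) PGFs
print(joint_pgf(z1, z2), ((1 - 0.3) + 0.3 * z1) * ((1 - 0.6) + 0.6 * z2))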


Properties


Power series

Probability generating functions obey all the rules of power series with non-negative coefficients. In particular, ''G''(1) = 1, where ''G''(1) = lim''z''→1 ''G''(''z'') from below, since the probabilities must sum to one. So the radius of convergence of any probability generating function must be at least 1, by Abel's theorem for power series with non-negative coefficients.


Probabilities and expectations

The following properties allow the derivation of various basic quantities related to ''X''; a symbolic check of Properties 1 and 3 is sketched after the list.
# The probability mass function of ''X'' is recovered by taking derivatives of ''G'':
#:p(k) = \operatorname{Pr}(X = k) = \frac{G^{(k)}(0)}{k!}.
# It follows from Property 1 that if random variables ''X'' and ''Y'' have probability generating functions that are equal, G_X = G_Y, then p_X = p_Y. That is, if ''X'' and ''Y'' have identical probability generating functions, then they have identical distributions.
# The normalization of the probability mass function can be expressed in terms of the generating function by
#:G(1^-) = \sum_{i=0}^{\infty} p(i) = 1.
#:The expectation of ''X'' is given by
#::\operatorname{E}(X) = G'(1^-).
#:More generally, the ''k''th factorial moment, \operatorname{E}(X(X-1)\cdots(X-k+1)), of ''X'' is given by
#::\operatorname{E}\left[\frac{X!}{(X-k)!}\right] = G^{(k)}(1^-), \quad k \geq 0.
#:So the variance of ''X'' is given by
#::\operatorname{Var}(X) = G''(1^-) + G'(1^-) - \left[G'(1^-)\right]^2.
#:Finally, the ''k''th raw moment of ''X'' is given by
#::\operatorname{E}[X^k] = \left(z\frac{\partial}{\partial z}\right)^k G(z)\,\Big|_{z=1^-}.
# G_X(e^t) = M_X(t), where ''X'' is a random variable, G_X(t) is the probability generating function (of ''X'') and M_X(t) is the moment-generating function (of ''X'').
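The following sketch (not part of the original text) checks Properties 1 and 3 symbolically with sympy; the binomial generating function and the concrete values ''n'' = 4, ''p'' = 1/3 are assumptions chosen for illustration.

import sympy as sp

z = sp.symbols('z')
n, p = 4, sp.Rational(1, 3)
G = ((1 - p) + p * z)**n                    # binomial PGF (see Examples below)

# Property 1: recover the pmf from derivatives of G at 0
pmf = [sp.diff(G, z, k).subs(z, 0) / sp.factorial(k) for k in range(n + 1)]
print(pmf)                                  # matches C(n, k) * p**k * (1 - p)**(n - k)

# Property 3: mean and variance from derivatives of G at 1
G1 = sp.diff(G, z).subs(z, 1)               # E[X] = G'(1)
G2 = sp.diff(G, z, 2).subs(z, 1)            # second factorial moment E[X(X - 1)] = G''(1)
print(G1)                                   # n*p = 4/3
print(sp.simplify(G2 + G1 - G1**2))         # Var(X) = n*p*(1 - p) = 8/9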


Functions of independent random variables

Probability generating functions are particularly useful for dealing with functions of independent random variables. For example:
* If ''X''1, ''X''2, ..., ''X''''N'' is a sequence of independent (and not necessarily identically distributed) random variables that take on natural-number values, and
::S_N = \sum_{i=1}^N a_i X_i,
:where the ''a''''i'' are constant natural numbers, then the probability generating function is given by
::G_{S_N}(z) = \operatorname{E}(z^{S_N}) = \operatorname{E}\left(z^{\sum_{i=1}^N a_i X_i}\right) = G_{X_1}(z^{a_1}) G_{X_2}(z^{a_2}) \cdots G_{X_N}(z^{a_N}).
:For example, if
::S_N = \sum_{i=1}^N X_i,
:then the probability generating function, G_{S_N}(z), is given by
::G_{S_N}(z) = G_{X_1}(z) G_{X_2}(z) \cdots G_{X_N}(z).
:It also follows that the probability generating function of the difference of two independent random variables ''S'' = ''X''1 − ''X''2 is
::G_S(z) = G_{X_1}(z) G_{X_2}(1/z).
* Suppose that ''N'', the number of independent random variables in the sum above, is not a fixed natural number but is also an independent, discrete random variable taking values on the non-negative integers, with probability generating function ''G''''N''. If the ''X''1, ''X''2, ..., ''X''''N'' are independent ''and'' identically distributed with common probability generating function ''G''''X'', then
::G_{S_N}(z) = G_N(G_X(z)).
:This can be seen, using the law of total expectation, as follows:
::\begin{align} G_{S_N}(z) & = \operatorname{E}(z^{S_N}) = \operatorname{E}\left(z^{\sum_{i=1}^N X_i}\right) \\ & = \operatorname{E}\bigl(\operatorname{E}(z^{\sum_{i=1}^N X_i} \mid N)\bigr) = \operatorname{E}\bigl((G_X(z))^N\bigr) = G_N(G_X(z)). \end{align}
:This last fact is useful in the study of Galton–Watson processes and compound Poisson processes; a Monte Carlo check of this identity is sketched after this list.
* Suppose again that ''N'' is also an independent, discrete random variable taking values on the non-negative integers, with probability generating function ''G''''N'' and probability mass function f_i = \Pr\{N = i\}. If the ''X''1, ''X''2, ..., ''X''''N'' are independent, but ''not'' identically distributed random variables, where G_{X_i} denotes the probability generating function of X_i, then
::G_{S_N}(z) = \sum_{i \geq 1} f_i \prod_{k=1}^i G_{X_k}(z).
:For identically distributed ''X''''i'' this simplifies to the identity stated before. The general case is sometimes useful to obtain a decomposition of ''S''''N'' by means of generating functions.
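The random-sum identity G_{S_N}(z) = G_N(G_X(z)) can be checked empirically (a Monte Carlo sketch, not from the original text; the Poisson distribution of ''N'', the Bernoulli summands and all numeric parameters are assumptions chosen for illustration).

import math
import random

random.seed(0)
mu, p, z, trials = 2.5, 0.4, 0.8, 200_000   # illustrative parameters

def sample_S():
    # draw N ~ Poisson(mu) by counting exponential inter-arrival times in [0, 1),
    # then return the sum of N iid Bernoulli(p) variables
    n, t = 0, random.expovariate(mu)
    while t < 1.0:
        n += 1
        t += random.expovariate(mu)
    return sum(1 for _ in range(n) if random.random() < p)

def G_X(s):
    return (1 - p) + p * s                  # Bernoulli PGF (see Examples)

def G_N(s):
    return math.exp(mu * (s - 1))           # Poisson PGF (see Examples)

mc_estimate = sum(z**sample_S() for _ in range(trials)) / trials  # estimate of E[z**S_N]
print(mc_estimate, G_N(G_X(z)))             # the estimate is close to the exact composition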


Examples

* The probability generating function of an almost surely constant random variable, i.e. one with Pr(''X'' = ''c'') = 1, is
::G(z) = z^c.
* The probability generating function of a binomial random variable, the number of successes in ''n'' trials with probability ''p'' of success in each trial, is
::G(z) = \left[(1-p) + pz\right]^n.
:Note that this is the ''n''-fold product of the probability generating function of a Bernoulli random variable with parameter ''p''. So the probability generating function of a fair coin is
::G(z) = 1/2 + z/2.
* The probability generating function of a negative binomial random variable on {0, 1, 2, ...}, the number of failures until the ''r''th success with probability of success ''p'' in each trial, is
::G(z) = \left(\frac{p}{1-(1-p)z}\right)^r,
:which converges for |z| < \frac{1}{1-p}. Note that this is the ''r''-fold product of the probability generating function of a geometric random variable with parameter 1 − ''p'' on {0, 1, 2, ...}. (This formula is checked numerically in the sketch after this list.)
* The probability generating function of a Poisson random variable with rate parameter ''λ'' is
::G(z) = e^{\lambda(z-1)}.
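The negative binomial formula above can be verified numerically (a short sketch, not from the original text; the parameter values and the truncation length are arbitrary assumptions, with the evaluation point chosen inside the stated region of convergence).

import math

p, r, z = 0.35, 3, 0.9                      # need |z| < 1/(1 - p) ≈ 1.54

def negbin_pmf(k):
    # P(X = k) = C(k + r - 1, k) * p**r * (1 - p)**k, the number of failures before the r-th success
    return math.comb(k + r - 1, k) * p**r * (1 - p)**k

series = sum(negbin_pmf(k) * z**k for k in range(2000))   # truncated series sum_k P(X = k) z**k
closed = (p / (1 - (1 - p) * z))**r                        # closed-form PGF from the list above
print(series, closed)                                      # agree to floating-point accuracy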


Related concepts

The probability generating function is an example of a generating function of a sequence: see also formal power series. It is equivalent to, and sometimes called, the z-transform of the probability mass function. Other generating functions of random variables include the moment-generating function, the characteristic function and the cumulant generating function. The probability generating function is also equivalent to the factorial moment generating function, which as \operatorname{E}\left[t^X\right] can also be considered for continuous and other random variables.

