Kolmogorov's three-series theorem

In probability theory, Kolmogorov's three-series theorem, named after Andrey Kolmogorov, gives a criterion for the almost sure convergence of an infinite series of random variables in terms of the convergence of three different series involving properties of their probability distributions. Kolmogorov's three-series theorem, combined with Kronecker's lemma, can be used to give a relatively easy proof of the strong law of large numbers.


Statement of the theorem

Let (X_n)_{n \in \mathbb{N}} be independent random variables. The random series \sum_{n=1}^{\infty} X_n converges almost surely in \mathbb{R} if the following conditions hold for some A > 0, and only if the following conditions hold for every A > 0. Here Y_n = X_n \mathbf{1}_{\{|X_n| \le A\}} denotes the truncation of X_n at level A.
:(i) The series \sum_{n=1}^{\infty} \Pr(|X_n| \ge A) converges.
:(ii) The series \sum_{n=1}^{\infty} \mathbb{E}[Y_n] converges.
:(iii) The series \sum_{n=1}^{\infty} \operatorname{Var}(Y_n) converges.
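As a quick sanity check of the statement (an observation added here, not part of the original text): if the X_n are uniformly bounded, say |X_n| \le C for all n, then for any level A > C the first series vanishes identically and Y_n = X_n, so the criterion reduces to the two-series form: \sum_{n=1}^{\infty} X_n converges almost surely if and only if both \sum_{n=1}^{\infty} \mathbb{E}[X_n] and \sum_{n=1}^{\infty} \operatorname{Var}(X_n) converge.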


Proof


Sufficiency of conditions ("if")

Condition (i) and the Borel–Cantelli lemma give that X_n = Y_n for all sufficiently large n, almost surely. Hence \sum_{n=1}^{\infty} X_n converges if and only if \sum_{n=1}^{\infty} Y_n converges. Conditions (ii)–(iii) and Kolmogorov's two-series theorem give the almost sure convergence of \sum_{n=1}^{\infty} Y_n.
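In symbols, the Borel–Cantelli step reads as follows (a restatement added for clarity, not verbatim from the source): since X_n \neq Y_n exactly when |X_n| \ge A, condition (i) gives
: \sum_{n=1}^{\infty} \Pr(X_n \neq Y_n) = \sum_{n=1}^{\infty} \Pr(|X_n| \ge A) < \infty,
so by the first Borel–Cantelli lemma, almost surely X_n = Y_n for all but finitely many n, and the two series differ in only finitely many terms.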


Necessity of conditions ("only if")

Suppose that \sum_{n=1}^{\infty} X_n converges almost surely. Without condition (i), the second Borel–Cantelli lemma would give some A > 0 such that |X_n| \ge A for infinitely many n, almost surely. But then the terms of the series would not tend to zero, so the series would diverge. Therefore, we must have condition (i).

Next, we see that condition (iii) implies condition (ii): Kolmogorov's two-series theorem, along with condition (i) applied to the case A = 1, gives the convergence of \sum_{n=1}^{\infty} (Y_n - \mathbb{E}[Y_n]). So, given the convergence of \sum_{n=1}^{\infty} Y_n, the series \sum_{n=1}^{\infty} \mathbb{E}[Y_n] converges as well, and condition (ii) follows. Thus, it only remains to demonstrate the necessity of condition (iii), and we will have obtained the full result.

It is equivalent to check condition (iii) for the symmetrized series \sum_{n=1}^{\infty} Z_n = \sum_{n=1}^{\infty} (Y_n - Y'_n), where for each n the variables Y_n and Y'_n are independent and identically distributed; that is, we may employ the assumption that \mathbb{E}[Z_n] = 0, since (Z_n) is a sequence of random variables bounded by 2, whose series converges almost surely, and with \operatorname{Var}(Z_n) = 2\operatorname{Var}(Y_n). So we wish to check that if \sum_{n=1}^{\infty} Z_n converges, then \sum_{n=1}^{\infty} \operatorname{Var}(Z_n) converges as well. This is a special case of a more general result from martingale theory, with summands equal to the increments of a martingale sequence and the same conditions: \mathbb{E}[Z_n] = 0, the series of the variances converges, and the summands are bounded (Feller 1971, Sect. IX.9).
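To spell out the symmetrization step (a worked computation added here, under the setup above with A = 1): since Y'_n is an independent copy of Y_n,
: \mathbb{E}[Z_n] = \mathbb{E}[Y_n] - \mathbb{E}[Y'_n] = 0, \qquad |Z_n| \le |Y_n| + |Y'_n| \le 2, \qquad \operatorname{Var}(Z_n) = \operatorname{Var}(Y_n) + \operatorname{Var}(Y'_n) = 2\operatorname{Var}(Y_n).
Moreover, \sum_{n=1}^{\infty} Y'_n has the same distribution as \sum_{n=1}^{\infty} Y_n, so it also converges almost surely, and hence so does \sum_{n=1}^{\infty} Z_n. Convergence of \sum_{n=1}^{\infty} \operatorname{Var}(Z_n) is therefore equivalent to condition (iii).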


Example

As an illustration of the theorem, consider the example of the ''harmonic series with random signs'':
: \sum_{n=1}^{\infty} \pm \frac{1}{n}.
Here, "\pm" means that each term 1/n is taken with a random sign that is either 1 or -1 with respective probabilities 1/2,\ 1/2, and all random signs are chosen independently. Letting X_n in the theorem denote a random variable that takes the values 1/n and -1/n with equal probabilities, and taking A = 2, the summands of the first two series are identically zero and \operatorname{Var}(Y_n) = n^{-2}. The conditions of the theorem are then satisfied, so it follows that the harmonic series with random signs converges almost surely. On the other hand, the analogous series of (for example) square-root reciprocals with random signs, namely
: \sum_{n=1}^{\infty} \pm \frac{1}{\sqrt{n}},
''diverges'' almost surely, since condition (iii) in the theorem is not satisfied for any A. Note that this is different from the behavior of the analogous series with ''alternating'' signs, \sum_{n=1}^{\infty} (-1)^n/\sqrt{n}, which does converge.
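A small simulation makes this contrast tangible. The following Python sketch (an illustration added here, not part of the original article; the parameter names, sample sizes, and seed are arbitrary choices) draws a few independent sample paths of the partial sums for both series:

 import numpy as np
 
 # Simulation sketch: compare partial sums of sum_n +/- 1/n (converges a.s.)
 # with sum_n +/- 1/sqrt(n) (diverges a.s.), per the three-series theorem.
 rng = np.random.default_rng(0)   # fixed seed for reproducibility (arbitrary)
 num_terms = 100_000              # truncation point of the infinite series
 num_paths = 5                    # number of independent sample paths
 
 n = np.arange(1, num_terms + 1)
 for label, terms in [("1/n", 1.0 / n), ("1/sqrt(n)", 1.0 / np.sqrt(n))]:
     # Independent fair random signs for each term on each path.
     signs = rng.choice([-1.0, 1.0], size=(num_paths, num_terms))
     partial_sums = np.cumsum(signs * terms, axis=1)
     # How much the partial sums still move over the second half of the terms:
     # tiny for the 1/n series, persistently sizable for the 1/sqrt(n) series.
     tail_change = partial_sums[:, -1] - partial_sums[:, num_terms // 2]
     print(label, "final sums:", np.round(partial_sums[:, -1], 3),
           "late movement:", np.round(tail_change, 3))

As num_terms grows, the final values for the \pm 1/n series stabilize along each path, while those for the \pm 1/\sqrt{n} series keep drifting; of course, a finite simulation can only suggest, not prove, the almost sure behavior that the theorem establishes.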


Notes

* W. Feller, ''An Introduction to Probability Theory and Its Applications'', Vol. 2, Wiley (1971), Sect. IX.9.