In mathematics, a series is the sum of the terms of an infinite sequence of numbers. More precisely, an infinite sequence $(a_0, a_1, a_2, \ldots)$ defines a series that is denoted
:$S = a_0 + a_1 + a_2 + \cdots = \sum_{k=0}^\infty a_k.$
The $n$th partial sum $S_n$ is the sum of the first $n+1$ terms of the sequence; that is,
:$S_n = \sum_{k=0}^n a_k.$
A series is convergent (or converges) if the sequence $(S_1, S_2, S_3, \dots)$ of its partial sums tends to a limit; that means that, when adding one $a_k$ after the other ''in the order given by the indices'', one gets partial sums that become closer and closer to a given number. More precisely, a series converges if there exists a number $\ell$ such that for every arbitrarily small positive number $\varepsilon$, there is a (sufficiently large) integer $N$ such that for all $n \ge N$,
:$\left| S_n - \ell \right| < \varepsilon.$
If the series is convergent, the (necessarily unique) number $\ell$ is called the ''sum of the series''.
The same notation
:$\sum_{k=0}^\infty a_k$
is used for the series and, if it is convergent, for its sum. This convention is similar to that used for addition: $a + b$ denotes both the ''operation of adding $a$ and $b$'' and the result of this ''addition'', which is called the ''sum'' of $a$ and $b$.
Any series that is not convergent is said to be ''divergent'' or to ''diverge''.
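The $\varepsilon$–$N$ definition above can be illustrated numerically. The following is a minimal Python sketch (the helper names and the choice of the geometric series $\sum_{k=0}^\infty (1/2)^k$, whose sum is 2, are illustrative assumptions): given $\varepsilon = 10^{-6}$, it searches for an $N$ beyond which the partial sums stay within $\varepsilon$ of the limit.

```python
# Numerical illustration of convergence: the partial sums of the
# geometric series sum_{k=0}^inf (1/2)^k approach the limit 2.
def partial_sum(terms, n):
    """Return S_n = a_0 + a_1 + ... + a_n for the given term function."""
    return sum(terms(k) for k in range(n + 1))

geometric = lambda k: (1 / 2) ** k

# For epsilon = 1e-6, find the first N with |S_N - 2| < epsilon.
epsilon = 1e-6
N = 0
while abs(partial_sum(geometric, N) - 2) >= epsilon:
    N += 1
print(N, partial_sum(geometric, N))  # S_N lies within epsilon of the sum 2
```

Because the partial sums of this series increase monotonically toward 2, the first index landing within $\varepsilon$ works for every larger index as well, as the definition requires.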

Examples of convergent and divergent series

* The reciprocals of the positive integers produce a divergent series (the harmonic series):
*: $\frac{1}{1}+\frac{1}{2}+\frac{1}{3}+\frac{1}{4}+\frac{1}{5}+\frac{1}{6}+\cdots \rightarrow \infty.$
* Alternating the signs of the reciprocals of positive integers produces a convergent series (the alternating harmonic series):
*: $\frac{1}{1}-\frac{1}{2}+\frac{1}{3}-\frac{1}{4}+\frac{1}{5}-\cdots = \ln(2)$
* The reciprocals of prime numbers produce a divergent series (so the set of primes is "large"; see divergence of the sum of the reciprocals of the primes):
*: $\frac{1}{2}+\frac{1}{3}+\frac{1}{5}+\frac{1}{7}+\frac{1}{11}+\frac{1}{13}+\cdots \rightarrow \infty.$
* The reciprocals of triangular numbers produce a convergent series:
*: $\frac{1}{1}+\frac{1}{3}+\frac{1}{6}+\frac{1}{10}+\frac{1}{15}+\frac{1}{21}+\cdots = 2.$
* The reciprocals of factorials produce a convergent series (see e):
*: $\frac{1}{1}+\frac{1}{1}+\frac{1}{2}+\frac{1}{6}+\frac{1}{24}+\frac{1}{120}+\cdots = e.$
* The reciprocals of square numbers produce a convergent series (the Basel problem):
*: $\frac{1}{1}+\frac{1}{4}+\frac{1}{9}+\frac{1}{16}+\frac{1}{25}+\frac{1}{36}+\cdots = \frac{\pi^2}{6}.$
* The reciprocals of powers of 2 produce a convergent series (so the set of powers of 2 is "small"):
*: $\frac{1}{1}+\frac{1}{2}+\frac{1}{4}+\frac{1}{8}+\frac{1}{16}+\frac{1}{32}+\cdots = 2.$
* The reciprocals of powers of any $n>1$ produce a convergent series (a geometric series):
*: $\frac{1}{1}+\frac{1}{n}+\frac{1}{n^2}+\frac{1}{n^3}+\frac{1}{n^4}+\frac{1}{n^5}+\cdots = \frac{n}{n-1}.$
* Alternating the signs of reciprocals of powers of 2 also produces a convergent series:
*: $\frac{1}{1}-\frac{1}{2}+\frac{1}{4}-\frac{1}{8}+\frac{1}{16}-\frac{1}{32}+\cdots = \frac{2}{3}.$
* Alternating the signs of reciprocals of powers of any $n>1$ produces a convergent series:
*: $\frac{1}{1}-\frac{1}{n}+\frac{1}{n^2}-\frac{1}{n^3}+\frac{1}{n^4}-\frac{1}{n^5}+\cdots = \frac{n}{n+1}.$
* The reciprocals of Fibonacci numbers produce a convergent series (see the reciprocal Fibonacci constant ψ):
*: $\frac{1}{1}+\frac{1}{1}+\frac{1}{2}+\frac{1}{3}+\frac{1}{5}+\frac{1}{8}+\cdots = \psi.$
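Several of the closed forms above can be spot-checked by computing partial sums numerically. Below is a minimal Python sketch (the truncation lengths are arbitrary choices; a finite partial sum only approximates, and never proves, the stated limits):

```python
import math

def tail_sum(term, n_terms):
    """Approximate a series by its partial sum over k = 1 .. n_terms."""
    return sum(term(k) for k in range(1, n_terms + 1))

# Reciprocals of squares (Basel problem): approaches pi^2 / 6.
basel = tail_sum(lambda k: 1 / k**2, 100_000)

# Reciprocals of factorials (k = 0, 1, 2, ...): approaches e.
e_approx = sum(1 / math.factorial(k) for k in range(20))

# Reciprocals of triangular numbers k(k+1)/2: approaches 2.
tri = tail_sum(lambda k: 2 / (k * (k + 1)), 100_000)

print(basel, math.pi**2 / 6)  # agree to about 5 decimal places
print(e_approx, math.e)       # agree to machine precision
print(tri)                    # close to 2
```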

Convergence tests

There are a number of methods for determining whether a series converges or diverges.

'''Comparison test.''' The terms of the sequence $\left\{ a_n \right\}$ are compared to those of another sequence $\left\{ b_n \right\}$. If, for all $n$, $0 \le a_n \le b_n$, and $\sum_{n=1}^\infty b_n$ converges, then so does $\sum_{n=1}^\infty a_n.$ However, if, for all $n$, $0 \le b_n \le a_n$, and $\sum_{n=1}^\infty b_n$ diverges, then so does $\sum_{n=1}^\infty a_n.$

'''Ratio test.''' Assume that for all $n$, $a_n$ is not zero. Suppose that there exists $r$ such that
:$\lim_{n \to \infty} \left| \frac{a_{n+1}}{a_n} \right| = r.$
If $r < 1$, then the series is absolutely convergent. If $r > 1$, then the series diverges. If $r = 1$, the ratio test is inconclusive, and the series may converge or diverge.

'''Root test''' or '''$n$th root test.''' Suppose that the terms of the sequence in question are non-negative. Define $r$ as follows:
:$r = \limsup_{n \to \infty} \sqrt[n]{a_n},$
where "lim sup" denotes the limit superior (possibly ∞; if the limit exists, it is the same value). If $r < 1$, then the series converges. If $r > 1$, then the series diverges. If $r = 1$, the root test is inconclusive, and the series may converge or diverge.

The ratio test and the root test are both based on comparison with a geometric series, and as such they work in similar situations. In fact, if the ratio test works (meaning that the limit exists and is not equal to 1), then so does the root test; the converse, however, is not true. The root test is therefore more generally applicable, but as a practical matter the limit is often difficult to compute for commonly seen types of series.

'''Integral test.''' The series can be compared to an integral to establish convergence or divergence. Let $f(n) = a_n$ be a positive and monotonically decreasing function. If
:$\int_{1}^{\infty} f(x)\, dx = \lim_{t \to \infty} \int_{1}^{t} f(x)\, dx < \infty,$
then the series converges. But if the integral diverges, then the series does so as well.

'''Limit comparison test.''' If $\left\{ a_n \right\}, \left\{ b_n \right\} > 0$, and the limit $\lim_{n \to \infty} \frac{a_n}{b_n}$ exists and is not zero, then $\sum_{n=1}^\infty a_n$ converges if and only if $\sum_{n=1}^\infty b_n$ converges.

'''Alternating series test.''' Also known as the ''Leibniz criterion'', the alternating series test states that for an alternating series of the form $\sum_{n=1}^\infty a_n (-1)^n$, if $\left\{ a_n \right\}$ is monotonically decreasing and has a limit of 0 at infinity, then the series converges.

'''Cauchy condensation test.''' If $\left\{ a_n \right\}$ is a positive monotone decreasing sequence, then $\sum_{n=1}^\infty a_n$ converges if and only if $\sum_{k=1}^\infty 2^k a_{2^k}$ converges.

Other tests include '''Dirichlet's test''' and '''Abel's test'''.
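As a rough numerical companion to the ratio test, one can estimate $r$ from a single large index. The Python sketch below is only a heuristic (a finite index cannot establish that the limit exists), and the helper name is an illustrative choice:

```python
# Heuristic ratio-test estimate: approximate r = lim |a_{n+1} / a_n|
# by evaluating the ratio at one large index n.
def ratio_test_estimate(term, n=1000):
    """Estimate the ratio-test limit r at index n (a heuristic only)."""
    return abs(term(n + 1) / term(n))

# 1/2^n: r = 1/2 < 1, so the series converges absolutely.
print(ratio_test_estimate(lambda n: 1 / 2**n))  # exactly 0.5 here

# 1/n: r -> 1, so the ratio test is inconclusive
# (in fact the harmonic series diverges).
print(ratio_test_estimate(lambda n: 1 / n))     # close to 1
```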

Conditional and absolute convergence

For any sequence $\left\{ a_n \right\}$, $a_n \le \left| a_n \right|$ for all $n$. Therefore,
:$\sum_{n=1}^\infty a_n \le \sum_{n=1}^\infty \left| a_n \right|.$
This means that if $\sum_{n=1}^\infty \left| a_n \right|$ converges, then $\sum_{n=1}^\infty a_n$ also converges (but not vice versa).

If the series $\sum_{n=1}^\infty \left| a_n \right|$ converges, then the series $\sum_{n=1}^\infty a_n$ is ''absolutely convergent''. The Maclaurin series of the exponential function is absolutely convergent for every complex value of the variable.

If the series $\sum_{n=1}^\infty a_n$ converges but the series $\sum_{n=1}^\infty \left| a_n \right|$ diverges, then the series $\sum_{n=1}^\infty a_n$ is ''conditionally convergent''. The Maclaurin series of the logarithm function $\ln(1+x)$ is conditionally convergent for $x = 1$.

The Riemann series theorem states that if a series converges conditionally, it is possible to rearrange the terms of the series in such a way that the series converges to any value, or even diverges.
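The Riemann series theorem can be made concrete with the conditionally convergent alternating harmonic series. The Python sketch below uses the standard greedy rearrangement (add positive terms while below the target, negative terms while above); the target value 3 and the term count are arbitrary choices:

```python
# Riemann rearrangement, sketched numerically: by reordering the terms of
# the alternating harmonic series 1 - 1/2 + 1/3 - ..., the partial sums
# can be steered toward any chosen target instead of ln(2).
def rearranged_partial_sum(target, n_terms):
    """Greedy rearrangement: add the next positive term 1/odd while at or
    below the target, subtract the next negative term 1/even while above."""
    total = 0.0
    pos, neg = 1, 2  # next odd (positive) and even (negative) denominators
    for _ in range(n_terms):
        if total <= target:
            total += 1 / pos
            pos += 2
        else:
            total -= 1 / neg
            neg += 2
    return total

print(rearranged_partial_sum(3.0, 1_000_000))  # close to 3, not ln(2)
```

Because the positive and negative parts each diverge on their own while the individual terms tend to 0, the greedy procedure crosses the target infinitely often with ever smaller overshoots, so the rearranged series converges to the chosen target.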

Uniform convergence

Let $\left\{ f_1, f_2, f_3, \dots \right\}$ be a sequence of functions. The series $\sum_{n=1}^\infty f_n$ is said to ''converge uniformly'' to $f$ if the sequence $\{s_n\}$ of partial sums defined by
:$s_n(x) = \sum_{k=1}^n f_k(x)$
converges uniformly to $f$.

There is an analogue of the comparison test for infinite series of functions called the Weierstrass M-test.
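The M-test can be illustrated with the series $\sum_{n=1}^\infty x^n/n^2$ on $[-1, 1]$: since $|x^n/n^2| \le 1/n^2$ and $\sum 1/n^2$ converges, the convergence is uniform. The following Python sketch (grid size and truncation indices are arbitrary choices) checks numerically that the gap between two partial sums is small uniformly over the whole interval:

```python
# Weierstrass M-test sketch: on [-1, 1], |x^n / n^2| <= 1/n^2 and
# sum 1/n^2 converges, so sum x^n / n^2 converges uniformly there.
def partial(x, n):
    """Partial sum s_n(x) = sum_{k=1}^{n} x^k / k^2."""
    return sum(x**k / k**2 for k in range(1, n + 1))

xs = [i / 100 for i in range(-100, 101)]  # grid covering [-1, 1]
N, M = 1000, 2000
sup_gap = max(abs(partial(x, M) - partial(x, N)) for x in xs)
print(sup_gap)  # bounded by sum_{k > N} 1/k^2 ~ 1/N, independently of x
```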

Cauchy convergence criterion

The Cauchy convergence criterion states that a series
:$\sum_{n=1}^\infty a_n$
converges if and only if the sequence of partial sums is a Cauchy sequence. This means that for every $\varepsilon > 0,$ there is a positive integer $N$ such that for all $n \geq m \geq N$ we have
:$\left| \sum_{k=m}^n a_k \right| < \varepsilon,$
which is equivalent to
:$\lim_{\substack{n \to \infty \\ m \to \infty}} \sum_{k=m}^{n} a_k = 0.$
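The criterion suggests a simple numerical contrast, sketched below in Python (the index choices are arbitrary): for a convergent series the block sums $a_m + \cdots + a_n$ shrink as $m, n$ grow, while for the divergent harmonic series the block from $N$ to $2N$ stays near $\ln 2$ no matter how large $N$ is.

```python
# Cauchy criterion sketch: convergent series have uniformly small block
# sums a_m + ... + a_n once m and n are large; divergent ones do not.
def block_sum(term, m, n):
    """Return a_m + a_{m+1} + ... + a_n."""
    return sum(term(k) for k in range(m, n + 1))

term = lambda k: 1 / k**2   # convergent (Basel series)
harmonic = lambda k: 1 / k  # divergent harmonic series

N = 1000
print(abs(block_sum(term, N, 2 * N)))      # small, roughly 1/(2N)
print(abs(block_sum(harmonic, N, 2 * N)))  # stays near ln 2, never small
```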

See also

* Normal convergence
* List of mathematical series

External links

* Weisstein, Eric (2005). "Riemann Series Theorem". Retrieved May 16, 2005.