Series (mathematics): Difference between revisions

From Citizendium
imported>Aleksander Stos
imported>Jitse Niesen
Revision as of 20:02, 22 August 2007

In mathematics, a series is the cumulative sum of a given sequence of terms. Typically, these terms are real or complex numbers, but much more generality is possible.

For example, given the sequence of the natural numbers 1, 2, 3, ..., the series is
1,
1 + 2,
1 + 2 + 3, ...
The above notation emphasizes the 'cumulative' nature of the series and is justified by the mathematical definition we introduce below, but the more direct notation 1 + 2 + 3 + ... is typically used.

According to the number of terms, a series may be finite or infinite. The former is relatively easy to deal with: a finite series is identified with the sum of all its terms and, apart from elementary algebra, there is no particular theory that applies. It turns out, however, that much care is required when manipulating infinite series. For example, some simple operations borrowed from elementary algebra, such as a change of the order of the terms, often lead to unexpected results. So it is sometimes tacitly understood, especially in analysis, that the term "series" refers to an infinite series. In what follows we adopt this convention and concentrate on the theory of the infinite case.

Motivations and examples

Fig. 1. Graphical representation of a geometric series.

Given a series, an obvious question arises: does it make sense to talk about the sum of all terms? This is not always the case. Consider a simple example in which the general term is constant and equal to 1. That is, the series reads 1+1+1+1+... (without end). Clearly, the sum of all terms cannot be finite. In mathematical language, the series diverges to infinity.[1] There is not much to say about such an object. If we want to build an interesting theory, that is, to have some examples, operations and theorems, we need to deal with convergent series, that is, series for which the sum of all terms is well-defined.

Actually, are there any? Maybe every series diverges like this? Consider the following series of decreasing terms:
:<math>\frac{1}{2}+\frac{1}{4}+\frac{1}{8}+\frac{1}{16}+\cdots</math>
This is a special example of what is called a geometric series. The sum is finite! Instead of a rigorous proof, we present a picture (see Fig. 1) which gives a geometric interpretation. Each term is represented by a rectangle of the corresponding area. At any moment (given any number of rectangular "chips on the table"), the next rectangle covers exactly one half of the remaining space. Thus, the chips will never cover more than the unit square. In other words, the sum increases when more terms are added, but it does not go to infinity: it never exceeds 1. The series converges.[2]
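The halving picture can also be checked numerically. The following Python sketch (an illustration added here, not part of the original article) computes a few partial sums of the series and the gap remaining to 1:

```python
# Partial sums of the geometric series 1/2 + 1/4 + 1/8 + ...
# Each partial sum stays strictly below 1, and the gap to 1
# halves with every added term, exactly as in the picture.

def geometric_partial_sum(n_terms):
    """Sum of the first n_terms terms: 1/2 + 1/4 + ... + 1/2**n_terms."""
    return sum(1 / 2**k for k in range(1, n_terms + 1))

for n in (1, 2, 3, 10, 30):
    s = geometric_partial_sum(n)
    print(n, s, 1 - s)   # the remaining gap is exactly 1/2**n
```

Since every intermediate value is a dyadic fraction, the computation is even exact in floating point.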

One may be tempted to think that any decreasing sequence of terms eventually leads to a convergent series. This, however, is not the case. Consider for example the series[3]
:<math>1+\frac{1}{2}+\frac{1}{3}+\frac{1}{4}+\frac{1}{5}+\cdots</math>
To see that it diverges to infinity, let us form the following groups of terms (we may and do forget the 1 at the beginning):

1/2,
1/3+1/4,
1/5+1/6+1/7+1/8,
1/9+1/10+1/11+1/12+1/13+1/14+1/15+1/16,

etc. We will show that the sum of each line is at least 1/2. Notice that each group ends with a term of the form <math>1/2^k</math>, the smallest one in the group. Note also that the group that ends with <math>1/4</math> has two members, the one that ends with <math>1/8</math> has four members, and so on. Generally, a group that ends with <math>1/2^k</math> has <math>2^{k-1}</math> terms, each of which is greater than or equal to the smallest one at the end. So pick a group and replace each term by the smallest one. Then the new sum is easy to calculate:
:<math>\underbrace{\frac{1}{2^k}+\frac{1}{2^k}+\cdots+\frac{1}{2^k}}_{2^{k-1}\text{ members (the smallest term)}} = \frac{2^{k-1}}{2^k}=\frac{1}{2}.</math>
The original sum is at least as big as this, and our claim follows: the sum of each group is at least one half. And since there are infinitely many such groups, the sum of all terms is not finite.
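The grouping argument is easy to test numerically. This Python sketch (added for illustration, with a hypothetical helper name) sums each group of the harmonic series directly:

```python
# The k-th group of the harmonic series runs from 1/(2**(k-1) + 1)
# up to 1/2**k.  Its sum is at least 1/2, so the partial sums of
# the whole series grow without bound.

def group_sum(k):
    """Sum of 1/n for n from 2**(k-1) + 1 up to 2**k (k = 1, 2, 3, ...)."""
    return sum(1 / n for n in range(2**(k - 1) + 1, 2**k + 1))

for k in range(1, 6):
    print(k, group_sum(k))   # 0.5, then values strictly above 0.5
```

The first group (just 1/2) attains the bound exactly; every later group exceeds it.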

These examples show that we need criteria to decide whether a given series is convergent. Before such tools can be developed we need some mathematical notation.

Formal definition

Given a sequence <math>a_1, a_2, a_3, \ldots</math> of elements that can be added, let
:<math>S_N = a_1 + a_2 + \cdots + a_N = \sum_{n=1}^N a_n.</math>
Then, the series is defined as the sequence <math>(S_N)_{N=1}^\infty</math> and denoted by <math>\sum_{n=1}^\infty a_n.</math>[4] For a single <math>N</math>, the sum <math>S_N</math> is called a partial sum of the series.

If the sequence <math>(S_N)</math> has a finite limit, the series is said to be convergent. In this case we define the sum of the series as
:<math>\sum_{n=1}^\infty a_n = \lim_{N\to\infty} S_N.</math>

Note that the sum (i.e. the numeric value of the above limit) and the series (i.e. the sequence <math>(S_N)</math>) are usually denoted by the same symbol. If the above limit does not exist, or is infinite, the series is said to be divergent.
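In computational terms, the sequence of partial sums is exactly what a running total produces. A small Python sketch (an added illustration, not from the original article) makes the definition concrete:

```python
from itertools import accumulate

# The series built from the sequence 1, 2, 3, 4, 5 is, by definition,
# the sequence of partial sums S_N = a_1 + ... + a_N.
terms = [1, 2, 3, 4, 5]
partial_sums = list(accumulate(terms))
print(partial_sums)   # [1, 3, 6, 10, 15]
```

The last entry is the full finite sum; for an infinite series one would study the limit of this sequence instead.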

Basic convergence criteria

Series with positive terms

The simplest to investigate are the series with positive terms <math>(a_n\ge 0)</math>. In this case, the cumulative sum can only increase. The only question is whether the growth has a limit. This simplifies the analysis and results in a number of basic criteria. Notice that it is not positivity that really matters: if a given series has ''only'' negative terms, a "symmetric" argument can always be applied, so the only thing we really need is the constant sign. However, for the sake of clarity, in this section we assume that the terms are simply positive.

Comparisons

There is a "family" of relative tests, in which the nature of a given series is inferred from what we know about another, possibly simpler, series.

If for two series <math>\sum a_n</math> and <math>\sum b_n</math> we have <math>a_n \le b_n</math> and if the series <math>\sum b_n</math> is known to be convergent, then the other one, <math>\sum a_n</math>, is convergent as well. In other words, a series "smaller" than a convergent one is always convergent. This is not difficult to justify. When we consider series of positive terms, the cumulative sum may only increase. For a convergent series the growth is ''not'' unlimited: there is an upper bound. Accordingly, the growth of the "smaller" series is limited by the same bound.

The above argument works the other way round too. If for the two series we have <math>a_n \ge b_n</math> and the series <math>\sum b_n</math> is known to be divergent, then the other one, <math>\sum a_n</math>, diverges as well. In other words, a series "bigger" than a divergent one is divergent too.

A very useful version of the comparison test may be expressed as follows. If for two series <math>\sum a_n</math> and <math>\sum b_n</math> it can be established that
:<math>\lim_{n\to\infty} \frac{a_n}{b_n} = c</math>
with a finite strictly positive constant <math>c\in(0,\infty)</math>, then both series are of the same nature. In other words, if one series is known to converge, the other converges as well; if one diverges, so does the other.
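As an illustration (added here, not in the original), take <math>a_n = 1/(n^2+n)</math> and <math>b_n = 1/n^2</math>. The ratio is <math>n/(n+1)</math>, which tends to 1, so both series share the same nature; a short Python sketch checks this numerically:

```python
# Limit-comparison sketch: a_n = 1/(n^2 + n) versus b_n = 1/n^2.
# The ratio a_n / b_n = n / (n + 1) tends to 1, a finite positive
# constant, so the two series are "of the same nature".

def a(n):
    return 1 / (n * n + n)

def b(n):
    return 1 / (n * n)

print(a(10) / b(10), a(1000) / b(1000))   # ratios approaching 1
```

Here the first series even telescopes, since <math>1/(n^2+n) = 1/n - 1/(n+1)</math>, so its partial sums are <math>1 - 1/(N+1)</math>.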

Comparison with [[integral]]. This can be done if the ''sequence'' of general terms is decreasing, that is <math>a_n > a_{n+1}</math>. More precisely, suppose that we have <math>f(n)=a_n,\ n=1,2,\ldots,</math> for a certain decreasing function ''f'' defined on <math>(1,\infty)</math>. If the integral
:<math>\int_1^\infty f(x)\,dx</math>
converges, so does the series <math>\sum a_n</math>.

This equivalence is quite important since it allows us to establish convergence of a given series using basic methods of calculus. Prominent examples include the ''Riemann series'', that is, the series with general term <math>a_n=1/n^p</math> for a constant <math>p\in(0,\infty)</math>. If we take the function <math>f(x)=1/x^p</math>, then we get
:<math>\int_1^\infty f(x)\,dx = [\ln x]_1^\infty</math> if <math>p=1</math> and
:<math>\int_1^\infty f(x)\,dx = \left[\frac{x^{-p+1}}{-p+1}\right]_1^\infty</math> otherwise.
The limit at infinity exists if and only if <math>p>1</math>. It follows that the series
:<math>\sum \frac{1}{n^p}</math>
converges if and only if <math>p\in(1,\infty)</math>.
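The dichotomy is easy to observe numerically. In this Python sketch (added for illustration), the partial sums for p = 2 level off near a finite value, while those for p = 1 keep climbing:

```python
# Partial sums of the Riemann series 1/n^p for p = 2 (convergent)
# and p = 1 (divergent: the harmonic series).

def partial_sum(p, big_n):
    """Sum of 1/n**p for n = 1, ..., big_n."""
    return sum(1 / n**p for n in range(1, big_n + 1))

print(partial_sum(2, 10**4))   # close to 1.6449... (in fact pi**2 / 6)
print(partial_sum(1, 10**4))   # about 9.79 and still growing like ln(n)
```

Doubling the number of terms adds roughly ln 2 to the harmonic partial sums, in line with the logarithm appearing in the integral above.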

D'Alembert ratio test

In its simplest form it involves computation of the following limit:
:<math>L=\lim_{n\to\infty}\frac{a_{n+1}}{a_n}.</math>
If ''L'' is strictly greater than 1, then the series diverges. If ''L'' < 1, then the series is convergent. The case ''L'' = 1 gives no answer. Indeed, take for example <math>a_n=1/n</math>. A short computation gives ''L'' = 1, yet we showed above that this series is divergent. On the other hand, if <math>a_n=1/n^2</math>, the series was shown to converge; still, computing ''L'' from the above formula again gives ''L'' = 1.
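The inconclusive case is easy to see numerically. In this Python sketch (added for illustration), the ratio is evaluated for a clearly convergent series and for the two borderline examples:

```python
# Ratio-test sketch: approximate L = lim a_(n+1) / a_n for three series.

def ratio(a, n):
    """The ratio a(n+1) / a(n) at index n."""
    return a(n + 1) / a(n)

geometric = lambda n: 1 / 2**n   # L = 1/2 < 1: convergent
harmonic = lambda n: 1 / n       # L = 1: test says nothing (series diverges)
squares = lambda n: 1 / n**2     # L = 1: test says nothing (series converges)

print(ratio(geometric, 100))     # exactly 0.5
print(ratio(harmonic, 10**6))    # very close to 1
print(ratio(squares, 10**6))     # very close to 1
```

The last two ratios are indistinguishable in the limit even though one series diverges and the other converges.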

Cauchy root test

In its popular form it is based on the computation of the following limit:
:<math>L=\lim_{n\to\infty}\sqrt[n]{a_n}.</math>
Similarly as before, ''L'' > 1 implies divergence and ''L'' < 1 means that the series is convergent. The case ''L'' = 1 does not allow us to decide.
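Again a numerical sketch (added here for illustration): for a geometric series the n-th root settles at 1/2, while for the borderline series it drifts toward 1.

```python
# Root-test sketch: approximate L = lim (a_n)**(1/n).

def nth_root(a, n):
    """The n-th root of a(n)."""
    return a(n) ** (1 / n)

print(nth_root(lambda n: 1 / 2**n, 50))    # about 0.5  (L < 1: convergent)
print(nth_root(lambda n: 1 / n, 10**6))    # about 1    (inconclusive)
print(nth_root(lambda n: 1 / n**2, 10**6)) # about 1    (inconclusive)
```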

Depending on the form of the general term, one may choose the tool better adapted to the actual computation. But in some sense both tests above (d'Alembert's and Cauchy's) have the same scope: if one test fails to give an answer (i.e. L = 1), the other is no better.

Notes and references

  1. The term ''divergent series'' is used as well
  2. A simple calculation shows that the sum is actually equal to one. More details in the [[geometric series]] article.
  3. It is often called the ''harmonic series''
  4. Another popular (equivalent) definition describes the series as a formal (ordered) list of terms combined by the addition operator