Infinite Series, Continued

1. Infinite Series are Applied Sequences

All of the algebraic limit laws for sequences have immediate corollaries for series. For example, if \(\sum_{n=1}^\infty a_n\) and \(\sum_{n=1}^\infty b_n\) are convergent series and \(c\) is a constant, then
\[ \sum_{n=1}^\infty (c a_n + b_n) = c \sum_{n=1}^\infty a_n + \sum_{n=1}^\infty b_n. \]
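For instance, taking the standard geometric series sums \(\sum_{n=1}^\infty 2^{-n} = 1\) and \(\sum_{n=1}^\infty 3^{-n} = \frac{1}{2}\) as known, this rule gives
\[ \sum_{n=1}^\infty \left( 4 \cdot 2^{-n} + 3^{-n} \right) = 4 \sum_{n=1}^\infty 2^{-n} + \sum_{n=1}^\infty 3^{-n} = 4 + \frac{1}{2} = \frac{9}{2}. \]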
Comparison of partial sums also shows rather directly that the initial terms in a sum do not influence its convergence, i.e., for any \(N_1\) and \(N_2\),
\[ \sum_{n=N_1}^\infty a_n \text{ converges iff } \sum_{n=N_2}^\infty a_n \text{ converges}. \]
The connection between sequences and series is most easily observed in the case of telescoping series:
Theorem (Telescoping Series)
Suppose that \(\{a_n\}_{n=1}^\infty\) is an infinite sequence. Then the series
\[ \sum_{n=1}^\infty (a_{n+1} - a_n) \]
converges if and only if the limit \(\lim_{n \rightarrow \infty} a_n\) exists. When the limit exists,
\[ \sum_{n=1}^\infty (a_{n+1} - a_n) = - a_1 + \lim_{n \rightarrow \infty} a_n. \]
Proof
Meta (Main Idea)
The key is to show by induction that the \(N\)-th partial sum \(S_N\) of the series equals \(-a_1 + a_{N+1}\). The rest is an immediate application of the properties of limits of sequences.
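As a concrete check of this identity, take \(a_n = \frac{1}{n}\). Then
\[ S_N = \sum_{n=1}^N \left( \frac{1}{n+1} - \frac{1}{n} \right) = -a_1 + a_{N+1} = -1 + \frac{1}{N+1}, \]
so the series converges to \(-a_1 + \lim_{n \rightarrow \infty} a_n = -1 + 0 = -1\).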

A convenient and sometimes important factoid is that when \(a_n\) decreases monotonically to zero, the differences \(a_{n+1} - a_n\) are all nonpositive. This means that \(|a_{n+1} - a_n| = a_{n} - a_{n+1}\), so the series of absolute values is itself a telescoping series. Hence when \(a_{n}\) decreases monotonically to zero, the series \(\sum_{n=1}^\infty (a_{n+1}-a_n)\) not only converges, but converges absolutely.
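Continuing the example \(a_n = \frac{1}{n}\): the series of absolute values telescopes as well, since
\[ \sum_{n=1}^\infty |a_{n+1} - a_n| = \sum_{n=1}^\infty \left( \frac{1}{n} - \frac{1}{n+1} \right) = a_1 - \lim_{n \rightarrow \infty} a_n = 1, \]
so the convergence is indeed absolute.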

2. p-Series

Proposition (p-Series Convergence)
The series
\[ \sum_{n=1}^\infty \frac{1}{n^p}\]
converges for \(p > 1\).
Proof
For each natural number \(n\), let \([n] := 2^{j}\) when \(2^j \leq n < 2^{j+1}\). Then \(n^{-p} \leq [n]^{-p}\) for each \(n\). We will show that
\[ \sum_{n=1}^\infty \frac{1}{[n]^p}\]
converges by showing that its partial sums are bounded (which is sufficient because all terms are nonnegative and therefore the partial sums are monotone). Consider a finite partial sum
\[ \sum_{n=1}^N \frac{1}{[n]^p}. \]
There is some positive integer \(J\) for which \(N < 2^J\) so
\[ \sum_{n=1}^N \frac{1}{[n]^p} \leq \sum_{n=1}^{2^J-1} \frac{1}{[n]^p}. \]
The easiest way to estimate the sum on the right-hand side is to count how often each term appears, since in general there are many duplications. For example: when \(n=1\), \([n]^{-p} = 1\). When \(n=2,3\), \([n]^{-p} = 2^{-p}\). When \(n=4,5,6,7\), \([n]^{-p} = 2^{-2p}\). In general, when \(n = 2^{j},\ldots,2^{j+1}-1\), \([n]^{-p} = 2^{-jp}\). This means that for each \(j \in \{0,\ldots,J-1\}\), the term \(2^{-jp}\) appears in the sum exactly \(2^{j}\) times. So
\[ \sum_{n=1}^{2^J-1} \frac{1}{[n]^p} = \sum_{j=0}^{J-1} \frac{2^j}{2^{jp}} = \sum_{j=0}^{J-1} (2^{-(p-1)})^j. \]
Because \(0 < 2^{-(p-1)} < 1\), the expression we have just written is a partial sum of a convergent geometric series with positive terms, so it is bounded above by the (finite) value of the full series. Thus
\[ \sum_{n=1}^N \frac{1}{n^p} \leq \sum_{j=0}^\infty (2^{-(p-1)})^j < \infty\]
for each \(N\), meaning we have successfully bounded the partial sums uniformly in \(N\).
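As a quick numerical sanity check of this bound (a minimal sketch; the exponent \(p = 2\) and the cutoff \(N\) below are arbitrary illustrative choices):

```python
# Numerical sanity check of the dyadic bound in the p-series proof (here p = 2).
# A partial sum of 1/n^p is compared against the geometric upper bound
# sum_{j >= 0} (2^{-(p-1)})^j = 1 / (1 - 2^{-(p-1)}).

p = 2.0
N = 10**6

partial_sum = sum(1.0 / n**p for n in range(1, N + 1))
geometric_bound = 1.0 / (1.0 - 2.0 ** (-(p - 1)))

print(f"partial sum up to N={N}: {partial_sum:.6f}")      # roughly 1.644933
print(f"geometric upper bound:   {geometric_bound:.6f}")  # 2.000000
assert partial_sum <= geometric_bound
```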
Proposition (p-Series Divergence)
The series
\[ \sum_{n=1}^\infty \frac{1}{n^p}\]
diverges for \(p \leq 1\).
Proof
Since \(\frac{1}{n} \leq \frac{1}{n^p}\) for all \(n \geq 1\) when \(p \leq 1\), the Direct Comparison Test (in contrapositive form) together with the divergence of the harmonic series implies that the series diverges.
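A small numerical illustration (a sketch only; the exponent \(p = 0.5\) and the cutoffs below are arbitrary choices): term by term \(1/n \leq 1/n^p\), so the partial sums of \(\sum n^{-p}\) dominate the harmonic partial sums, which grow slowly but without bound.

```python
import math

# Termwise comparison 1/n <= 1/n^p for p <= 1: the partial sums of the p-series
# dominate the harmonic partial sums, which grow (roughly like log N) without bound.
p = 0.5  # any exponent <= 1 works; 0.5 is an arbitrary illustrative choice
for N in (10**2, 10**4, 10**6):
    harmonic = sum(1.0 / n for n in range(1, N + 1))
    p_sum = sum(1.0 / n**p for n in range(1, N + 1))
    print(f"N={N:>9}  harmonic={harmonic:9.3f}  log(N)={math.log(N):7.3f}  p-series={p_sum:12.3f}")
    assert p_sum >= harmonic
```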
One way to summarize the arguments above is via the following result:
Theorem (Cauchy Condensation Test)
If \(\{a_n\}_{n=1}^\infty\) is a nonnegative decreasing sequence then
\[ \sum_{n=1}^\infty a_n \text{ converges iff } \sum_{j=0}^\infty 2^j a_{2^j} \text{ converges.}\]
Proof
Meta (Main Idea)
The idea is to apply the Direct Comparison Test twice by using the same sort of bounds we just established. Specifically,
\[ a_n \leq a_{[n]}\]
for each \(n\) because \([n] \leq n\) (recall that \([n]\) is the largest power of \(2\) which is less than or equal to \(n\)) and
\[ a_{2[n]} \leq a_n \]
for each \(n\) because \(2[n] \geq n\). Then argue as before that
\[ \sum_{n=1}^\infty a_{[n]} = \sum_{j=0}^\infty 2^j a_{2^j}\]
when the right-hand side is known to exist and that
\[ \sum_{n=1}^\infty a_{[n]} = a_1 + 2 \sum_{n=1}^\infty a_{2[n]}\]
when the right-hand side is known to exist. In both cases the identity can be established by counting the number of times each particular term repeats.
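The counting identity at the heart of this argument is easy to verify numerically (a minimal sketch; the choices \(a_n = 1/n^2\) and \(J = 12\) are arbitrary):

```python
# Check the counting identity  sum_{n=1}^{2^J - 1} a_{[n]} = sum_{j=0}^{J-1} 2^j a_{2^j}
# for a nonnegative decreasing sequence, here a_n = 1/n^2 (an arbitrary choice).

def a(n):
    return 1.0 / n**2

def dyadic_floor(n):
    # [n] = the largest power of 2 which is less than or equal to n
    p = 1
    while 2 * p <= n:
        p *= 2
    return p

J = 12
lhs = sum(a(dyadic_floor(n)) for n in range(1, 2**J))
rhs = sum(2**j * a(2**j) for j in range(J))
print(lhs, rhs)  # the two sums agree
assert abs(lhs - rhs) < 1e-9
```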

3. The Triangle Inequality

We see from the Absolute Convergence Test that whenever the series \(\sum_{n=1}^\infty |a_n|\) converges, the series \(\sum_{n=1}^\infty a_n\) is also convergent. There is an important practical consequence called the Triangle Inequality:
Theorem
Suppose that \(\sum_{n=1}^\infty |a_n|\) converges. Then \(\sum_{n=1}^\infty a_n\) does as well and
\[ \left| \sum_{n=1}^\infty a_n \right| \leq \sum_{n=1}^\infty |a_n|. \]
Proof
This is an extension of the normal (finite) triangle inequality. If
\[ S_N := \sum_{n=1}^N a_n, \]
then the usual triangle inequality tells us that
\[ |S_N| = \left| \sum_{n=1}^N a_n \right| \leq \sum_{n=1}^N |a_n|. \]
Now because \(\sum_{n=1}^\infty |a_n|\) converges, the partial sums of this series are bounded above by the value of the sum (its partial sums form a nondecreasing sequence whose limit equals the sum of the series). Thus for each \(N\), \(|S_N| \leq \sum_{n=1}^\infty |a_n|\), and the right-hand side of this inequality does not depend on \(N\). In other words, every partial sum \(S_N\) of the original series lies in the closed interval from \(-\sum_{n=1}^\infty |a_n|\) to \(\sum_{n=1}^\infty |a_n|\). But since we already know that the \(S_N\)'s converge, their limit must also belong to that same closed interval, which is exactly what it means to say that
\[ \left| \sum_{n=1}^\infty a_n \right| \leq \sum_{n=1}^\infty |a_n|. \]
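For a quick sanity check using only geometric series (whose sums we take as known), let \(a_n = (-\tfrac{1}{2})^n\). Then
\[ \left| \sum_{n=1}^\infty \left( -\tfrac{1}{2} \right)^{n} \right| = \left| -\tfrac{1}{3} \right| = \tfrac{1}{3} \leq \sum_{n=1}^\infty \left( \tfrac{1}{2} \right)^{n} = 1, \]
as the Triangle Inequality predicts.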

4. Summation by Parts

We have already seen the following identity:
Proposition (Finite Summation by Parts)
Suppose that \(B_N := \sum_{n=1}^N b_n\). Then
\[ \sum_{n=1}^N a_n b_n = a_N B_N - \sum_{n=1}^{N-1} (a_{n+1} - a_n) B_n \]
Proof
Let \(B_0 := 0\) for convenience. The easiest way to prove the identity is to write the whole sum out and carefully group terms:
\[\begin{aligned}
a_1 b_1 + \cdots + a_N b_N &= a_1 (B_1 - B_0) + a_2 (B_2 - B_1) + \cdots + a_N(B_N - B_{N-1}) \\
&= a_1 (-B_0 + B_1) + a_2 (-B_1 + B_2) + \cdots + a_N( - B_{N-1} + B_N) \\
&= - a_1 B_0 + B_1 (a_1 - a_2) + \cdots + B_{N-1} (a_{N-1} - a_N) + a_N B_N \\
&= a_N B_N - \sum_{n=1}^{N-1} (a_{n+1} - a_n) B_n.
\end{aligned}\]
Note why it's called summation-by-parts: the sum on the right-hand side takes a difference of \(a_n\)'s (analogous to differentiation) at the cost of a sum of \(b_n\)'s (analogous to integration).
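The identity is also easy to verify numerically for arbitrary finite sequences (a minimal sketch; the random terms and the length \(N = 20\) are arbitrary choices):

```python
import random

# Numerical check of the finite summation-by-parts identity
#   sum_{n=1}^N a_n b_n = a_N B_N - sum_{n=1}^{N-1} (a_{n+1} - a_n) B_n,
# where B_n = b_1 + ... + b_n.  Python lists are 0-indexed, so a[n-1] plays the role of a_n.

random.seed(0)
N = 20
a = [random.uniform(-1, 1) for _ in range(N)]
b = [random.uniform(-1, 1) for _ in range(N)]

B, running = [], 0.0
for x in b:
    running += x
    B.append(running)  # B[n-1] equals B_n

lhs = sum(a[n] * b[n] for n in range(N))
rhs = a[N - 1] * B[N - 1] - sum((a[n + 1] - a[n]) * B[n] for n in range(N - 1))
print(lhs, rhs)
assert abs(lhs - rhs) < 1e-12
```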
Letting \(N \rightarrow \infty\) gives an obvious infinite version:
Corollary
Suppose that \(\{a_n\}_{n=1}^\infty\) and \(\{b_n\}_{n=1}^\infty\) are sequences of real numbers. Let \(B_N := \sum_{n=1}^N b_n\). If
\[ \lim_{N \rightarrow \infty} a_N B_N = 0, \]
then
\[ \sum_{n=1}^\infty a_n b_n \text{ converges iff } \sum_{n=1}^{\infty} (a_{n+1} - a_n) B_n \text{ converges}, \]
and in the case of convergence,
\[ \sum_{n=1}^\infty a_n b_n = - \sum_{n=1}^{\infty} (a_{n+1} - a_n) B_n. \]
Proof
Just use the algebraic limit laws for sequences together with the finite summation by parts identity.
An important application of this formula is the Dirichlet Convergence Test:
Theorem (Dirichlet Convergence Test)
Suppose that \(\{a_n\}_{n=1}^\infty\) is a sequence of positive terms decreasing to zero and that \(\{b_n\}_{n=1}^\infty\) is a sequence of terms such that the partial sums \(\sum_{n=1}^N b_n\) are uniformly bounded, i.e.,
\[ \left| \sum_{n=1}^N b_n \right| \leq B \text{ for all } N. \]
Then
\[ \left| \sum_{n=1}^\infty a_n b_n \right| \leq B a_1 \]
and in particular, the sum on the left-hand side is necessarily convergent.
Proof
The assumptions on \(\{a_n\}_{n=1}^\infty\) and \(\{b_n\}_{n=1}^\infty\) guarantee that \(a_N B_N \rightarrow 0\), so the infinite summation by parts identity applies. Now the series
\[ \sum_{n=1}^\infty (a_{n+1} - a_n) B_n \]
is actually absolutely convergent because \(|(a_{n+1} - a_n) B_n | \leq B|a_{n+1} - a_n|\) and the series \(\sum_{n=1}^\infty (a_{n+1} - a_n)\) is an absolutely convergent telescoping series because the \(a_n\)'s decrease monotonically to zero. For the estimate on the magnitude of the sum, the Triangle Inequality gives
\[ \left| \sum_{n=1}^\infty a_n b_n \right| = \left| \sum_{n=1}^{\infty} (a_{n+1} - a_n) B_n \right| \leq B \sum_{n=1}^\infty |a_{n+1} - a_n| = B \sum_{n=1}^\infty (a_n - a_{n+1}) = B a_1, \]
where the last equality uses the telescoping series formula and \(\lim_{n \rightarrow \infty} a_n = 0\).
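A concrete instance (a sketch; the choice of \(b_n\) below is just one convenient example): take \(a_n = \frac{1}{n}\) and \(b_n = (-1)^{n+1}\), so the partial sums \(B_N\) alternate between \(1\) and \(0\) and we may take \(B = 1\). The test then guarantees that \(\left| \sum_{n=1}^\infty \frac{(-1)^{n+1}}{n} \right| \leq B a_1 = 1\); in fact the sum equals \(\log 2 \approx 0.693\) (a classical evaluation not proved here).

```python
import math

# Dirichlet test example: a_n = 1/n (positive, decreasing to zero),
# b_n = (-1)^{n+1} (partial sums bounded by B = 1).
# The test guarantees |sum a_n b_n| <= B * a_1 = 1; the actual value is log 2.

N = 10**5
partial = sum((-1) ** (n + 1) / n for n in range(1, N + 1))
print(f"partial sum: {partial:.6f}   log 2 = {math.log(2):.6f}")
assert abs(partial) <= 1.0
```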
Exercises
  1. We saw that the summation-by-parts formula is analogous to integration-by-parts when one uses the right “dictionary” of analogies (sums being analogous to integrals and differences being analogous to derivatives). Using this same reasoning, what calculus theorem would be analogous to the formula for the sum of a telescoping series?
  2. It turns out that under appropriate assumptions on the sequence \(\{a_n\}_{n=1}^\infty\),
    \[ \sum_{n=1}^\infty a_n \text{ converges iff } \sum_{n=1}^\infty n (a_{n+1} - a_n) \text{ converges}. \]
    Looking back at the infinite summation-by-parts formula, what condition on the sequence \(\{a_n\}_{n=1}^\infty\) would be needed to prove this?