Section 8.4 Power Series
Notice that we formulated a lot of Section 8.3 for a general normed space. So we can apply it to the normed space \(\mathcal{B}([a,b])\) of bounded real functions on an interval, equipped with the sup norm.
Why on earth would we do that?! I can hear you asking. Consider what we've seen so far:
Any continuous function can be uniformly approximated by a polynomial.
Any \(\mathcal{C}^{k+1}\) function can be approximated by a polynomial of degree \(k\text{,}\) with remainder controlled by the \((k+1)\)st derivative.
Polynomials sure are nice.
And, of course, polynomials are finite sums of monomials. So we might hope to understand the limits of such sums, i.e., series whose terms are monomials.
Definition 8.4.1.
The power series with coefficients \((\alpha_k)_{k\in\mathbb{N}}\) and center \(x_0\) is the series \(\displaystyle \sum_{k=0}^\infty \alpha_k(x-x_0)^k\text{.}\)
Notice that, for any interval \([a,b]\text{,}\) the power series is a series in \(\mathcal{B}([a,b])\text{.}\)
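For instance, taking center \(x_0=0\) and coefficients \(\alpha_k=1\) for every \(k\) gives the geometric series \(\displaystyle\sum_{k=0}^\infty x^k\text{,}\) while taking \(\alpha_k=\frac{1}{k!}\) gives the series \(\displaystyle\sum_{k=0}^\infty \frac{x^k}{k!}\text{.}\)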
Theorem 8.4.2. The Theorem about Power Series.
Given a power series \(\displaystyle \sum_{k=0}^\infty \alpha_k(x-x_0)^k\text{,}\) define \(\alpha=\limsup \sqrt[k]{\lvert \alpha_k\rvert}\text{.}\) The power series
converges uniformly on every closed interval \([a,b]\subset \left(x_0-\frac{1}{\alpha},x_0+\frac{1}{\alpha}\right)\text{,}\)
converges pointwise on \(\left(x_0-\frac{1}{\alpha},x_0+\frac{1}{\alpha}\right)\text{,}\) and
diverges on \(\left(-\infty,x_0-\frac{1}{\alpha}\right)\cup\left(x_0+\frac{1}{\alpha},\infty\right)\text{.}\)
Proof.
This is just Theorem 8.3.8. You work out the details.
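To see the theorem in action, consider the power series \(\displaystyle\sum_{k=0}^\infty \frac{(x-1)^k}{2^k}\text{,}\) with center \(x_0=1\) and coefficients \(\alpha_k=2^{-k}\text{.}\) Here \(\alpha=\limsup \sqrt[k]{2^{-k}}=\frac{1}{2}\text{,}\) so the series converges uniformly on every closed interval \([a,b]\subset(-1,3)\text{,}\) converges pointwise on \((-1,3)\text{,}\) and diverges when \(\lvert x-1\rvert\gt 2\text{.}\) (At the endpoints \(x=-1\) and \(x=3\) the terms do not tend to \(0\text{,}\) so the series diverges there as well, though the theorem is silent about the endpoints.)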
Definition 8.4.3.
The number \(\frac{1}{\alpha}\) in Theorem 8.4.2 is called the radius of convergence of the series \(\displaystyle \sum_{k=0}^\infty \alpha_k(x-x_0)^k\) (with the conventions that the radius is \(\infty\) when \(\alpha=0\) and \(0\) when \(\alpha=\infty\)).
Definition 8.4.4.
The derived series of the power series \(\displaystyle\sum_{k=0}^\infty a_k(x-x_0)^k\) is the series \(\displaystyle \sum_{k=0}^\infty (k+1)a_{k+1}(x-x_0)^k\text{.}\)
Checkpoint 8.4.5.
For the series \(1-x+\frac{1}{2}x^2-\frac{1}{3}x^3+\cdots\text{,}\) compute the derived series.
Why do we use the name "derived series"?
Proposition 8.4.6.
A series and its derived series have the same radius of convergence.
Proof.
Consider the limit \(\displaystyle\lim_{k\to\infty}\sqrt[k]{k+1}\text{.}\)
Lemma 8.4.7.
\(\sqrt[k]{k+1}\to 1\)
Proof.
We'll use the so-called log trick. Technically speaking, we haven't defined logarithms yet; let's fix that: for \(x\gt 0\text{,}\) define \(\displaystyle \log x=\int_1^x \frac{1}{t}\,dt\text{.}\)
It's obvious that this function is continuous (that's a fact we proved in class); and a quick argument shows that it satisfies the two nice properties:
\(\displaystyle \log(x\cdot y)=\log x + \log y\)
\(\displaystyle \log(x^p)=p\log x\)
The second property implies that for any \(\epsilon\gt 0\text{,}\) there is \(K\) so that \(x\gt K\) guarantees \(\log x\lt \epsilon x\text{.}\)
The log trick is to compute a limit by taking its log, exploiting these algebraic properties, then delogging after the limit is obtained. Here, that works out as \(\displaystyle \log\sqrt[k]{k+1}=\frac{\log(k+1)}{k}\to 0\)
by the observation that \(\log x\) grows much more slowly than \(x\text{.}\) Therefore, \(\sqrt[k]{k+1}\) tends to a number \(L\) with \(\log L=0\text{.}\) The only such number is \(L=1\text{.}\)
Now this means that, given any \(\epsilon\gt 0\text{,}\) for large enough \(k\) we have \((1-\epsilon)\sqrt[k]{\lvert a_{k+1}\rvert}\le \sqrt[k]{\lvert (k+1)a_{k+1}\rvert}\le (1+\epsilon)\sqrt[k]{\lvert a_{k+1}\rvert}\text{,}\) so \(\limsup \sqrt[k]{\lvert (k+1)a_{k+1}\rvert}= \limsup \sqrt[k]{\lvert a_{k+1}\rvert}\text{.}\)
Now we need to compute \(\limsup \sqrt[k]{\lvert a_{k+1}\rvert}\text{.}\) First, observe that we're very nearly there, because \(\sqrt[k]{\lvert a_{k+1}\rvert}=\left(\sqrt[k+1]{\lvert a_{k+1}\rvert}\right)^{\frac{k+1}{k}}\text{.}\) Since \(\frac{k+1}{k}\to 1\text{,}\) raising to this exponent does not change the \(\limsup\text{,}\) so \(\limsup \sqrt[k]{\lvert a_{k+1}\rvert}=\limsup \sqrt[k+1]{\lvert a_{k+1}\rvert}\text{,}\) and the latter is just \(\limsup \sqrt[k]{\lvert a_k\rvert}\) with the index shifted by one. So the derived series has the same \(\alpha\) as the original series, and hence the same radius of convergence.
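For instance, the geometric series \(\displaystyle\sum_{k=0}^\infty x^k\) has \(\alpha=1\) and radius of convergence \(1\text{.}\) Its derived series is \(\displaystyle\sum_{k=0}^\infty (k+1)x^k\text{,}\) and \(\limsup\sqrt[k]{k+1}=1\) by Lemma 8.4.7, so the derived series also has radius of convergence \(1\text{.}\)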
Proposition 8.4.6 ends up being quite remarkable. To see this, let's use \(\displaystyle S_N(x)=\sum_{k=0}^N a_k(x-x_0)^k\) to denote the sequence of functions whose limit is \(S(x)=\displaystyle\sum_{k=0}^\infty a_k(x-x_0)^k\text{.}\) Proposition 8.4.6 (together with Theorem 8.4.2) says that on any closed interval inside the interval of convergence, where \(S_N\to S\) uniformly, the sequence of derivatives \(S_N'\) also converges uniformly, say to something we'll call \(D\text{.}\) But then \(S'=D\text{;}\) in particular, \(S\) is differentiable!
It gets even better, though: we could consider the sequence \(S_N''\text{,}\) which again will have the same radius of convergence, hence converge to some \(D_2\) with \(S''=D_2\text{.}\) So \(S\) is twice-differentiable. And so on. Let's record this in the form of a theorem.
Theorem 8.4.8. The Other Theorem About Power Series.
Given a power series \(\displaystyle\sum_{k=0}^\infty a_k(x-x_0)^k\) with radius of convergence \(R\text{,}\) on any closed interval \([a,b]\subseteq (x_0-R,x_0+R)\text{,}\) the series converges to a \(\mathcal{C}^\infty\) limit.
Moreover, derivatives of power series can be computed termwise.
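For example, on \((-1,1)\) the geometric series satisfies \(\displaystyle\sum_{k=0}^\infty x^k=\frac{1}{1-x}\text{,}\) so by Theorem 8.4.8 we may differentiate termwise to get \(\displaystyle\sum_{k=0}^\infty (k+1)x^k=\frac{1}{(1-x)^2}\) on the same interval.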
An easier statement is this:
Proposition 8.4.9.
Within its radius of convergence, a power series can be integrated termwise.
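For example, for \(\lvert x\rvert\lt 1\) we may integrate the geometric series \(\displaystyle\sum_{k=0}^\infty (-1)^k t^k=\frac{1}{1+t}\) termwise from \(0\) to \(x\text{,}\) obtaining \(\displaystyle\sum_{k=0}^\infty \frac{(-1)^k x^{k+1}}{k+1}=\int_0^x\frac{dt}{1+t}\text{,}\) which is \(\log(1+x)\) with \(\log\) defined by the integral as in the proof of Lemma 8.4.7.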