Uniform Convergence and Differentiability

1. Warm-up: Uniform Convergence Does Not Preserve Differentiability

Important
There exists a sequence of \(C^1\) functions on \({\mathbb R}\) that converges uniformly on \({\mathbb R}\) to a function which is not differentiable at the origin.
Construction of a (Non-)example Sequence
\[ f(x) := \begin{cases} x - \frac{1}{2} & x > 1, \\ \frac{1}{2} x^2 & -1 \leq x \leq 1, \\ -x - \frac{1}{2} & x < -1. \end{cases} \]
Our usual technology can be used to establish that this function \(f\) is \(C^1\). Now for each natural number \(n\), let \(f_n(x) := n^{-1} f(n x)\). The figure below shows \(f_1(x)\) in red, \(f_4(x)\) in blue, and \(|x|\) in green. Notice how \(f_4(x)\) already appears to closely approximate \(|x|\) uniformly over the entire real line.
Figure. Plot of \(f_1, f_4\) and \(|x|\)

A simple computation gives that
\[ f_n(x) - |x| = -\frac{1}{2n} \quad \text{when } |x| \geq \frac{1}{n}, \]
\[ |f_n(x) - |x|| = |x| - \frac{n x^2}{2} \leq \frac{1}{2n} \quad \text{when } |x| < \frac{1}{n}. \]
So in particular, \(|f_n(x) - |x|| \leq \frac{1}{2n}\) for every \(x \in {\mathbb R}\), meaning that \(f_n\) converges uniformly to \(|x|\) as \(n \rightarrow \infty\). Since each \(f_n\) is continuous, it is no surprise that \(|x|\) is also continuous: we have already seen that uniform convergence preserves continuity. However, each \(f_n\) is also differentiable, while \(|x|\) is clearly not differentiable at \(x=0\). Thus even uniform convergence does not, by itself, preserve differentiability.
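The uniform bound above is easy to check numerically. The following sketch (a grid computation in NumPy standing in for the whole real line, not a proof) evaluates \(\sup_x |f_n(x) - |x||\) for several values of \(n\) and compares it with \(1/(2n)\):

```python
import numpy as np

def f(x):
    # The C^1 function from the construction: a quadratic cap
    # joining two linear pieces.
    return np.where(x > 1, x - 0.5,
                    np.where(x < -1, -x - 0.5, 0.5 * x**2))

def f_n(x, n):
    # Rescaled copies f_n(x) = f(n x)/n, which flatten toward |x|.
    return f(n * x) / n

xs = np.linspace(-5.0, 5.0, 100001)
for n in (1, 4, 16, 64):
    sup_dev = np.max(np.abs(f_n(xs, n) - np.abs(xs)))
    print(n, sup_dev, 1.0 / (2 * n))
```

On the grid the supremum is attained at points with \(|x| \geq 1/n\), where the deviation is exactly \(1/(2n)\), so each printed pair agrees.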

2. Positive Results: Interchanging Limits and Derivatives

Theorem (Interchanging Limits and Derivatives)
Suppose that \(\{f_n\}_{n=1}^\infty\) is a sequence of real-valued differentiable functions on some open interval \(I\). If these functions converge pointwise to some limit function \(f\) as \(n \rightarrow \infty\) and the derivatives \(\{f_n'\}_{n=1}^\infty\) also converge uniformly as \(n \rightarrow \infty\), then \(f\) is differentiable on \(I\) and
\[ f'(x) = \lim_{n \rightarrow \infty} f_n'(x). \]
Informally, we say that the derivative passes through the limit:
\[ \frac{d}{dx} \lim_{n \rightarrow \infty} f_n(x) = \lim_{n \rightarrow \infty} \frac{df_n}{dx}(x). \]
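As a sanity check of the statement, consider a hypothetical example sequence (not drawn from the text): \(f_n(x) = x^2/2 + \sin(nx)/n^2\) on \((-2,2)\), whose derivatives \(f_n'(x) = x + \cos(nx)/n\) converge uniformly to \(g(x) = x\). The sketch below verifies numerically both the uniform convergence of the derivatives and that the pointwise limit \(f(x) = x^2/2\) has derivative \(g\):

```python
import numpy as np

# f_n(x) = x^2/2 + sin(n x)/n^2 converges pointwise to f(x) = x^2/2,
# and f_n'(x) = x + cos(n x)/n converges uniformly to g(x) = x,
# so the theorem predicts f'(x) = x.

xs = np.linspace(-2, 2, 4001)
for n in (10, 100, 1000):
    # sup |f_n' - g| <= 1/n, witnessing uniform convergence of the derivatives
    sup = np.max(np.abs(np.cos(n * xs) / n))
    print(n, sup)

# Difference-quotient check that the limit f(x) = x^2/2 has f'(a) = g(a) = a:
a, h = 0.7, 1e-6
diff_quot = (0.5 * (a + h)**2 - 0.5 * (a - h)**2) / (2 * h)
print(diff_quot)  # approximately 0.7
```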
Proof
Uniform convergence of \(\{f'_n\}_{n=1}^\infty\) combined with the Cauchy criterion means that the derivative of \(f_n(x) - f_m(x)\) can be made as small as desired everywhere on the interval provided that \(n\) and \(m\) are sufficiently large. For example, given any \(\epsilon > 0\), there is some threshold \(N\) such that \(n,m > N\) implies \(|f_n'(x) - f_m'(x)| < \epsilon/3\) for all \(x \in I\). Combined with the Mean Value Theorem, it must be the case that for any \(n,m > N\) and any two points \(a, x \in I\),
\[ |(f_n(x) - f_m(x)) - (f_n(a) - f_m(a))| = |(f'_n - f'_m)(\xi)|\, |x-a| \leq \frac{\epsilon}{3} |x-a| \]
(where \(\xi\) is some point between \(x\) and \(a\)). This inequality is the fundamental observation that makes the proof work.
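This fundamental inequality can be probed numerically on a grid. The sketch below uses a hypothetical pair of functions (not from the text), \(f_k(x) = x^2/2 + \sin(kx)/k^2\), for which \(\sup |f_n' - f_m'| \leq 1/n + 1/m\):

```python
import numpy as np

# Grid check of the Mean Value Theorem bound
#   |(f_n(x) - f_m(x)) - (f_n(a) - f_m(a))| <= sup|f_n' - f_m'| * |x - a|,
# using f_k(x) = x^2/2 + sin(k x)/k^2, so sup|f_n' - f_m'| <= 1/n + 1/m.

def f(x, k):
    return 0.5 * x**2 + np.sin(k * x) / k**2

n, m, a = 50, 80, 0.3
xs = np.linspace(-2.0, 2.0, 2001)
lhs = np.abs((f(xs, n) - f(xs, m)) - (f(a, n) - f(a, m)))
rhs = (1.0 / n + 1.0 / m) * np.abs(xs - a)
print(bool(np.all(lhs <= rhs + 1e-12)))  # True: the bound holds on the grid
```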

Now let's take this inequality and let \(m \rightarrow \infty\) for some fixed values of \(x\) and \(a\). It follows that
\[ |(f_n(x) - f(x)) - (f_n(a) - f(a))| \]
is the limit as \(m \rightarrow \infty\) of
\[ \big|(f_n(x) - f_m(x)) - (f_n(a) - f_m(a))\big|, \]
and consequently \(n > N\) and \(x,a \in I\) implies that
\[ |(f_n(x) - f(x)) - (f_n(a) - f(a))| \leq \frac{\epsilon}{3} |x-a|. \]
For this same value of \(N\), let's also take the inequality
\[ |f'_n(x) - f'_m(x)| < \frac{\epsilon}{3}\]
and take the limit \(m \rightarrow \infty\). Let \(g(x)\) denote the uniform limit of \(f'_n(x)\). Uniform convergence implies pointwise convergence, so \(|f'_n(x) - g(x)| \leq \frac{\epsilon}{3}\) for any \(n > N\) and any \(x \in I\).

Finally, let's fix \(n := N+1\) so that the previous inequalities are all simultaneously true. For any fixed \(a\), there is some \(\delta > 0\) such that \(0 < |x-a| < \delta\) implies
\[ \left|\frac{f_n(x) - f_n(a)}{x-a} - f'_n(a) \right| < \frac{\epsilon}{3}. \]
Putting this all together using the triangle inequality: if \(0 < |x-a| < \delta\), then
\[ \begin{aligned} \left| \frac{f(x) - f(a)}{x-a} - g(a) \right| &\leq \left| \frac{(f(x)-f_n(x)) - (f(a)-f_n(a))}{x-a} \right| + \left| \frac{f_n(x) - f_n(a)}{x-a} - g(a) \right| \\ &\leq \frac{\epsilon}{3} + \left| \frac{f_n(x) - f_n(a)}{x-a} - f'_n(a) \right| + \left| f'_n(a) - g(a) \right| \\ &< \frac{\epsilon}{3} + \frac{\epsilon}{3} + \frac{\epsilon}{3} = \epsilon. \end{aligned} \]
Since \(\epsilon > 0\) was arbitrary, this shows that \(f\) is differentiable at \(a\) with \(f'(a) = g(a) = \lim_{n \rightarrow \infty} f'_n(a)\), as claimed.
There is a minor way that we can improve the theorem above. It turns out that we need only assume pointwise convergence of \(\{f_n\}_{n=1}^\infty\) at
a single point, since the other hypotheses imply convergence (and somewhat more) at all the other points.
Proposition
Suppose that \(\{f_n\}_{n=1}^\infty\) is a sequence of real-valued differentiable functions on some open interval \(I\). If the derivatives \(\{f_n'\}_{n=1}^\infty\) converge uniformly on \(I\) and the functions \(\{f_n\}_{n=1}^\infty\) converge pointwise at any single point of \(I\), then they must converge pointwise at every point of \(I\), and the convergence must be uniform on bounded subintervals of \(I\).
Proof
We can use the Cauchy criterion again. Suppose there is pointwise convergence of the \(f_n\) at \(a \in I\) and that \(x \in I\) satisfies \(|x-a| \leq R\) for some fixed \(R>0\). We know that, given any \(\epsilon > 0\), there must be some \(N\) such that \(n,m > N\) implies \(|f_n(a) - f_m(a)| < \epsilon/2\). Likewise, taking \(N\) somewhat larger if necessary, it may be assumed that \(|f'_n(x) - f'_m(x)| < \epsilon/(2R)\) for all \(x \in I\) whenever \(n,m > N\). Now for any \(x \in I\) satisfying \(|x-a| \leq R\),
\[ |f_n(x) - f_m(x)| \leq |(f_n(x) - f_m(x)) - (f_n(a) - f_m(a))| + |f_n(a) - f_m(a)|. \]
The Mean Value Theorem implies that the first term on the right-hand side is at most \(|(f_n'-f_m')(\xi)| |x-a|\) for some \(\xi\) between \(x\) and \(a\). In particular, it can be no bigger than \(\frac{\epsilon}{2R} R = \frac{\epsilon}{2}\). The second term on the right-hand side is also strictly smaller than \(\epsilon/2\), so
\[ |f_n(x) - f_m(x)| < \epsilon\]
for any \(x \in I\) with \(|x-a| \leq R\) provided only that \(n\) and \(m\) are larger than the threshold \(N\) identified earlier. Therefore \(\{f_n\}\) is a uniform Cauchy sequence on \(I \cap [a-R,a+R]\). In particular, it must converge uniformly (and pointwise) on that interval. Because \(R\) is arbitrary, there is uniform convergence of \(\{f_n\}_{n=1}^\infty\) on any bounded interval in \(I\) and pointwise convergence at every point of \(I\) since every point belongs to some bounded interval on which there is uniform convergence.
(Note that when \(I\) is unbounded, this argument does not give uniform convergence on all of \(I\) at once, because the interval \(I \cap [a-R,a+R]\) on which uniform convergence is established depends on \(R\).)
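A hypothetical sequence (not from the text) illustrating both the proposition and this caveat is \(f_n(x) = x/n\) on \(I = {\mathbb R}\). The sketch below computes \(\sup_{|x| \leq R} |f_n(x)|\) for a fixed \(n\) and growing \(R\):

```python
import numpy as np

# f_n(x) = x/n: the derivatives f_n' = 1/n converge uniformly to 0 and
# f_n(0) = 0 converges, so the proposition gives uniform convergence of
# f_n to 0 on every bounded interval. But the sup on [-R, R] is R/n,
# which depends on R, so there is no uniform convergence on all of R.

def sup_on_interval(n, R):
    xs = np.linspace(-R, R, 2001)
    return np.max(np.abs(xs / n))  # sup of |f_n - 0| on [-R, R]

n = 100
for R in (1, 10, 1000):
    print(R, sup_on_interval(n, R))  # equals R/n: unbounded as R grows
```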