Sequences of Functions

We're going to wrap this semester up with Chapter 6 on Sequences of Functions. Today, I'll briefly mention the motivations discussed in section 6.1 and then move on to sections 6.2 and 6.3 on uniform convergence. This material is also covered in these YouTube videos.

The power of power series

Our text starts off with some pretty incredible applications of power series, culminating in Euler's spectacular result that $$ \sum_{n=1}^{\infty} \frac{1}{n^2} = \frac{\pi^2}{6}. $$

Of course incredible means difficult or impossible to believe.

Leibniz's formula

Let's take a quick look at the derivation of Leibniz's formula: $$\begin{aligned} \sum_{n=0}^{\infty} \frac{(-1)^n}{2n+1} &= 1-\frac{1}{3}+\frac{1}{5}-\frac{1}{7}+\cdots \\ &= \frac{\pi}{4}. \end{aligned}$$

Leibniz's formula (cont)

Leibniz's starting point is the geometric series formula expressed in terms of the variable $x$:

$$ \frac{1}{1-x} = \sum_{n=0}^{\infty} x^n $$

Substituting $x \to -x^2$, we get

$$ \frac{1}{1-(-x^2)} = \sum_{n=0}^{\infty} (-x^2)^n. $$

or

$$ \frac{1}{1+x^2} = \sum_{n=0}^{\infty} (-1)^n x^{2n}. $$
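As a quick numerical sanity check (not part of the derivation), partial sums of this series should approach $1/(1+x^2)$ for any $|x|<1$. A short Python sketch, with a helper name of my own choosing:

```python
def partial_sum(x, terms):
    """Partial sum of sum_{n=0}^{terms-1} (-1)^n x^(2n)."""
    return sum((-1) ** n * x ** (2 * n) for n in range(terms))

x = 0.5
approx = partial_sum(x, 50)
exact = 1 / (1 + x ** 2)
print(abs(approx - exact))  # tiny, since |x| < 1
```

For $|x| \geq 1$ the terms don't even tend to $0$, so the partial sums go nowhere; keep that in mind for the question about $x=1$ below.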

Leibniz's formula (cont)

Integrating both sides, we get $$ \arctan(x) = c + \sum_{n=0}^{\infty} \frac{(-1)^n}{2n+1}x^{2n+1}. $$ You can plug $x=0$ into both sides to find that the constant of integration must be $c=0$.

Leibniz's formula (finale)

Finally, setting $x=1$ and using the fact that $\arctan(1)=\pi/4$, we get $$ \sum_{n=0}^{\infty} \frac{(-1)^n}{2n+1} = \frac{\pi}{4}. $$
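The formula is easy to check numerically: by the alternating series estimate, the partial sums should be within one term of $\pi/4$. A small Python sketch (the function name is just for illustration):

```python
import math

def leibniz_partial(terms):
    """Partial sum of sum_{n=0}^{terms-1} (-1)^n / (2n+1)."""
    return sum((-1) ** n / (2 * n + 1) for n in range(terms))

approx = leibniz_partial(100000)
print(abs(approx - math.pi / 4))  # bounded by the next term, about 5e-6
```

Note how slowly this converges; that slowness is itself a hint that $x=1$ sits right at the edge of where the series works.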

Questions

This is a nice derivation of Leibniz's formula, but there are serious questions that need to be addressed to make it a full-fledged proof.

  • Is term by term integration of an infinite series a legal operation?
  • Is it legal to plug $x=1$ into the integrated formula? This seems particularly suspect in light of the fact that $x=1$ is not in the domain of the series that we integrated: $$ \frac{1}{1+x^2} = \sum_{n=0}^{\infty} (-1)^n x^{2n}. $$

Term by term

It might be worth pointing out that term by term differentiation is not always legal. Consider, for example, $$ f(x) = \sum_{n=1}^{\infty} \frac{\sin(nx)}{n} = \begin{cases} -\frac{\pi+x}{2} & -\pi<x<0 \\ 0 & x=0 \\ \frac{\pi-x}{2} & 0<x<\pi.\end{cases} $$ Assuming this formula is correct, note that the piecewise defined function is differentiable at most points of its domain, yet the derivative of the series $$\sum_{n=1}^{\infty} \cos(nx)$$ is divergent for most $x$. 😕
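We can at least gather numerical evidence for the sawtooth formula at a point away from the jump at $x=0$; a Python sketch (names mine), assuming the stated closed form:

```python
import math

def sawtooth_partial(x, terms):
    """Partial sum of sum_{n=1}^{terms} sin(nx)/n."""
    return sum(math.sin(n * x) / n for n in range(1, terms + 1))

x = 1.0  # a point in (0, pi), away from the jump
approx = sawtooth_partial(x, 200000)
exact = (math.pi - x) / 2
print(abs(approx - exact))  # small for x away from 0
```

By contrast, the term-by-term derivative $\sum \cos(nx)$ has terms that don't tend to $0$ (for most $x$), so its partial sums never settle down at all.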

Evidence for the formula

The previous series can be derived using Fourier series. For now, we'll have to accept some graphical evidence:

Objectives

So, our key objectives from here are

  • To determine to what extent the operations of algebra and calculus are preserved by the limit, and
  • when those operations are not preserved, to find additional assumptions that ensure they are.

Pointwise convergence

Def 6.2.1: For each $n \in \mathbb N$, let $f_n$ be a function defined on a set $A \subset \mathbb R$. The sequence $(f_n)$ of functions converges pointwise on $A$ to a function $f$ if, for all $x \in A$, the sequence of real numbers $f_n(x)$ converges to $f(x)$.

In this case, we write $f_n \to f$, $\lim f_n = f$, or $$\lim_{n\to\infty} f_n(x) = f(x).$$

Example 1

It's probably not too difficult to imagine that $$\lim_{n\to\infty} \left(\sin(x) + x/n\right) = \sin(x).$$
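Here's a quick numerical look at this example (a sketch, not part of the text):

```python
import math

def f_n(x, n):
    """The n-th function in the sequence sin(x) + x/n."""
    return math.sin(x) + x / n

x = 2.0
print(abs(f_n(x, 10**6) - math.sin(x)))  # the gap is exactly |x|/n
```

Since the gap is exactly $|x|/n$, it can be made small for all $x$ in a bounded interval at once, but not for all $x \in \mathbb R$ simultaneously; this foreshadows the pointwise/uniform distinction below.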

Example 2

Perhaps a little trickier is $$\lim_{n\to\infty} x^{1/(2n-1)} = \begin{cases} 1 & x>0 \\ 0 & x=0 \\ -1 & x<0. \end{cases}$$
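A numerical sketch of this example (not from the text). One wrinkle: for negative $x$, Python's `**` with a fractional exponent produces a complex root, so we build the real odd root by hand:

```python
import math

def odd_root(x, n):
    """x^(1/(2n-1)), taking the real odd root for negative x.

    Python's ** would give a complex root for a negative base,
    so we attach the sign via copysign instead.
    """
    r = 2 * n - 1
    return math.copysign(abs(x) ** (1.0 / r), x)

for x in (4.0, 0.0, -4.0):
    print(odd_root(x, 500))  # close to 1, 0, and -1 respectively
```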

Example 2 comment

That second example illustrates that the pointwise limit of continuous functions need not be continuous.

Example 3

Pushing the last example a little further, we get $$\lim_{n\to\infty} x^{1+1/(2n-1)} = x\lim_{n\to\infty} x^{1/(2n-1)} = |x|.$$
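And a matching sketch for this example (again using a hand-built real odd root, since `**` won't take a real odd root of a negative base):

```python
import math

def g_n(x, n):
    """x^(1 + 1/(2n-1)) = x times the real odd root x^(1/(2n-1))."""
    r = 2 * n - 1
    return x * math.copysign(abs(x) ** (1.0 / r), x)

print(g_n(-2.0, 1000))  # close to |-2| = 2
print(g_n(3.0, 1000))   # close to |3| = 3
```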

Example 3 comment

The third example illustrates that the pointwise limit of differentiable functions need not be differentiable, even when the limit is continuous.

Uniform convergence

Def 6.2.3: Let $(f_n)$ be a sequence of functions defined on a set $A \subset \mathbb R$. Then, $(f_n)$ converges uniformly on $A$ to a limit function $f$ defined on $A$ if, for every $\varepsilon > 0$, there exists an $N \in \mathbb N$ such that $$|f_n(x)-f(x)|<\varepsilon \text{ whenever } n\geq N \text{ and } x\in A.$$

Note that the choice of $N$ does not depend on $x$. Put another way, the choice of $N$ is uniform throughout $A$.

Pointwise comparison

To emphasize the distinction between pointwise and uniform convergence, it might help to restate the definition of pointwise convergence using $\varepsilon$-$N$ language, rather than simply referring to the limit.

Def 6.2.1(b): Let $(f_n)$ be a sequence of functions defined on a set $A \subset \mathbb R$. Then, $(f_n)$ converges pointwise to a limit function $f$ defined on $A$ if, for every $\varepsilon > 0$ and $x\in A$, there exists an $N \in \mathbb N$ such that $$|f_n(x)-f(x)|<\varepsilon \text{ whenever } n\geq N.$$

Note that the quantifier on $x$ has moved: here $N$ may depend on both $\varepsilon$ and $x$, while in the uniform definition a single $N$ must work for every $x\in A$ at once.

The epsilon tube

One way to think about this is via the so-called epsilon tube. A function $g$ is uniformly within $\varepsilon$ of the function $f$ if $$|f(x)-g(x)| < \varepsilon \text{ for all } x,$$ which we could write as $$-\varepsilon < g(x) - f(x) < \varepsilon$$ or $$f(x)-\varepsilon < g(x) < f(x) + \varepsilon.$$

Visualization

The previous inequality implies uniform upper and lower bounds on the function $g$, if it is to be uniformly within $\varepsilon$ of $f$.

A non-example

For each $n\in\mathbb N$ define $f_n:[0,1]\to[0,1]$ by $f_n(x) = x^n$. It's not hard to see that the pointwise limit satisfies $$ \lim_{n\to\infty} f_n(x) = \begin{cases} 0 & 0\leq x < 1 \\ 1 & x=1. \end{cases} $$ The convergence cannot be uniform, however, and the limit is not continuous even though each individual function in the sequence is continuous.
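We can see the failure of uniformity numerically: the gap $\sup_{0\leq x<1}|x^n - 0|$ does not shrink as $n$ grows (in fact the true sup is $1$ for every $n$). A sketch that estimates the sup by sampling (names mine):

```python
def sup_gap(n, samples=10000):
    """Estimate sup over [0,1) of |x^n - 0| by sampling a grid."""
    return max((k / samples) ** n for k in range(samples))

for n in (10, 100, 1000):
    print(sup_gap(n))  # stays near 1 instead of shrinking to 0
```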

Non-example graph

I guess the thing to recognize is that, no matter how large $n$ is, the graph has got to swoop up at some point, cross the horizontal line $y=\varepsilon$, and exit the $\varepsilon$ tube at that point.

Applications of uniform convergence

There are two main applications of uniform convergence. The first is that the uniform limit of continuous functions is continuous. The second is similar in spirit but applies to differentiation.

Continuous limit theorem

The key fact on uniform convergence is the following:

Thm 6.2.6: Let $(f_n)$ be a sequence of functions defined on $A \subset \mathbb R$ that converges uniformly on $A$ to a function $f$. If each $f_n$ is continuous at $c \in A$, then $f$ is continuous at $c$.

Proof of CLT

Fix $c\in A$ and let $\varepsilon > 0$. Choose $N\in\mathbb N$ large enough so that $$|f_N(x)-f(x)| < \varepsilon/3$$ for all $x\in A$. Since $f_N$ is continuous at $c$, we can choose $\delta>0$ such that $$|f_N(x)-f_N(c)| < \varepsilon/3$$ whenever $|x-c|<\delta$.

Proof of CLT (cont)

Using the values of $\varepsilon$, $N$, and $\delta$ from the previous slide, for any $x\in A$ with $|x-c|<\delta$ we have $$\begin{aligned} |f(x)-f(c)| &= |f(x)-f_N(x)+f_N(x)-f_N(c)+f_N(c)-f(c)| \\ &\leq |f(x)-f_N(x)|+|f_N(x)-f_N(c)|+|f_N(c)-f(c)| \\ &< \frac{\varepsilon}{3} + \frac{\varepsilon}{3} + \frac{\varepsilon}{3} = \varepsilon. \end{aligned}$$ Thus, $f$ is continuous at $c$.

Differentiable limit theorem

Thm 6.3.1: Let $f_n\to f$ pointwise on the closed interval $[a,b]$ and assume that each $f_n$ is differentiable. If $(f_n')$ converges uniformly on $[a,b]$ to a function $g$, then $f$ is differentiable on $[a,b]$ and $f'=g$.

The proof of this theorem is somewhat more complicated and is in your text. We will not prove it here, but we will apply this theorem to power series and ultimately be able to differentiate them term by term!
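As a preview of that application, here's a numerical sketch (mine, not from the text) of term-by-term differentiation of the geometric series on a closed interval inside $(-1,1)$, where the differentiated series $\sum n x^{n-1}$ does converge to the derivative $1/(1-x)^2$:

```python
def series_deriv_partial(x, terms):
    """Partial sum of sum_{n=1}^{terms} n x^(n-1),
    the term-by-term derivative of the geometric series."""
    return sum(n * x ** (n - 1) for n in range(1, terms + 1))

x = 0.5  # inside [-1/2, 1/2], safely within (-1, 1)
approx = series_deriv_partial(x, 200)
exact = 1 / (1 - x) ** 2  # derivative of 1/(1-x)
print(abs(approx - exact))  # tiny
```

The uniform convergence of the differentiated series on such closed subintervals is exactly the hypothesis that Theorem 6.3.1 needs.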