Power series

Well, it's been a bit rough this semester and we certainly haven't covered all the material that I might have hoped. Nonetheless, we are arriving at a satisfying point as we apply theory that we've learned to explain why it is that power series behave as nicely as they do.

This is all taken from sections 6.4 and 6.5 of our text. As far as I can tell, though, there are no YouTube videos covering these sections.

Overview

A power series has the form $$ \sum_{n=0}^{\infty} a_n x^n = a_0 + a_1 x + a_2 x^2 + \cdots + a_n x^n + \cdots $$ I guess it kinda looks like an infinitely long polynomial. We certainly treat it that way when we do things like compute the derivative or integral term by term.
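To make the "infinitely long polynomial" picture concrete, here's a quick numerical sketch in Python (my own illustration, not from the text). The coefficients $a_n = 1/n!$ are chosen because that series famously sums to $e^x$:

```python
import math

def partial_sum(coeffs, x, N):
    """Evaluate the partial sum a_0 + a_1 x + ... + a_N x^N."""
    return sum(a * x**n for n, a in enumerate(coeffs[:N + 1]))

# Coefficients a_n = 1/n! give the power series for e^x.
coeffs = [1 / math.factorial(n) for n in range(30)]
print(partial_sum(coeffs, 1.5, 29), math.exp(1.5))  # nearly identical
```

Treating the series like a long polynomial, as above, is exactly the intuition the theorems below will justify.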

Simpler objectives

Our first objective is to prove the most basic properties of power series:

  • They sum to a continuous function,
  • they are differentiable and integrable, and
  • you can compute the derivative and integral termwise.

All that is true on the interior of the domain.

A trickier objective

A trickier issue is continuity on the full domain. For example, $$ \sum_{n=1}^{\infty} (-1)^n \frac{x^n}{n} $$ is continuous on its full domain $(-1,1]$. It's much easier to prove continuity on just $(-1,1)$, though.

Somehow, that right-hand endpoint is super hard to get. It's quite important, though, in certain applications.
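Numerically, the endpoint behavior is easy to observe (this sketch is my own; the closed form $\sum_{n=1}^{\infty} (-1)^n x^n/n = -\ln(1+x)$, valid on $(-1,1]$, is a standard fact):

```python
import math

def log_series(x, N=10_000):
    """Partial sum of sum_{n=1}^{N} (-1)^n x^n / n."""
    return sum((-1)**n * x**n / n for n in range(1, N + 1))

# On (-1, 1] the series sums to -ln(1 + x) -- even at the endpoint x = 1,
# where it becomes the (slowly converging) alternating harmonic series.
print(log_series(0.5), -math.log(1.5))
print(log_series(1.0), -math.log(2.0))
```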

Application

Consider our example from last time. After plugging $-x^2$ in for $x$ into the geometric series formula, we came to $$ \frac{1}{1+x^2} = \sum_{n=0}^{\infty} (-1)^n x^{2n}. $$ After integrating we got $$ \arctan(x) = \sum_{n=0}^{\infty} \frac{(-1)^n}{2n+1}x^{2n+1}. $$ Then, we plugged in $x=1$ to derive Leibniz's formula, even though $x=1$ lies outside the interval $(-1,1)$ on which the geometric series argument is valid. The fact that the sum of the series is continuous at $x=1$ is essential to complete the argument.
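You can watch Leibniz's formula emerge from the partial sums (a small sketch of my own; convergence at $x=1$ is slow since the series is alternating with terms of size $1/(2n+1)$):

```python
import math

def leibniz(N):
    """Partial sum of sum_{n=0}^{N} (-1)^n / (2n + 1), which tends to pi/4."""
    return sum((-1)**n / (2 * n + 1) for n in range(N + 1))

print(4 * leibniz(100_000), math.pi)  # agrees to several decimal places
```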

Series of functions

First, though, we're going to work at a bit higher level of generality. Note that each term $a_nx^n$ of a power series is itself a function. In fact, we might have occasion to add infinitely many functions of many kinds. Thus, it makes sense to work as generally as possible.

Series of functions

Def 6.4.1: For each $n \in \mathbb N$, let $f_n$ and $f$ be functions defined on a set $A \subset \mathbb R$. The infinite series $$ \sum_{n=1}^{\infty} f_n(x) = f_1(x) + f_2(x) + f_3(x) + \cdots $$ converges pointwise on $A$ to $f(x)$ if the sequence $s_k(x)$ of partial sums defined by $$s_k(x) = f_1(x) + f_2(x) + \cdots + f_k(x)$$ converges pointwise to $f(x)$. The series converges uniformly on $A$ to $f$ if the sequence $s_k(x)$ converges uniformly on $A$ to $f(x)$. 

In either case, we write $f = \sum_{n=1}^{\infty} f_n$ or $f(x) = \sum_{n=1}^{\infty} f_n(x)$, always being explicit about the type of convergence involved.
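The difference between the two modes of convergence is worth seeing numerically. Here's a sketch of my own using the geometric series, whose partial sums $s_k(x) = \sum_{n=0}^{k} x^n$ converge pointwise to $1/(1-x)$ on $(-1,1)$; the function `sup_error` (a hypothetical helper, approximating the supremum by sampling a grid) measures the uniform distance on $[-c,c]$:

```python
def sup_error(k, c, grid=1000):
    """Approximate sup over [-c, c] of |s_k(x) - 1/(1-x)|, where s_k is the
    k-th partial sum of the geometric series, by sampling a grid of points."""
    xs = [-c + 2 * c * i / grid for i in range(grid + 1)]
    return max(abs(sum(x**n for n in range(k + 1)) - 1 / (1 - x)) for x in xs)

# On [-1/2, 1/2] the sup error dies quickly: uniform convergence.
print([round(sup_error(k, 0.5), 7) for k in (5, 10, 20)])
# On [-0.99, 0.99] the same k's leave a huge sup error: the convergence
# is still pointwise at every x, but far from uniform at a comparable rate.
print([round(sup_error(k, 0.99), 2) for k in (5, 10, 20)])
```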

Term-by-term Continuity Theorem

Thm 6.4.2: Let $f_n$ be continuous functions defined on a set $A \subset \mathbb R$, and assume $$ \sum_{n=1}^{\infty} f_n(x) $$ converges uniformly on $A$ to a function $f$. Then, $f$ is continuous on $A$.

The proof amounts to applying the continuous limit theorem (Thm 6.2.6) to the partial sums $$s_k = f_1+f_2+\cdots+f_k.$$

Similarly, the proof of the following amounts to applying the differentiable limit theorem (Thm 6.3.3) to the partial sums $$s_k = f_1+f_2+\cdots+f_k.$$

Term-by-term Differentiability Theorem

Thm 6.4.3: Let $f_n$ be differentiable functions defined on an interval $A$, and assume $$\sum_{n=1}^{\infty}f'_n(x)$$ converges uniformly to a limit $g(x)$ on $A$. If there exists a point $x_0 \in A$ where $$\sum_{n=1}^{\infty}f_n(x_0)$$ converges, then the series $\sum_{n=1}^{\infty}f_n(x)$ converges uniformly to a differentiable function $f(x)$ satisfying $f'(x) = g(x)$ on $A$. In other words, $$ f(x) = \sum_{n=1}^{\infty}f_n(x) \text{ and } f'(x) = \sum_{n=1}^{\infty}f'_n(x). $$

Cauchy Criterion for Uniform Convergence of Series

Thm 6.4.4: A series $\sum_{n=1}^{\infty} f_n$ converges uniformly on $A \subset \mathbb R$ if and only if for every $\varepsilon > 0$ there exists an $N \in \mathbb N$ such that $$|f_{m+1}(x) + f_{m+2}(x) + \cdots + f_n(x)| <\varepsilon$$ whenever $n > m \geq N$ and $x\in A$.

This follows from Thm 6.2.5, the corresponding theorem for sequences, since, if $s_k$ denotes the $k^{\text{th}}$ partial sum, then for $n > m$, $$\begin{aligned} s_n(x) - s_m(x) &= f_1(x) + f_2(x) + \cdots + f_m(x) + f_{m+1}(x) + \cdots + f_n(x) \\ &- (f_1(x) + f_2(x) + \cdots + f_m(x)) = f_{m+1}(x) + \cdots + f_n(x). \end{aligned}$$

Weierstrass $M$-test

Cor 6.4.5: For each $n \in \mathbb N$, let $f_n$ be a function defined on a set $A \subset \mathbb R$, and let $M_n > 0$ be a real number satisfying $$ |f_n(x)| \leq M_n \text{ for all } x \in A. $$ If $\sum M_n$ converges, then $\sum f_n(x)$ converges uniformly on $A$.

This follows from the Cauchy Criterion since $$\begin{aligned} |f_{m+1}(x) + f_{m+2}(x) + \cdots + f_n(x)| &\leq |f_{m+1}(x)| + |f_{m+2}(x)| + \cdots + |f_n(x)| \\ &\leq M_{m+1} + M_{m+2} + \cdots + M_n. \end{aligned}$$ Furthermore, we can make that last sum as small as we like by choosing $N$ (and, therefore, $m$) big enough.
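As a quick illustration of the $M$-test (my example, not the text's): the series $\sum \cos(nx)/n^2$ satisfies $|\cos(nx)/n^2| \le 1/n^2$ for every real $x$, and $\sum 1/n^2$ converges, so the series converges uniformly on all of $\mathbb R$. The sketch below (with a hypothetical helper `tail_sup` that samples $x$ over $[0, 2\pi]$) checks the Cauchy-block bound numerically:

```python
import math

def tail_sup(m, n, samples=200):
    """Approximate sup_x |f_{m+1}(x) + ... + f_n(x)| for f_k(x) = cos(k x)/k^2
    by sampling x over one full period [0, 2*pi]."""
    best = 0.0
    for i in range(samples + 1):
        x = 2 * math.pi * i / samples
        block = sum(math.cos(k * x) / k**2 for k in range(m + 1, n + 1))
        best = max(best, abs(block))
    return best

# The M-test bound: the block is at most sum_{k=m+1}^{n} 1/k^2 < 1/m.
print(tail_sup(100, 2000), 1 / 100)
```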

Application to power series

That last batch of slides might have felt like a bit of a slog, but it's about to pay off in a major way!

The first theorem we look at will establish that the domain of convergence is an interval centered at zero, though that interval may be as small as $\{0\}$ or as large as $(-\infty,\infty)$.

Domain of convergence

Thm 6.5.1: If a power series $\sum a_n x^n$ converges at some point $x_0 \in \mathbb R$, then it converges absolutely for any $x$ satisfying $|x| < |x_0|$.

Proof: If $\sum a_n x_0^n$ converges, then the sequence of terms $(a_n x_0^n)$ is bounded, since it converges to $0$. Let $M > 0$ satisfy $|a_nx_0^n| \leq M$ for all $n \in \mathbb N$. If $x\in\mathbb R$ satisfies $|x|<|x_0|$, then $$ |a_nx^n| = |a_nx_0^n|\left|\frac{x}{x_0}\right|^n \leq M \left|\frac{x}{x_0}\right|^n. $$ That last expression is the general term of a convergent geometric series, since $|x/x_0| < 1$. Thus, $\sum a_n x^n$ converges absolutely by the comparison test. $\Box$

Uniform convergence

Thm 6.5.2: If a power series $\sum a_n x^n$ converges absolutely at a point $x_0$, then it converges uniformly on the closed interval $[-c, c]$, where $c = |x_0|$.

The proof is an immediate consequence of the Weierstrass $M$-test taking $$ M_n = |a_n x_0^n| $$

An immediate corollary is the fact that the sum is continuous on $[-c,c]$.

Examining the domains of convergence

At this point, we know that the domain of convergence of a power series $\sum a_n x^n$ looks like one of the following:

$$\{0\}, \, \mathbb R, \, [-R,R], \, [-R,R), \, (-R,R], \text{ or } (-R,R).$$

Which class the series falls into depends on the asymptotic behavior of $a_n$. The first (degenerate) case occurs, for example, when $a_n=n!$ and the second when $a_n = \frac{1}{n!}$.

In the other cases, $R$ is a positive number called the radius of convergence. Given specific coefficients $a_n$, you can typically find the domain of convergence of $$ \sum a_n x^n $$ using the ratio or root test, checking the two endpoints separately.
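One standard formula (not spelled out in this chunk, but consistent with the examples above) is the root test: $R = 1/\limsup |a_n|^{1/n}$. Here's a rough numerical sketch of my own, estimating $R$ from a single large $n$:

```python
import math

def root_test_estimate(a, n):
    """Estimate R as 1 / |a_n|^(1/n) for one (large) value of n."""
    return 1 / abs(a(n)) ** (1 / n)

# a_n = 1/2^n should give R = 2.
print(root_test_estimate(lambda n: 1 / 2**n, 200))
# a_n = 1/n! gives R = infinity: the estimates grow without bound.
print(root_test_estimate(lambda n: 1 / math.factorial(n), 150))
```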

Extending the domains of continuity

Suppose that $R>0$ and that the power series $\sum a_n x^n$ converges on the interval $(-R,R)$. Let's show that the sum is continuous on $(-R,R)$.

To do so, let's assume that $x\in (-R,R)$. We then choose $a$, $b$, and $c$ such that $$ |x| < a < b < c < R.$$ By assumption, $c$ is in the domain of convergence. Since $b < c$, Theorem 6.5.1 implies that the series converges absolutely at $b$. The convergence is therefore uniform on $[-a,a] \subset [-b,b]$ by Theorem 6.5.2. Thus, the sum is continuous on $[-a,a]$ and, in particular, at $x$.

Differentiation

Note that termwise differentiation of a power series transforms $$ \sum_{n=0}^{\infty} a_n x^n \text{ into } \sum_{n=1}^{\infty} n a_n x^{n-1}. $$ Thus, we've produced a new power series with the same radius of convergence, though the behavior at the endpoints may differ. All the results we've developed are therefore applicable: the differentiated series converges uniformly on compact subsets of the domain, and termwise differentiation is a legal operation.

W00t!
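The termwise-differentiation claim is easy to sanity check on a concrete example (mine, not the text's): differentiating the geometric series term by term should produce $\sum_{n=1}^{\infty} n x^{n-1} = \frac{d}{dx}\frac{1}{1-x} = \frac{1}{(1-x)^2}$ inside $(-1,1)$:

```python
def geom_deriv(x, N=200):
    """Termwise derivative of the geometric series: sum_{n=1}^{N} n x^(n-1)."""
    return sum(n * x**(n - 1) for n in range(1, N + 1))

# Inside (-1, 1), the termwise derivative matches 1/(1-x)^2.
x = 0.3
print(geom_deriv(x), 1 / (1 - x)**2)
```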

Anti-differentiation

Termwise anti-differentiation of a power series transforms $$ \sum_{n=0}^{\infty} a_n x^n \text{ into } \sum_{n=0}^{\infty} \frac{a_n}{n+1} x^{n+1}. $$ We've again produced a new power series with the same radius of convergence, and the series is even more likely to converge at the endpoints. The derivative of this series (i.e., the original series) is known to converge uniformly on compact subsets of its domain, so we've produced an anti-derivative of that original series.

W00t, W00t, W00t!!!
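Again, a quick concrete check of my own: anti-differentiating the geometric series termwise should give $\sum_{n=0}^{\infty} \frac{x^{n+1}}{n+1} = -\ln(1-x)$ on $(-1,1)$:

```python
import math

def geom_antideriv(x, N=1000):
    """Termwise antiderivative of the geometric series: sum x^(n+1)/(n+1)."""
    return sum(x**(n + 1) / (n + 1) for n in range(N + 1))

# Matches the closed form -ln(1 - x) inside (-1, 1).
x = 0.5
print(geom_antideriv(x), -math.log(1 - x))
```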

Question on the endpoints

What if the domain of convergence includes at least one of the endpoints, so that it looks like one of $$[-R,R], \, [-R,R), \text{ or } (-R,R]?$$ Note that, in all cases, the series converges on $(-R,R)$ so that the previous slide implies continuity in the interior.

Major question: Is the sum continuous from the interior at such an endpoint? That is, do we have $$ \lim_{x\to R^-} \sum_{n=0}^{\infty} a_n x^n = \sum_{n=0}^{\infty} a_n R^n \text{ and/or } \lim_{x\to -R^+} \sum_{n=0}^{\infty} a_n x^n = \sum_{n=0}^{\infty} a_n (-R)^n? $$

Answer on the endpoints

YES

The rather technical proof of this result, along with several preliminary lemmas, is presented as Theorem 6.5.4 of our text; the result is known as Abel's theorem.

Not to be confused with Abel's impossibility theorem, which states that the roots of a general quintic polynomial cannot be expressed in terms of radicals.

Abel's theorem

Here's the precise statement of Abel's theorem:

Thm 6.5.4: Let $g(x) = \sum a_n x^n$ be a power series that converges at the point $x = R > 0$. Then the series converges uniformly on the interval $[0, R]$. A similar result holds if the series converges at $x = -R$.

Abel summability

Evidently, it's possible for $$ \lim_{x\to1^-} \sum_{n=0}^{\infty} a_n x^n $$ to exist, even if $\sum_{n=0}^{\infty} a_n$ does not converge.

When this happens, the series $\sum_{n=0}^{\infty} a_n$ is said to be Abel summable.
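Here's a numerical sketch of the phenomenon (deliberately using a different series from the one on the assignment): $\sum_{n=0}^{\infty} (-1)^n (n+1)$ diverges, yet for $|x|<1$ the series $\sum_{n=0}^{\infty} (-1)^n (n+1) x^n$ sums to $1/(1+x)^2$, so the Abel limit as $x \to 1^-$ is $1/4$:

```python
def abel_partial(x, N=100_000):
    """Partial sum of sum_{n=0}^{N} (-1)^n (n+1) x^n, for |x| < 1."""
    return sum((-1)**n * (n + 1) * x**n for n in range(N + 1))

# The sums approach 1/(1+x)^2, which tends to 1/4 as x -> 1 from the left,
# even though plugging in x = 1 gives the divergent series 1 - 2 + 3 - 4 + ...
for x in (0.9, 0.99, 0.999):
    print(x, abel_partial(x), 1 / (1 + x)**2)
```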

In your last assignment of the semester, you're asked to show that $$ \sum_{n=0}^{\infty} (-1)^n $$ is Abel summable and to find the value.