Subsubsection 1.2.2.1 Examples

We now present several examples to illustrate the variety of things that can happen when we apply Newton's method. Throughout, we have a function \(f\) and we define the corresponding Newton's method iteration function

\begin{equation*} N(x) = x - \frac{f(x)}{f'(x)}. \end{equation*}

We then iterate the function \(N\) from some starting point. That is, we start with an initial seed \(x_0\) and we compute the sequence \((x_n)\) by setting

\begin{equation*} x_{n} = N(x_{n-1}), \end{equation*}

for each \(n\text{.}\)
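
In code, computing this sequence amounts to a simple loop. Here's a minimal sketch in Python (the helper iterate is just illustrative; each example below works with its own \(N\)):

def iterate(N, x0, steps):
  # Repeatedly apply the iteration function N to the seed x0,
  # returning the list [x0, x1, ..., x_steps].
  xs = [x0]
  for i in range(steps):
    xs.append(N(xs[-1]))
  return xs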

Example 1.2.3

We start with \(f(x)=x^2-2\text{.}\) Of course, \(f\) has two roots, namely \(\pm\sqrt{2}\text{.}\) Thus, we might think of the application of Newton's method to \(f\) as a tool to find good approximations to \(\sqrt{2}\text{.}\)

First, we compute \(N\text{:}\)

\begin{align*} N(x) &= x-f(x)/f'(x) = x-\frac{x^2-2}{2x} \\ &= x-\left(\frac{x^2}{2x}-\frac{2}{2x}\right) = x-\left(\frac{x}{2}-\frac{1}{x}\right) = \frac{x}{2}+\frac{1}{x}. \end{align*}

Now, suppose that \(x_0=1\text{.}\) Then,

\begin{align*} x_1 &= N(1) = \frac{1}{2}+\frac{1}{1} = \frac{3}{2}\\ x_2 &= N(3/2) = \frac{3/2}{2}+\frac{1}{3/2} = \frac{17}{12}\\ x_3 &= N(17/12) = \frac{17/12}{2}+\frac{1}{17/12} = \frac{577}{408} \end{align*}

Note that

\begin{equation*} (577/408)^2 = 332929/166464 = 2+1/166464 \end{equation*}

so the third iterate is quite close to \(\sqrt{2}\text{.}\)
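
In fact, we can check this arithmetic exactly with Python's fractions module:

from fractions import Fraction

x3 = Fraction(577, 408)
print(x3**2)      # 332929/166464
print(x3**2 - 2)  # 1/166464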

Note that we've obtained a rational approximation to \(\sqrt{2}\text{.}\) At the same time, it would clearly be convenient to perform these computations on a computer. In that context, we might generate a decimal approximation to \(\sqrt{2}\text{.}\) Here's how this process might go in Python:
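
# Newton's method iteration function for f(x) = x^2 - 2
def n(x):
  return x/2 + 1/x

x = 1.0
for i in range(5):
  x = n(x)
  print(x)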

Note how quickly the process has converged to 12 digits of precision.

Of course, \(f\) has two roots. How can we choose \(x_0\) so that the process converges to \(-\sqrt{2}\text{?}\) You'll explore this question computationally in Exercise 1.3.2. It's worth noting, though, that a little geometric understanding can go a long way. Figure 1.2.4, for example, shows us that if we start with a number \(x_0\) between zero and \(\sqrt{2}\text{,}\) then \(x_1\) will be larger than \(\sqrt{2}\text{.}\) The same picture shows us that any number larger than \(\sqrt{2}\) leads to a sequence that converges to \(\sqrt{2}\text{.}\)

Figure 1.2.4 Three steps in Newton's method for \(f(x)=x^2-2\)
Example 1.2.5

We now take a look at \(f(x)=x^2+3\text{.}\) A simple look at the graph of \(f\) shows that it doesn't even hit the \(x\)-axis; thus, \(f\) has no roots. It's not at all clear what to expect from Newton's method.

A simple computation shows that the Newton's method iteration function is

\begin{equation*} N(x) = \frac{x}{2} - \frac{3}{2x}. \end{equation*}

Note that \(N(1)=-1\) and \(N(-1)=1\text{.}\) In the general context of iteration that we'll consider later, we'll say that the points \(1\) and \(-1\) lie on an orbit of period 2 under iteration of the function \(N\text{.}\)

Sequences starting from other points seem more complicated, so we turn to the computer. Suppose that we change the initial seed \(x_0=1\) just a tiny bit and iterate with Python.
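
Starting from \(x_0=0.99\text{,}\) say:

# Newton's method iteration function for f(x) = x^2 + 3
def n(x):
  return x/2 - 3/(2*x)

x = 0.99
for i in range(10):
  x = n(x)
  print(x)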

Well, there appears to be no particular pattern in the numbers. In fact, if we generate 1000 iterates and plot those that lie within 10 units of the origin on a number line, we get Figure 1.2.6. This is our first illustration of chaotic behavior: not just because we see points spread throughout the interval, but also because we appeared to have a periodic orbit when \(x_0=1\text{.}\) Why should the behavior be so different when we change that initial seed to \(x_0=0.99\text{?}\)

Figure 1.2.6 Chaotic behavior from Newton's method
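
A figure along these lines can be generated with matplotlib; here's one way, with the plotting details chosen for illustration:

import matplotlib.pyplot as plt

def n(x):
  return x/2 - 3/(2*x)

x = 0.99
orbit = []
for i in range(1000):
  x = n(x)
  if abs(x) <= 10:
    orbit.append(x)

# Plot the retained iterates as points on a number line
plt.plot(orbit, [0]*len(orbit), 'k.', markersize=2)
plt.yticks([])
plt.show()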

Example 1.2.7

Newton's original example had one real root, Example 1.2.3 had two real roots, and Example 1.2.5 had no real roots. Let's take a look at an example with lots of real roots, namely \(f(x)=\cos(x)\text{.}\)

Generally, the closer your initial seed is to a root, the more likely the sequence starting from that seed is to converge to that root. What happens, though, if we start someplace that's not so close to a root? What if we start close to the maximum at zero? Let's investigate in code.
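
from random import random
from math import cos, sin

# Newton's method iteration function for f(x) = cos(x):
# N(x) = x - cos(x)/(-sin(x)) = x + cos(x)/sin(x)
def n(x):
  return x + cos(x)/sin(x)

for trial in range(10):
  xi = random()/10
  for i in range(8):
    xi = n(xi)
  print(xi, cos(xi))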

OK, let's pick this code apart. The inner for loop looks like so:

xi = random()/10
for i in range(8):
  xi = n(xi)

Thus, xi is set to be a random number between \(0\) and \(0.1\text{;}\) the for loop then iterates the Newton's method function for the cosine from that initial seed 8 times. The outer for loop simply performs this experiment 10 times. After each run of the experiment, we print the resulting xi, along with the value of the cosine at that point, to check that we're indeed close to a root of the cosine. It's striking that we get 9 different results over 10 runs even though the starting points are so close to one another.

Figure 1.2.8 gives some clue as to what's going on. Recall that we can envision a Newton step for a function \(f\) from a point \(x_i\) by drawing the line tangent to the graph of \(f\) at the point \((x_i,f(x_i))\text{.}\) The next iterate \(x_{i+1}\) is then the point of intersection of this line with the \(x\)-axis. Because the slope near the maximum is close to zero, this point of intersection is very sensitive to small changes in \(x_i\text{.}\) In fact, the cosine has infinitely many roots, any one of which could be hit by some initial seed in this tiny interval.

Figure 1.2.8 Initial Newton steps for the cosine