Subsection 1.2.2 Newton's method in the real domain
Let's take a more general look at Newton's method. The problem is to estimate a root of a real, differentiable function \(f\text{,}\) i.e., a value of \(x\) such that \(f(x)=0\text{.}\) Suppose also that we have an initial guess \(x_0\) for the actual value of the root, which we denote \(r\text{.}\) Often, the value of \(x_0\) will be based on a graph. Since \(f\) is differentiable, we can estimate \(r\) with the root of the tangent line approximation to \(f\) at \(x_0\text{;}\) let's call this point \(x_1\text{.}\) When \(x_0\) is close to \(r\text{,}\) we often find that \(x_1\) is even closer. This process is illustrated in Figure 1.2.2, where it indeed appears that \(x_1\) is much closer to \(r\) than \(x_0\text{.}\)
We now find a formula for \(x_1\) in terms of the given information. Recall that \(x_1\) is the root of the tangent line approximation to \(f\) at \(x_0\text{;}\) let's call this tangent line approximation \(\ell\text{.}\) Thus,
\begin{equation*}
\ell(x) = f(x_0) + f'(x_0)(x - x_0)
\end{equation*}
and \(\ell(x_1)=0\text{.}\) Thus, we must simply solve
\begin{equation*}
f(x_0) + f'(x_0)(x - x_0) = 0
\end{equation*}
for \(x\) to get \(x_1 = x_0-f(x_0)/f'(x_0)\text{.}\)
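For instance, suppose \(f(x) = x^2 - 2\) and we guess \(x_0 = 1.5\) (an illustrative choice, not an example prescribed above). Then \(f(x_0) = 0.25\) and \(f'(x_0) = 3\text{,}\) so
\begin{equation*}
x_1 = 1.5 - \frac{0.25}{3} \approx 1.4167\text{,}
\end{equation*}
which is already quite close to the root \(r = \sqrt{2} \approx 1.4142\text{.}\)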
Now, of course, \(x_1\) is again a point that is close to \(r\text{.}\) Thus, we can repeat the process with \(x_1\) as the guess. The new, better estimate will then be
\begin{equation*}
x_2 = x_1 - \frac{f(x_1)}{f'(x_1)}\text{.}
\end{equation*}
The process can then be repeated. Thus, we can define a sequence \((x_n)\) recursively by
\begin{equation*}
x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}\text{.}
\end{equation*}
This process is called iteration, and the sequence it generates often converges to the root \(r\text{.}\)
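As a sketch of how this recursion might be carried out numerically (in Python; the stopping tolerance, iteration cap, and the sample function \(x^2-2\) are illustrative choices, not prescribed by the text):

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) starting from x0."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:       # stop once f(x) is essentially zero
            return x
        x = x - fx / fprime(x)  # one Newton step
    return x                    # caller should check f(x) if convergence matters

# Example: approximate r = sqrt(2), the root of f(x) = x^2 - 2, from x0 = 1.5.
root = newton(lambda x: x**2 - 2, lambda x: 2 * x, 1.5)
print(root)  # about 1.4142135623730951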