
Section 4.1 A general escape time algorithm

Although we've only looked at the quadratic family, it seems apparent (for polynomials, at least) that the orbit of any sufficiently large initial seed will diverge to \(\infty\text{.}\) This is not at all hard to prove.
Theorem 4.1.1

Suppose that \(f(z) = a_nz^n + a_{n-1}z^{n-1} + \cdots + a_0\) is a polynomial of degree \(n \geq 2\text{.}\) Then there is an \(R \gt 0\) such that the orbit of any \(z_0\) with \(|z_0| \gt R\) diverges to \(\infty\text{.}\)

Suppose that \(z_0\in\mathbb C\) satisfies \(|z_0| \gt 1\text{,}\) \(|z_0| \gt 2/|a_n|\text{,}\) and

\begin{equation*} |z_0| \gt 2\frac{|a_{n-1}|+\cdots+|a_0|}{|a_n|}. \end{equation*}

Then

\begin{align*} |z_1| = |f(z_0)| &= |a_nz_0^n+a_{n-1}z_0^{n-1}+\cdots+a_0|\\ &= |a_n z_0^n|\left|1+\frac{a_{n-1}}{a_nz_0}+\cdots+\frac{a_0}{a_nz_0^n}\right|. \end{align*}

Now, write \(S\) for the complex sum \(\frac{a_{n-1}}{a_nz_0}+\cdots+\frac{a_0}{a_nz_0^n}\text{.}\) Our conditions on \(z_0\) imply that

\begin{align*} |S| & \leq \left|\frac{a_{n-1}}{a_nz_0}\right|+\cdots+\left|\frac{a_0}{a_nz_0^n}\right|\\ & \leq \left|\frac{a_{n-1}}{a_nz_0}\right|+\cdots+\left|\frac{a_0}{a_nz_0}\right| = \frac{|a_{n-1}| + \cdots + |a_{0}|}{|a_nz_0|} \lt \frac{1}{2}, \end{align*}

where the second inequality uses \(|z_0| \gt 1\) and the last uses the displayed condition on \(|z_0|\text{.}\)

Thus, the triangle inequality yields

\begin{equation*} |1|=|1+S+(-S)|\leq|1+S|+|-S|= |1+S|+|S| \end{equation*}

so that \(|1+S|\geq 1-|S|\geq 1/2\text{.}\) Applying this to our original inequality for \(z_1\text{,}\) we obtain

\begin{align*} |z_1| &\gt |a_nz_0^n| \frac{1}{2} \geq |a_n||z_0||z_0|\frac{1}{2}\\ & \gt 2|z_0|\frac{1}{2} = |z_0|. \end{align*}

Now, the last inequality shows that \(|z_1| \geq \lambda |z_0|\text{,}\) where \(\lambda = |a_n||z_0|/2 \gt 1\text{.}\) Furthermore, since \(|z_1| \gt |z_0|\text{,}\) the point \(z_1\) satisfies the same inequalities that \(z_0\) does, so, by induction, \(|z_n| \geq \lambda^n |z_0|\text{.}\) As a result, \(z_n\to\infty\) as \(n\to\infty\text{.}\)
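As a quick numerical sanity check of this growth estimate, here is a sketch in Python. The polynomial \(p(z) = z^2 - 1\) and the particular seed are arbitrary choices for illustration; `escape_radius` computes a bound large enough for the estimates above to apply.

```python
# Coefficients a_n, ..., a_0 of p(z) = z^2 - 1 (an illustrative choice).
coeffs = [1, 0, -1]

def escape_radius(coeffs):
    """A radius satisfying the conditions used in the proof above."""
    an = abs(coeffs[0])
    tail = sum(abs(c) for c in coeffs[1:])
    return max(1.0, 2.0 / an, 2.0 * tail / an)

def p(z, coeffs):
    """Evaluate the polynomial by Horner's rule."""
    result = 0j
    for c in coeffs:
        result = result * z + c
    return result

R = escape_radius(coeffs)  # R = 2 for z^2 - 1
z = R + 0.1                # a seed just outside the escape radius
for _ in range(5):
    w = p(z, coeffs)
    assert abs(w) > abs(z)  # each iterate is strictly larger in absolute value
    z = w
```

Running this confirms that, once a seed crosses the radius, its iterates grow monotonically in absolute value, as the proof predicts.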

It's easy to strengthen this result. Let \(R\) denote the escape radius, i.e., the largest of the three bounds on \(|z_0|\) above. Suppose we iterate from an initial seed \(z_0\) whose absolute value may be less than \(R\text{.}\) If any iterate ever exceeds \(R\) in absolute value, then the orbit will escape. This yields the following corollary.
Corollary 4.1.2

Let \(p\) be a polynomial with escape radius \(R\) and let \(z_0\in\mathbb C\text{.}\) Then exactly one of the following holds:

  1. \(|p^n(z_0)| \leq R\) for all \(n\text{,}\) so that the orbit of \(z_0\) stays bounded, or
  2. \(|p^N(z_0)| \gt R\) for some \(N\text{,}\) in which case the orbit of \(z_0\) diverges to \(\infty\text{.}\)

Theorem 4.1.1 ensures that case 2 happens for all polynomials. It's also easy to see that there are lots of points whose orbits stay bounded. Any fixed point yields an orbit that stays bounded, and there are always fixed points, since the equation \(f(z)=z\) is a polynomial equation with at least one complex solution. For that matter, periodic orbits are also bounded, and the equation \(f^n(z)=z\) yields lots of those when \(n\) is large.
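For instance, one can find the fixed points of a concrete quadratic by solving \(f(z) = z\) directly and check that they yield constant, hence bounded, orbits. A sketch, with \(f(z) = z^2 - 1\) chosen purely as an example:

```python
import cmath

def f(z):
    # An example quadratic: f(z) = z^2 - 1.
    return z * z - 1

def fixed_points():
    # Fixed points of f solve z^2 - z - 1 = 0; apply the quadratic formula.
    disc = cmath.sqrt(1 + 4)  # discriminant of z^2 - z - 1
    return (1 + disc) / 2, (1 - disc) / 2

for z0 in fixed_points():
    # The orbit of a fixed point is constant, hence bounded.
    assert abs(f(z0) - z0) < 1e-12
```

Here the fixed points happen to be real (they are \((1\pm\sqrt 5)/2\)), but the same approach works for any polynomial, with complex roots in general.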

Definition 4.1.3

Let \(p:\mathbb C \to \mathbb C\) be a polynomial.

  1. The filled Julia set of \(p\) is the set of all complex numbers whose orbits remain bounded under iteration of \(p\text{.}\)
  2. The Julia set of \(p\) is the boundary of the filled Julia set.
  3. The complement of the filled Julia set or, equivalently, the set of all points whose orbits diverge to \(\infty\text{,}\) is called the basin of attraction of \(\infty\text{.}\)

We will denote the filled Julia set of \(p\) by \(K_p\) and the Julia set of \(p\) by \(J_p\text{.}\)

We should point out that there are other characterizations of the Julia set that work more generally than the definition we give here, simply because other classes of functions might not have an “escape radius”. When we redefine the Julia set in those other contexts, it will also be our responsibility to prove that it is consistent with this first definition.

There are a few relatively easy observations that we can make about these sets. Suppose that for some initial seed \(z_0\) and integer \(n\text{,}\) we have \(|p^n(z_0)| \gt R\text{.}\) Then, by continuity, the same is true for all points sufficiently close to \(z_0\text{.}\) Thus, the basin of attraction of \(\infty\) is an open set and, as a consequence, the filled Julia set \(K_p\) is a closed set. The boundary of a set is, by definition, its closure minus its interior; since \(K_p\) is closed, the Julia set \(J_p\) is also closed. It is non-empty because \(K_p\) is a non-empty, closed, proper subset of \(\mathbb C\text{.}\) In addition, any neighborhood of a point of \(J_p\) contains both points whose orbits diverge to \(\infty\) and points whose orbits stay bounded. Thus, the dynamics of \(p\) near \(J_p\) display sensitive dependence on initial conditions.
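To see this sensitivity concretely, consider \(p(z)=z^2\text{,}\) whose Julia set is the unit circle: two seeds that straddle the circle, however close, have entirely different fates. A minimal sketch in Python (the particular seeds and the iteration cap are arbitrary choices):

```python
def orbit_escapes(z, radius=2.0, max_iter=100):
    """Iterate p(z) = z^2 until |z| exceeds the escape radius, or give up."""
    for _ in range(max_iter):
        if abs(z) > radius:
            return True
        z = z * z
    return False

# The Julia set of p(z) = z^2 is the unit circle, so two seeds only
# 0.002 apart, one inside and one outside, behave completely differently.
print(orbit_escapes(0.999 + 0j))  # False: the orbit tends to 0
print(orbit_escapes(1.001 + 0j))  # True: the orbit diverges to infinity
```

The closer the two seeds are to the circle, the more iterations it takes for their orbits to separate, but separate they must.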

Finally, we note that Corollary 4.1.2 yields a nice algorithm to generate a filled Julia set.
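In outline: overlay a grid on a region of the plane, iterate \(p\) from each grid point, and mark the point as (approximately) belonging to \(K_p\) if its orbit hasn't exceeded the escape radius after a fixed number of iterations. A sketch in Python, with \(p(z) = z^2 + c\text{,}\) \(c = -1\text{,}\) and the grid size and iteration cap as illustrative parameters:

```python
def escape_time(z, c, radius, max_iter):
    """Number of iterations of p(z) = z^2 + c before |z| exceeds the
    escape radius, or max_iter if the orbit never does."""
    for n in range(max_iter):
        if abs(z) > radius:
            return n
        z = z * z + c
    return max_iter

def filled_julia_grid(c, radius, width=200, height=200, max_iter=50):
    """Approximate the filled Julia set of p(z) = z^2 + c on a grid:
    True marks seeds whose orbits have not escaped after max_iter steps."""
    grid = []
    for j in range(height):
        y = radius - 2 * radius * j / (height - 1)
        row = []
        for i in range(width):
            x = -radius + 2 * radius * i / (width - 1)
            row.append(escape_time(complex(x, y), c, radius, max_iter) == max_iter)
        grid.append(row)
    return grid

# Example: c = -1, whose escape radius is 2. The seed 0 lies on a
# 2-cycle (0 -> -1 -> 0), so it never escapes.
K = filled_julia_grid(-1, radius=2.0)
```

In practice one records the escape time itself, not just the Boolean, and colors each pixel by how quickly its orbit escapes; points of \(K_p\) get a single fixed color.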

In spite of the simplicity of this section, many of the ideas here recur in more complicated situations, and we will see several variations of the escape time algorithm later. The remainder of our examples rely on applications of this algorithm.