Box-counting for young-uns

My daughter Audrey learned to compute box-counting dimension when she was two. If you've ever watched a two-year-old count, though, you know that sometimes they back up and count the same thing two or even three times. As a result, Audrey always computed an estimate $A_{\varepsilon}(E)$ that was close to the actual box-count $B_{\varepsilon}(E)$ but not exact. She certainly never missed anything, but she could overcount by up to a factor of 3.

  • Write down a string of inequalities relating $A_{\varepsilon}(E)$ to $B_{\varepsilon}(E)$.
  • Use your inequalities to prove that Audrey still always computed the correct value of the box-counting dimension.

Comments

  • To fix notation, let's denote the box-counting dimension of $E$ by $d$.
    By assumption, [imath] B_{\epsilon}(E) \leq A_{\epsilon}(E) \leq 3B_{\epsilon}(E) [/imath]. Taking logarithms and dividing through by [imath]\log(1/\epsilon)[/imath] (which is positive for [imath]\epsilon < 1[/imath], so the inequalities are preserved), we then have

    [dmath] \frac{\log(B_{\epsilon}(E))}{\log(\frac{1}{\epsilon})} \leq \frac{\log(A_{\epsilon}(E))}{\log(\frac{1}{\epsilon})} \leq \frac{\log(3B_{\epsilon}(E))}{\log(\frac{1}{\epsilon})}[/dmath]

    so that

    [dmath] \frac{\log(B_{\epsilon}(E))}{\log(\frac{1}{\epsilon})} \leq \frac{\log(A_{\epsilon}(E))}{\log(\frac{1}{\epsilon})} \leq \frac{\log(3)+\log(B_{\epsilon}(E))}{\log(\frac{1}{\epsilon})}.[/dmath]

    Now, by definition

    [dmath] \lim_{\epsilon \to 0^{+}} \frac{\log(B_{\epsilon}(E))}{\log(\frac{1}{\epsilon})} = d [/dmath]

    and, because [imath] \log(3) [/imath] is just a constant, we also have

    [dmath] \lim_{\epsilon \to 0^{+}} \frac{\log(3)+\log(B_{\epsilon}(E))}{\log(\frac{1}{\epsilon})} = d. [/dmath]

    Therefore, by the squeeze theorem,

    [dmath] \lim_{\epsilon \to 0^{+}} \frac{\log(A_{\epsilon}(E))}{\log(\frac{1}{\epsilon})} = d, [/dmath]

    so Audrey's over-count still yields the correct box-counting dimension.
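
    As a quick sanity check (not part of the proof), here is a minimal numerical sketch using the middle-thirds Cantor set, where we can take $B_{\epsilon}(E) = 2^k$ at $\epsilon = 3^{-k}$: even a worst-case factor-of-3 over-count gives log-ratios that converge to the same limit $\log 2/\log 3$.

    ```python
    import math

    # Middle-thirds Cantor set: B_eps = 2**k boxes at scale eps = 3**(-k),
    # and the true box-counting dimension is log 2 / log 3.
    d_true = math.log(2) / math.log(3)
    for k in (5, 10, 20, 40):
        eps = 3.0 ** (-k)
        B = 2.0 ** k            # exact box count
        A = 3.0 * B             # Audrey's worst-case over-count
        exact = math.log(B) / math.log(1 / eps)
        audrey = math.log(A) / math.log(1 / eps)
        print(f"k={k:2d}  exact={exact:.4f}  audrey={audrey:.4f}  true={d_true:.4f}")
    ```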

  • @theoldernoah Looks pretty good. I did edit it a bit, though.

  • I was just about to say it shouldn't diverge to infinity. By Definition 3.4, the box-counting dimension dim(E) of a nonempty, bounded subset E of $\mathbb{R}^n$ is defined by $\dim(E)=\lim_{\epsilon \to 0^{+}} \frac{\log(B_{\epsilon}(E))}{\log(\frac{1}{\epsilon})}$, provided the limit exists. Because E is compact, we know from Definition 3.3 (fractals text) that $\Phi(s)$ decreases to 0 and $\Phi(0)=m$, where m is the number of maps in the IFS; in the equal-ratio case, $\sum^{m}_{i=1}r^s=mr^s=1.$ Thus there exists a unique positive number s such that $\Phi(s)=1$, and this unique value of s is the similarity dimension of the IFS.

    Now, to try to make a connection to the strong open set condition and Lemma 3.7: the real kicker is whether the pieces overlap, and non-overlap can be guaranteed by checking the contraction ratios. The condition asks for an open subset $U$ of $\mathbb{R}^n$ such that $U \cap E \neq \emptyset$ and $\overset{\cdot}{\cup}_{i=1}^{m}f_i(U) \subset U$ (a disjoint union). In the generalized Cantor set example from the text, the strong open set condition (SOSC) is satisfied when $r_1+r_2 \leq 1$; in contrast, if $r_1+r_2 > 1$, then we do not have a strong open set.
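
    Here is a minimal sketch (my own illustration, not from the text) of how one could solve $\Phi(s) = \sum_{i=1}^{m} r_i^s = 1$ numerically by bisection; the ratios $1/4$ and $1/2$ below are just made-up values for a generalized Cantor set.

    ```python
    import math

    def similarity_dimension(ratios, tol=1e-12):
        """Solve Phi(s) = sum(r**s for r in ratios) = 1 by bisection.

        Phi is continuous and strictly decreasing for ratios in (0, 1),
        with Phi(0) = m (the number of maps) and Phi(s) -> 0 as s -> infinity,
        so the root is unique.
        """
        lo, hi = 0.0, 1.0
        while sum(r ** hi for r in ratios) > 1.0:   # grow hi until Phi(hi) <= 1
            hi *= 2.0
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if sum(r ** mid for r in ratios) > 1.0:
                lo = mid                            # Phi(mid) > 1: root lies to the right
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # Classical middle-thirds Cantor set: r1 = r2 = 1/3 gives log 2 / log 3.
    print(similarity_dimension([1/3, 1/3]), math.log(2) / math.log(3))
    # Hypothetical generalized Cantor set with r1 + r2 <= 1.
    print(similarity_dimension([1/4, 1/2]))
    ```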

  • Just want to make sure I am following everything: does this mean that we can be off by any constant factor that over-approximates the box count and still calculate the correct box-counting dimension, even if we count everything, say, 1000 times, because the logarithm of 1000 is still a constant?

  • @maththemagician Yes! We can be off by any multiplicative factor and still get the correct box-counting dimension. An important consequence is that we can often make convenient simplifications when computing things like $N_{\varepsilon}(E)$ or $P_{\varepsilon}(E)$ or $C_{\varepsilon}(E)$, yet we still get the correct dimension.
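
    To put a rough number on the factor-of-1000 case (a small sketch, reusing the Cantor-set scales $\epsilon = 3^{-k}$ from above): the finite-scale estimate is shifted by exactly $\log(1000)/\log(1/\epsilon)$, and that shift shrinks to 0 as $\epsilon \to 0^{+}$.

    ```python
    import math

    # Shift caused by counting everything 1000 times:
    # log(1000 * B_eps)/log(1/eps) - log(B_eps)/log(1/eps) = log(1000)/log(1/eps).
    for k in (10, 50, 200):                  # eps = 3**(-k)
        log_inv_eps = k * math.log(3)        # log(1/eps), computed directly to avoid underflow
        shift = math.log(1000) / log_inv_eps
        print(f"eps = 3^-{k}: estimate shifted by {shift:.5f}")
    ```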
