
Unit 6.2 Orders of growth at infinity

Often in mathematical modeling, one hears statements such as "This model produces a much smaller growth rate than the other model, as time gets large." This statement sounds vague: how much is "much smaller" and what are "large times"? In this section we will give a precise meaning to statements such as this one.
Why are we spending our time making a science out of vague statements? Answer:
  1. people really think this way, and it clarifies your thinking to make these thoughts precise;
  2. a lot of theorems can be stated with these as hypotheses;
  3. knowing the science of orders of growth helps to fulfill the Number Sense mandate because you can easily fit an unfamiliar function into the right place in the hierarchy of more familiar functions.
We focus on two notions in particular: when one function is much smaller (or much bigger) than another, and when two functions are asymptotically equal.

Comparisons at infinity.

Mostly we will be comparing functions of \(x\) as \(x \to \infty\text{.}\) Let \(f\) and \(g\) be positive functions.

Definition 6.8.

  1. We say the function \(f\) is asymptotic to the function \(g\text{,}\) short for "asymptotically equal to", if
    \begin{equation*} \displaystyle\lim_{x \to \infty} \frac{f(x)}{g(x)} = 1 \, . \end{equation*}
    This is denoted \(f \sim g\text{.}\)
  2. The function \(f\) is said to be much smaller than \(g\text{,}\) or to grow much more slowly than \(g\text{,}\) if
    \begin{equation*} \displaystyle\lim_{x \to \infty} \frac{f(x)}{g(x)} = 0 \, . \end{equation*}
    This is denoted \(f \ll g\text{.}\) Typically this notation is used only when \(g\) is positive.
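Both conditions in Definition 6.8 can be probed numerically by sampling the ratio \(f(x)/g(x)\) at increasingly large \(x\). The sketch below is a heuristic illustration, not a proof (the helper name `ratio_at` and the sample points are choices made here for demonstration):

```python
import math

def ratio_at(f, g, xs=(1e2, 1e4, 1e6)):
    """Sample f(x)/g(x) at a few large values of x."""
    return [f(x) / g(x) for x in xs]

# f ~ g: the ratio tends to 1.  Here f(x) = x + log x and g(x) = x.
print(ratio_at(lambda x: x + math.log(x), lambda x: x))

# f << g: the ratio tends to 0.  Here f(x) = x and g(x) = x^2.
print(ratio_at(lambda x: x, lambda x: x**2))
```

Sampling like this can only suggest the limit; the limits themselves must still be computed as in the examples that follow.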

Example 6.9.

Is it true that \(x^2 + 3x\) is asymptotically equal to \(x^2\text{?}\) Intuitively it should be true because \(3x\) is a lot smaller than \(x^2\) when \(x\) is large (in fact, it is much smaller) so adding it to \(x^2\) should be negligible. We check that
\begin{equation*} \displaystyle\lim_{x \to \infty} \frac{x^2 + 3x}{x^2} = \lim_{x \to \infty} \left( 1 + \frac{3}{x} \right) = 1 \, , \end{equation*}
therefore indeed \(x^2 + 3x \sim x^2\text{.}\)
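The limit computed above can also be seen numerically; the ratio \((x^2+3x)/x^2 = 1 + 3/x\) visibly settles toward 1 (this is an illustration of the limit, not a substitute for computing it):

```python
# Sample the ratio (x^2 + 3x) / x^2 at a few large values of x.
for x in (10.0, 1e3, 1e5):
    print(x, (x**2 + 3*x) / x**2)   # equals 1 + 3/x, approaching 1
```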


Example 6.10.

Let’s compare two powers, say \(x^3\) and \(x^{3.1}\text{.}\) Are they asymptotically equivalent or does one grow much faster? Taking the limit at infinity we see that \(\displaystyle\lim_{x \to \infty} x^3 / x^{3.1} = \lim_{x \to \infty} x^{-0.1} = 0\text{.}\) Therefore, \(x^3 \ll x^{3.1}\text{.}\) This is shown on the left side of Figure 6.11.
Figure 6.11. Comparing growth rates. The green graph is \(y=ax^r\text{;}\) the blue is \(y=bx^s\text{.}\) Be sure to zoom in and out.
Try using the tool in Figure 6.11 to compare \(10 x^{1.3}\) with \(0.1 x^{1.5}\text{.}\) It looks at first like \(10 x^{1.3}\) remains much greater than \(0.1 x^{1.5}\text{.}\) Doing the math gives
\begin{equation*} \displaystyle\lim_{x \to \infty} \frac{10 x^{1.3}}{0.1 x^{1.5}} = \lim_{x \to \infty} 100 x^{-0.2} = 0 \, . \end{equation*}
Therefore, again, \(10 x^{1.3} \ll 0.1 x^{1.5}\text{.}\) Whether or not you care what happens beyond the crossover point \(x = 10^{10}\) (where \(x^{0.2} = 100\)) depends on the application, but the math is pretty clear: if \(a < b\text{,}\) then \(K x^a \ll L x^b\) for any positive constants \(K\) and \(L\text{.}\)
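The crossover can be located numerically as well. Setting \(10 x^{1.3} = 0.1 x^{1.5}\) gives \(x^{0.2} = 100\text{,}\) i.e. \(x = 10^{10}\text{;}\) the sketch below (an illustration, with sample points chosen for demonstration) shows the ratio passing through 1 there:

```python
# The ratio of 10 x^1.3 to 0.1 x^1.5 equals 100 * x**-0.2:
# greater than 1 before x = 10^10, equal to 1 there, less than 1 after.
f = lambda x: 10 * x**1.3
g = lambda x: 0.1 * x**1.5

for x in (1e4, 1e10, 1e16):
    print(x, f(x) / g(x))
```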

Discussion.

This is a general rule: the function \(g(x) + h(x)\) will be asymptotic to \(g(x)\) exactly when \(h(x) \ll g(x)\text{.}\) Why? Because \((g(x) + h(x)) / g(x)\) and \(h(x)/g(x)\) differ by precisely 1. It follows that if \(g(x) + h(x) \sim g(x)\) then
\begin{equation*} 1 = \lim_{x \to \infty} \frac{g(x) + h(x)}{g(x)} = \lim_{x \to \infty} \left( 1 + \frac{h(x)}{g(x)} \right) = 1 + \lim_{x \to \infty} \frac{h(x)}{g(x)} \qquad \mbox{ hence } \qquad \lim_{x \to \infty} \frac{h(x)}{g(x)} = 0 \; , \end{equation*}
or in other words, \(h \ll g\text{.}\) The chain of identities runs backward as well, so \(g + h \sim g\) if and only if \(h \ll g\text{.}\)
Another principle is that if \(f \sim g\) and \(h \sim \ell\) then \(f \cdot h \sim g \cdot \ell\text{.}\) This is literally just the product rule for limits. The same is true for nonzero quotients, for the same reason.
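The product principle can be spot-checked numerically. Below, a heuristic sketch (not a proof; the functions are examples from this section) samples \((f \cdot h)/(g \cdot \ell)\) and watches it approach 1:

```python
import math

f = lambda x: x**2 + 3*x          # f ~ g
g = lambda x: x**2
h = lambda x: x + math.log(x)     # h ~ ell
ell = lambda x: x

# With f ~ g and h ~ ell, the ratio (f*h)/(g*ell) should tend to 1.
for x in (1e2, 1e4, 1e6):
    print(x, (f(x) * h(x)) / (g(x) * ell(x)))
```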

Example 6.12.

We know \(x + 1/x \sim x\) and \(2 - e^{-x} \sim 2\text{,}\) therefore
\begin{equation*} \frac{x + 1/x}{2 - e^{-x}} \sim \frac{x}{2} \, . \end{equation*}
These two facts give important techniques for estimating. They allow you to clear away irrelevant terms: in any sum, every term that is much less than one of the others can be eliminated and the result will be asymptotic to what it was before. You can keep going with products and quotients.
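Example 6.12 can likewise be illustrated numerically: the ratio of \((x + 1/x)/(2 - e^{-x})\) to \(x/2\) settles toward 1 (a sketch with sample points chosen for demonstration, not a proof):

```python
import math

for x in (10.0, 100.0, 1000.0):
    exact = (x + 1/x) / (2 - math.exp(-x))
    approx = x / 2
    print(x, exact / approx)   # tends to 1
```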

Example 6.13.

Find a nice function asymptotically equal to \(\sqrt{x^2 + 1}\text{.}\) The notion of "nice" is subjective; here it means a function you’re comfortable with, can easily estimate, and so forth.
Because \(1 \ll x^2\) we can ignore the 1 and get \(\sqrt{x^2}\text{,}\) which is equal to \(x\) for all positive \(x\text{.}\) Therefore, \(\sqrt{x^2 + 1} \sim x\text{.}\)
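A quick numerical look confirms how fast \(\sqrt{x^2 + 1}/x\) approaches 1 (an illustration only; the asymptotic equality was established above):

```python
import math

for x in (10.0, 1e3, 1e6):
    print(x, math.sqrt(x**2 + 1) / x)   # tends to 1
```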
Beware though, if \(f \sim g\) and \(h \sim \ell\text{,}\) it does not follow that \(f + h \sim g + \ell\) or \(f - h \sim g - \ell\text{.}\) Why? See the following self-check exercise.

Checkpoint 108.

Let \(f(x) = x + 1\) and \(g(x) = x + 1/x\text{.}\) Let \(h(x) = x\text{.}\) Evaluate the truth or falsity of these claims.
  1. \(f \sim g\)
  2. \(h \sim h\)
  3. \(f-h \sim g-h\)
Now explain what this has to do with the proposed “subtraction principle for asymptotic equality.”
Answer 1.
\(\text{true}\)
Answer 2.
\(\text{true}\)
Answer 3.
\(\text{false}\)
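The failure of the subtraction principle in this checkpoint shows up plainly in a numerical sketch (an illustration of the answers above, not a proof): \(f/g\) tends to 1, while \((f-h)/(g-h) = 1/(1/x) = x\) blows up.

```python
f = lambda x: x + 1
g = lambda x: x + 1/x
h = lambda x: x

for x in (1e2, 1e4, 1e6):
    print(x, f(x) / g(x), (f(x) - h(x)) / (g(x) - h(x)))
```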
It should be obvious that the relation \(\sim\) is symmetric: \(f \sim g\) if and only if \(g \sim f\text{.}\) Formally,
\begin{equation*} \lim_{x \to \infty} \frac{f(x)}{g(x)} = 1 \Longleftrightarrow \lim_{x \to \infty} \frac{g(x)}{f(x)} = 1 \end{equation*}
because one is the reciprocal of the other. On the other hand, the relation \(f \ll g\) is asymmetric: it is not possible that both \(f \ll g\) and \(g \ll f\text{.}\)
It is good to have an understanding of the relative sizes of common functions. Here is a summary of some basic facts from this lesson, the practice problems, and the homework problems.