I Lim f(x)/g(x) as x->∞ and relative growth rate of functions

1. Sep 18, 2016

Battlemage!

Everyone "knows" that
$$\lim_{x\rightarrow \infty}\frac{2^x}{x^2} = \infty.$$
We "know" this because $2^x$ grows faster than $x^2$. I use quotes because this is just what we're told in basic calculus classes. But is there a theorem behind it? I've searched Google and looked through various university homework pages, and I couldn't find any standardized way to determine whether one function grows faster than another; no theorem describing this behavior precisely and explaining why.

Obviously this has been covered somewhere, and I just don't know how to find it. Earlier I was trying to find out information about the series $(-1)^n$, and could find very little until I stumbled upon the phrase "Grandi's series." I'm hoping there is something out there describing this issue as well, so I can get some more precise knowledge. Simply looking at the functions on my graphing calculator feels insufficient.

So in summary, I'm hoping someone can direct me to a more rigorous and precise set of definitions regarding limits of f(x)/g(x) with x approaching infinity where the relative growth rates of f(x) and g(x) are important in determining that limit. Specifically, why one type of function grows faster than another, and what the criteria are that determine that.

Any insight into this would be very appreciated!

2. Sep 18, 2016

Math_QED

Search for de l'Hôpital's rule.

3. Sep 18, 2016

Battlemage!

Thanks. I know l'Hôpital's rule as a tool to find limits, but what I'm after is a rigorous categorization of functions based on their relative growth rate. I can't seem to find one. (unless you're telling me the secret is hidden in a more careful examination of l'Hôpital's rule; maybe in the order of the derivatives?)

If I were to use l'Hôpital's rule on this function, in one iteration I'd end up with
$$\lim_{x\rightarrow \infty} \frac{2^x \ln 2}{2x}$$
and if I do it again that $2x$ will become a $2$, but I'm assuming that, iterating, I'll always have a $2^x$ factor left in the numerator. That tells me the limit is obviously infinity.
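As a quick numeric sanity check of that conclusion (not a proof, just tabulating the ratio at a few hypothetical sample points):

```python
# Numeric check (not a proof): the ratio 2^x / x^2 keeps growing,
# consistent with the limit being infinity.
ratios = [2**x / x**2 for x in (10, 20, 40, 80)]

# Each successive ratio is larger than the last.
assert all(a < b for a, b in zip(ratios, ratios[1:]))
print(ratios)
```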

But that doesn't really explain anything I'm after. It's just a technique, at least as far as I have been taught. I'm trying to find something with a little more meat regarding why we know that a particular function will approach infinity faster than another one (and the obvious implications of this regarding limits of quotients of functions where one grows faster than the other).

4. Sep 18, 2016

Battlemage!

Just to move the discussion further, what if my limit was this?

$$\lim_{x\rightarrow ∞} \frac{x!}{x^2}$$

In this case, again, we know it's exploding to infinity. But trying l'Hôpital's rule isn't going to get an undergrad far, based on Wolfram's calculation for the derivative of x!. http://www.wolframalpha.com/input/?i=derivative+of+x!

But simply knowing the principle that if f(x) approaches infinity faster than g(x), then the limit of f(x)/g(x) as x approaches infinity is infinity, is enough to see that this one blows up.
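The blow-up is easy to see numerically as well (again only a sanity check at a few arbitrary sample points, not a proof):

```python
import math

# Numeric check (not a proof): x! / x^2 grows without bound,
# and far faster than 2^x / x^2 did at comparable x.
ratios = [math.factorial(x) / x**2 for x in (5, 10, 15, 20)]

assert all(a < b for a, b in zip(ratios, ratios[1:]))
print(ratios)
```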

Now I'm interested in the characteristics of functions that let us know one grows faster than another.

5. Sep 18, 2016

Krylov

You might want to look into asymptotic notation. For example, in analysis it is customary to use "Big Oh" and "Little Oh" notation. One writes, for instance,
$$f(x) = O(g(x)) \text{ as } x \to \infty$$
whenever, by definition, $\tfrac{|f(x)|}{|g(x)|}$ remains bounded as $x \to \infty$. When, instead, the foregoing quotient tends to zero as $x \to \infty$, we write $f(x) = o(g(x))$.
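These definitions can be illustrated numerically. The two toy functions below are my own examples (not from the thread): $3x^2 + 5x$ is $O(x^2)$ because the quotient stays bounded, while $x$ is $o(x^2)$ because the quotient tends to zero.

```python
# Illustration of the Big-Oh / little-oh definitions with toy examples.
xs = [10.0**k for k in range(1, 7)]

# f(x) = 3x^2 + 5x is O(x^2): |f(x)|/|x^2| = 3 + 5/x stays bounded.
big_o_ratios = [(3 * x**2 + 5 * x) / x**2 for x in xs]

# h(x) = x is o(x^2): h(x)/x^2 = 1/x tends to 0.
little_o_ratios = [x / x**2 for x in xs]

assert all(r <= 3.5 for r in big_o_ratios)   # bounded above
assert little_o_ratios[-1] < 1e-5            # heading to zero
```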

Sometimes you can use power series to estimate the relative growth. For example, it is well-known that
$$e^x = 1 + x + \frac{1}{2}x^2 + \frac{1}{3!}x^3 + \ldots$$
grows faster than any polynomial $p(x) = a_0 + a_1 x + \ldots + a_n x^n$ (with, say, $a_n > 0$). To see why, divide the numerator and denominator of the quotient $e^x/p(x)$ by $x^n$ to get
\begin{align*} \lim_{x \to \infty}\frac{e^x}{p(x)} &= \lim_{x \to \infty}\frac{x^{-n} + x^{-(n-1)} + \ldots + \frac{1}{n!} + \frac{1}{(n+1)!}x + \ldots }{a_0x^{-n} + a_1x^{-(n-1)} + \ldots + a_n }\\ &= \frac{1}{a_n n!}\lim_{x \to \infty}{\left[1 + \frac{1}{n+1}x + \frac{1}{(n+1)(n+2)}x^2 + \ldots\right]} = \infty \end{align*}
Informally speaking, the finitely many powers included in $p$ simply cannot keep up with the infinitude of powers (with positive coefficients) of the exponential.
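A numeric check of this, using an arbitrary polynomial of my own choosing, $p(x) = x^5 + 7x + 1$ (only a sanity check at a few sample points, not a proof):

```python
import math

def p(x):
    """An arbitrary example polynomial with positive leading coefficient."""
    return x**5 + 7 * x + 1

# Numeric check (not a proof): exp(x) / p(x) keeps growing.
ratios = [math.exp(x) / p(x) for x in (10, 20, 30, 40)]
assert all(a < b for a, b in zip(ratios, ratios[1:]))
```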

Last edited: Sep 18, 2016
6. Sep 18, 2016

Battlemage!

I know, that is what I'm saying. In the Wolfram link I posted it shows the gamma function, which I know nothing about. So my point is, l'Hôpital's rule isn't going to tell me much about the nature of one type of function versus another, and why one grows faster than another. Sure, I could use l'Hôpital's rule some of the time to calculate a limit, but why bother when I know, for example, that an exponential function grows faster than a simple polynomial? (or in the last example I posted, it's obvious x! grows way faster than x2) My real question is, why do I know that?

Now the literal answer to that question is, I know it's true because (1) I've been told it's true and (2) I've looked at both functions on my TI-83 and seen it. But this is not in any way a rigorous explanation of the generalities regarding one type of function growing faster than another, and how that relates to limits of f(x)/g(x).

At this point, however, I think I may have bitten off a bit more than I can chew. This might be pretty far above my level.

7. Sep 18, 2016

Krylov

I don't know exactly what your level is, but this topic is usually discussed rigorously in an introductory course on single-variable analysis, taken by first-year mathematics students and by others with an interest.

Your initial question shows that you may fit well in the audience. (If you consider taking such a course, be sure to look for "analysis" and not just "calculus". There is nothing wrong with the latter, but courses that go by that name usually focus on computations instead of derivations and proofs.)

8. Sep 18, 2016

Stephen Tashi

There is no general theorem that would apply to two arbitrary functions. For an arbitrary function, we don't even know if the function can be expressed using a formula.

People learn to deal with growth ratios of "commonly encountered" functions by building experience with specific examples. The situation is analogous to the problem of determining whether a series is convergent: there are theorems that specify tests for convergence, but there is no single "master theorem" that solves all problems.

If you study theorems dealing with "big O" results then you might find common "themes" used in the proof of those theorems.

9. Sep 19, 2016

Battlemage!

The introduction to analysis course at my school is an upper division course that doubles as a first-year graduate course, but you are right, it's probably about time I jump to the next level. I have seen things like what you posted above. I was just thinking the topic might be even more sophisticated than that, since what you posted was really a specific proof that exponential functions always grow faster than polynomials (not that I'm complaining!), rather than a general theorem covering all types of functions.

However, how many types of functions can there possibly be? If I wanted to just take a look at the rest... You just compared exponentials to polynomials. We would need to compare polynomials to logarithmic functions, and of course factorials to exponentials.

After that, what else is there besides combinations of those?

10. Sep 19, 2016

Krylov

It probably depends a bit on the school (and the country) what the exact contents of such a course are, but I would just like to point out that you do not immediately need to go for a fully featured "real analysis" course. For an introduction to analysis you can start at the level of the first chapters of, e.g., "The Way of Analysis" by Strichartz, "Principles of Mathematical Analysis" by Rudin, or a similar book, but it may indeed be better to also attend lectures and tutorials, since the feedback you get on your work is most important.

That is correct, but see Stephen Tashi's comment. You are right that things can get a lot more sophisticated than what I posted and, essentially precisely for that reason, a general theorem does not exist. Even when we restrict ourselves to continuous functions on the real line, I know of no such theorem. (In fact, I would not even know how to formulate it.)

More than you and I would like, probably. For a discrete example, see the Ackermann function. If you take that function, put its domain in bijection with $\{0,1,2,\ldots\}$ and interpolate linearly in between, you obtain a rather odd specimen on the (non-negative part of the) real line, from the point of view of growth properties: it is continuous and grows faster than anything I usually encounter.
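For the curious, here is a sketch of the standard two-argument Ackermann function in Python; even tiny inputs already show how explosively it grows (e.g. $A(3, n) = 2^{n+3} - 3$).

```python
import sys

# The naive recursion nests very deeply even for small inputs.
sys.setrecursionlimit(100_000)

def ackermann(m: int, n: int) -> int:
    """Classic two-argument Ackermann function: total, computable,
    but not primitive recursive."""
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

assert ackermann(2, 3) == 9    # A(2, n) = 2n + 3
assert ackermann(3, 3) == 61   # A(3, n) = 2^(n+3) - 3
```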

11. Sep 20, 2016

Battlemage!

That link led me to Graham's number. My. God.

12. Sep 20, 2016

pwsnafu

A lot of these are done with the Squeeze theorem. For example
$$0 \leq \frac{n!}{n^n} = \frac{1}{n}\cdot\frac{2}{n}\cdots\frac{n-1}{n}\cdot\frac{n}{n} \leq \frac{1}{n}\cdot 1 \cdots 1 \cdot 1 = \frac{1}{n},$$
so $\frac{n!}{n^n} \to 0$, i.e. $n^n$ grows faster than $n!$.
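The squeeze bound above is easy to confirm numerically for a range of $n$ (a check of the stated inequality, not a substitute for the proof):

```python
import math

# Verify the squeeze bound 0 <= n!/n^n <= 1/n for a range of n,
# which forces n!/n^n -> 0 as n -> infinity.
for n in range(1, 30):
    ratio = math.factorial(n) / n**n
    assert 0 <= ratio <= 1 / n
```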