Lim f(x)/g(x) as x->∞ and relative growth rate of functions


Discussion Overview

The discussion centers on limits of ratios of functions as x approaches infinity, specifically the relative growth rates of exponential, polynomial, and factorial functions. Participants seek a rigorous framework or theorem explaining why certain functions grow faster than others and how this determines the limits of their ratios.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants express a desire for a formal theorem that categorizes functions based on their growth rates, noting that common calculus teachings do not provide sufficient rigor.
  • One participant mentions l'Hôpital's rule as a technique for finding limits but questions its ability to explain the underlying reasons for growth rate differences.
  • Another participant introduces asymptotic notation (Big Oh and Little Oh) as a way to describe relative growth rates, suggesting it could provide clarity on the topic.
  • A participant raises the example of the limit of the factorial function divided by a polynomial, questioning how to analyze such limits without a clear method for factorial growth.
  • Some participants discuss the gamma function as a differentiable alternative to factorials, indicating that it may offer more insight into growth comparisons.
  • There is a mention of the exponential function growing faster than polynomials, with a request for a more rigorous explanation of why this is the case.
  • One participant reflects on their own understanding and expresses concern that the topic may be beyond their current level, while another suggests that this is typically covered in introductory analysis courses.
  • It is noted that there is no general theorem applicable to all functions regarding their growth rates, emphasizing the complexity of the topic.

Areas of Agreement / Disagreement

Participants generally agree that there is a lack of formalized methods to rigorously determine growth rates of arbitrary functions, and multiple competing views on how to approach the topic remain unresolved.

Contextual Notes

The discussion highlights limitations in existing educational resources regarding the rigorous analysis of function growth rates and the applicability of techniques like l'Hôpital's rule to non-differentiable functions.

Who May Find This Useful

This discussion may be of interest to students and enthusiasts in mathematics, particularly those studying calculus and analysis, as well as individuals seeking a deeper understanding of function behavior in limits.

Battlemage!
Everyone "knows" that
$$\lim_{x\rightarrow \infty}\frac{2^x}{x^2} = \infty.$$
We "know" this because ##2^x## grows faster than ##x^2##. I use quotes because this is just what we're told in basic calculus classes. But is there a theorem for this? I've searched Google and looked through various university homework pages, and I couldn't find any standardized way to determine whether one function grows faster than another; no theorem describing this behavior precisely, or why it holds.
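A quick numerical sketch of what I mean (an illustration only, not the rigorous statement I'm after):

```python
# Illustration only: tabulate 2^x / x^2 for growing x.
# The ratio increases without bound, consistent with the limit being infinity.
for x in [10, 20, 40, 80]:
    print(x, 2**x / x**2)
```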

Obviously this has been covered somewhere, and I just don't know how to find it. Earlier I was trying to find information about the series ##(-1)^n##, and could find very little until I stumbled upon the phrase "Grandi's series." I'm hoping there is something out there describing this issue as well, so I can get some more precise knowledge. Simply looking at the functions on my graphing calculator feels insufficient.

So in summary, I'm hoping someone can direct me to a more rigorous and precise set of definitions regarding limits of f(x)/g(x) with x approaching infinity where the relative growth rates of f(x) and g(x) are important in determining that limit. Specifically, why one type of function grows faster than another, and what the criteria are that determine that.

Any insight into this would be very appreciated!
 
Battlemage! said:
Everyone "knows" that ##\lim_{x\rightarrow \infty}\frac{2^x}{x^2} = \infty## because ##2^x## grows faster than ##x^2##. [...] I couldn't find any standardized way to determine if one function grows faster than another; no theorem describing this behavior precisely and why.

Search for de l'Hôpital's rule.
 
Math_QED said:
Search for de l'Hôpital's rule.
Thanks. I know l'Hôpital's rule as a tool to find limits, but what I'm after is a rigorous categorization of functions based on their relative growth rate. I can't seem to find one. (unless you're telling me the secret is hidden in a more careful examination of l'Hôpital's rule; maybe in the order of the derivatives?)

If I were to use l'Hôpital's rule on this function, one iteration gives
$$\lim_{x\rightarrow \infty} \frac{2^x \ln 2}{2x}$$
and if I do it again the ##2x## becomes a ##2##, but the ##2^x## factor will survive every iteration. That tells me the limit is obviously infinity.
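Written out in full (each step another application of the rule to an ##\infty/\infty## form), the iteration does settle this particular limit:
$$\lim_{x\rightarrow \infty}\frac{2^x}{x^2} = \lim_{x\rightarrow \infty}\frac{2^x\ln 2}{2x} = \lim_{x\rightarrow \infty}\frac{2^x(\ln 2)^2}{2} = \infty.$$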

But that doesn't really explain anything I'm after. It's just a technique, at least as far as I have been taught. I'm trying to find something with a little more meat regarding why we know that a particular function will approach infinity faster than another one (and the obvious implications of this regarding limits of quotients of functions where one grows faster than the other).
 
Just to move the discussion further, what if my limit was this?

$$\lim_{x\rightarrow \infty} \frac{x!}{x^2}$$

In this case, again, we know it's exploding to infinity. But trying l'Hôpital's rule isn't going to get an undergrad far, judging by Wolfram's calculation for the derivative of ##x!##: http://www.wolframalpha.com/input/?i=derivative+of+x!

But simply knowing the general fact that if ##f(x)## approaches infinity faster than ##g(x)##, then ##\lim_{x\rightarrow\infty} f(x)/g(x) = \infty##, is enough to tell that this one blows up.

Now I'm interested in the characteristics of functions that let us know one grows faster than another.
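(Though even without derivatives, a crude bound settles this specific limit for integer arguments ##n \ge 3##: keeping only the three largest factors of ##n!##,
$$\frac{n!}{n^2} \ge \frac{n(n-1)(n-2)}{n^2} = \frac{(n-1)(n-2)}{n} \rightarrow \infty \quad \text{as } n \rightarrow \infty.)$$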
 
You might want to look into asymptotic notation. For example, in analysis it is customary to use "Big Oh" and "Little Oh" notation. One writes, for instance,
$$
f(x) = O(g(x)) \text{ as } x \to \infty
$$
whenever, by definition, ##\tfrac{|f(x)|}{|g(x)|}## remains bounded as ##x \to \infty##. When, instead, the foregoing quotient tends to zero as ##x \to \infty##, we write ##f(x) = o(g(x))##.
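A few standard instances of this notation, stated here without proof:
$$x^2 = O(x^3),\qquad \log x = o(x^{\varepsilon}) \text{ for every } \varepsilon > 0,\qquad x^n = o(2^x) \text{ for every fixed } n,$$
all as ##x \to \infty##.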

Sometimes you can use power series to estimate the relative growth. For example, it is well-known that
$$
e^x = 1 + x + \frac{1}{2}x^2 + \frac{1}{3!}x^3 + \ldots
$$
grows faster than any polynomial ##p(x) = a_0 + a_1 x + \ldots + a_n x^n## (with, say, ##a_n > 0##). To see why, divide numerator and denominator by ##x^n## to get
$$
\begin{align*}
\lim_{x \to \infty}\frac{e^x}{p(x)} &= \lim_{x \to \infty}\frac{x^{-n} + x^{-(n-1)} + \ldots + \frac{1}{n!} + \frac{1}{(n+1)!}x + \ldots }{a_0x^{-n} + a_1x^{-(n-1)} + \ldots + a_n }\\
&= \frac{1}{a_n n!}\lim_{x \to \infty}{\left[1 + \frac{x}{n+1} + \frac{x^2}{(n+1)(n+2)} + \ldots\right]} = \infty
\end{align*}
$$
Informally speaking, the finitely many powers included in ##p## simply cannot keep up with the infinitude of powers (with positive coefficients) of the exponential.
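Alternatively, a single term of the series already gives a clean bound: all terms of the expansion are positive for ##x > 0##, so ##e^x \ge x^{n+1}/(n+1)!##, hence
$$\frac{e^x}{x^n} \ge \frac{x}{(n+1)!} \rightarrow \infty \quad (x \rightarrow \infty),$$
and since ##p(x)/x^n \rightarrow a_n##, the quotient ##e^x/p(x)## tends to infinity as well.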
 
Math_QED said:
Are you aware of the gamma function? Because otherwise, when you deal with a function like ##f: \mathbb{N} \rightarrow \mathbb{R}: x \mapsto x!## you can't use l'Hôpital's rule as this function is not differentiable. In terms of the gamma function, we could write ##x! = \Gamma (x+1)## and the gamma function is differentiable.
I know, that is what I'm saying. In the Wolfram link I posted it shows the gamma function, which I know nothing about. So my point is, l'Hôpital's rule isn't going to tell me much about the nature of one type of function versus another, and why one grows faster than another. Sure, I could use l'Hôpital's rule some of the time to calculate a limit, but why bother when I know, for example, that an exponential function grows faster than a simple polynomial? (or in the last example I posted, it's obvious x! grows way faster than x2) My real question is, why do I know that?

Now the literal answer to that question is, I know it's true because (1) I've been told it's true and (2) I've looked at both functions on my TI-83 and seen it. But this is not in any way a rigorous explanation of the generalities regarding one type of function growing faster than another, and how that relates to limits of f(x)/g(x).

At this point, however, I think I may have bitten off a bit more than I can chew. This might be pretty far above my level.
 
Battlemage! said:
At this point, however, I think I may have bitten off a bit more than I can chew. This might be pretty far above my level.
I don't know exactly what your level is, but this topic is usually discussed rigorously in the introductory course on analysis of a single variable taken by first-year students in mathematics and others that have an interest.

Your initial question shows that you may fit well in the audience. (If you consider taking such a course, be sure to look for "analysis" and not just "calculus". There is nothing wrong with the latter, but courses that go by that name usually focus on computations instead of derivations and proofs.)
 
Battlemage! said:
I couldn't find any standardized way to determine if one function grows faster than another; no theorem describing this behavior precisely and why.

There is no general theorem that would apply to two arbitrary functions. For an arbitrary function, we don't even know if the function can be expressed using a formula.

People learn to deal with growth rates of "commonly encountered" functions by building experience with specific examples. The situation is analogous to the problem of determining whether a series is convergent: there are theorems that specify tests for convergence, but there is no single "master theorem" that solves all problems.
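To make the analogy concrete, the ratio test is one such tool. Applied to ##\sum_n n!/n^n## it gives
$$\lim_{n\to\infty}\frac{(n+1)!}{(n+1)^{n+1}}\cdot\frac{n^n}{n!} = \lim_{n\to\infty}\left(\frac{n}{n+1}\right)^n = \frac{1}{e} < 1,$$
so that series converges; yet the ratio test is inconclusive for many other series, and no single test covers them all.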

If you study theorems dealing with "big O" results then you might find common "themes" used in the proof of those theorems.
 
Krylov said:
I don't know exactly what your level is, but this topic is usually discussed rigorously in the introductory course on analysis of a single variable [...]
The introduction to analysis course at my school is an upper division course that doubles as a first year graduate student course, but you are right, it's probably about time I jump to the next level. I have seen things like what you posted above. I was just thinking the topic might be even more sophisticated than that, since really what you posted was just a general example proving that exponential functions always grow faster than polynomials (not that I'm complaining!), rather than some general theorem regarding all types of functions.

However, how many types of functions can there possibly be? If I wanted to just take a look at the rest... You just compared exponentials to polynomials. We would need to compare polynomials to logarithmic functions, and of course factorials to exponentials.

After that, what else is there besides combinations of those?
 
Battlemage! said:
The introduction to analysis course at my school is an upper division course that doubles as a first year graduate student course, but you are right, it's probably about time I jump to the next level.
The exact contents of such a course probably depend a bit on the school (and the country), but I would just like to point out that you do not immediately need to go for a fully featured "real analysis" course. In particular, for an introduction to analysis you can start at the level of the first chapters of e.g. "The Way of Analysis" by Strichartz, "Principles of Mathematical Analysis" by Rudin or a similar book, but it may indeed be better to also attend lectures and tutorials, as the feedback that you get on your work is most important.

Battlemage! said:
I was just thinking the topic might be even more sophisticated than that, since really what you posted was just a general example proving that exponential functions always grow faster than polynomials (not that I'm complaining!), rather than some general theorem regarding all types of functions.

That is correct, but see Stephen Tashi's comment. You are right that things can get a lot more sophisticated than what I posted and, essentially precisely for that reason, a general theorem does not exist. Even when we restrict ourselves to continuous functions on the real line, I know of no such theorem. (In fact, I would not even know how to formulate it.)

Battlemage! said:
However, how many types of functions can there possibly be? If I wanted to just take a look at the rest... You just compared exponentials to polynomials. We would need to compare polynomials to logarithmic functions, and of course factorials to exponentials.

After that, what else is there besides combinations of those?
More than you and I would like, probably. For a discrete example, see the Ackermann function. When you take that function, put its domain in bijection with ##\{0,1,2,\ldots\}## and interpolate linearly in between, you obtain a rather odd specimen on the (non-negative part of the) real line (from the point of view of growth properties) that is continuous and grows faster than anything I usually encounter.
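A minimal recursive sketch of the two-argument (Péter) version, just to make the definition concrete; even modest inputs quickly become infeasible:

```python
def ackermann(m: int, n: int) -> int:
    """Péter's two-argument Ackermann function."""
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

# Small cases are still tame; ackermann(4, 2) already has 19729 digits.
print(ackermann(2, 3))  # 9
print(ackermann(3, 3))  # 61
```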
 
That link led me to Graham's number. Mein. Gott.
 
A lot of these are done with the Squeeze theorem. For example
##0\leq \frac{n!}{n^n} = \frac{1}{n}\cdot\frac{2}{n}\cdots \frac{n-1}{n}\cdot\frac{n}{n} \leq \frac{1}{n} \cdot 1 \cdots 1 \cdot 1 = \frac{1}{n}##
So ##n!/n^n \to 0##: ##n^n## grows faster than ##n!##.
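A quick numerical check of the squeeze bound (an illustration, not part of the proof):

```python
from math import factorial

# Verify the squeeze bound 0 <= n!/n^n <= 1/n for a few values of n.
for n in [5, 10, 20, 50]:
    ratio = factorial(n) / n**n
    assert 0 <= ratio <= 1 / n
    print(n, ratio)
```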
 
