Lim f(x)/g(x) as x->∞ and relative growth rate of functions

In summary, the conversation discusses the commonly taught fact that the limit of (2^x)/(x^2) as x approaches infinity is infinity, because 2^x grows faster than x^2, and asks whether a theorem makes "grows faster" precise. The participants note that there is no single master theorem for comparing the growth rates of two arbitrary functions, much as there is no single test that settles the convergence of every series. The original poster had hoped to find a name for the topic, just as the phrase "Grandi's series" unlocked information about the series (-1)^n. Asymptotic notation, specifically Big Oh and Little Oh notation, is suggested as the standard framework. Power series can also be used to estimate relative growth rates, as in the proof that e^x grows faster than any polynomial, and l'Hôpital's rule and the squeeze theorem are offered as practical tools for evaluating such limits.
  • #1
Battlemage!
294
45
Everyone "knows" that
[tex]\lim_{x\rightarrow ∞}\frac{2^x}{x^2} = ∞.[/tex]
We "know" this because 2^x grows faster than x^2. I use quotes because this is just what we're told in basic calculus classes. But what about a theorem for this? I've searched through Google and looked through various university homework pages, and in all of them I couldn't find any standardized way to determine whether one function grows faster than another; no theorem describing this behavior precisely and explaining why.

Obviously this has been covered somewhere, and I just don't know how to find it. Earlier I was trying to find out information about the series (-1)^n, and could find very little until I stumbled upon the phrase "Grandi's series." I'm hoping there is something out there describing this issue as well, so I can get some more precise knowledge. Simply looking at the function on my graphing calculator feels insufficient.

So in summary, I'm hoping someone can direct me to a more rigorous and precise set of definitions regarding limits of f(x)/g(x) with x approaching infinity where the relative growth rates of f(x) and g(x) are important in determining that limit. Specifically, why one type of function grows faster than another, and what the criteria are that determine that.

Any insight into this would be very appreciated!
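As a quick numerical sanity check (a sketch, not a proof, with arbitrarily chosen sample points), one can tabulate the ratio and watch it climb:

```python
# Quick tabulation of 2**x / x**2 at a few (arbitrary) sample points.
def ratio(x):
    return 2**x / x**2

for x in (5, 10, 20, 40):
    print(f"x = {x:3d}   2^x / x^2 = {ratio(x):.3e}")

# The ratio increases across these samples, consistent with the limit being ∞.
values = [ratio(x) for x in (5, 10, 20, 40)]
assert all(a < b for a, b in zip(values, values[1:]))
```

Of course, a table of sample points shows the trend but proves nothing; the theorems discussed below are what make the claim rigorous.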
 
  • Like
Likes S.G. Janssens
  • #2
Battlemage! said:
Everyone "knows" that
[tex]\lim_{x\rightarrow ∞}\frac{2^x}{x^2} = ∞.[/tex]
We "know" this because 2^x grows faster than x^2. I use quotes because this is just what we're told in basic calculus classes. But what about a theorem for this? I've searched through Google and looked through various university homework pages, and in all of them I couldn't find any standardized way to determine whether one function grows faster than another; no theorem describing this behavior precisely and explaining why.

Obviously this has been covered somewhere, and I just don't know how to find it. Earlier I was trying to find out information about the series (-1)^n, and could find very little until I stumbled upon the phrase "Grandi's series." I'm hoping there is something out there describing this issue as well, so I can get some more precise knowledge. Simply looking at the function on my graphing calculator feels insufficient.

So in summary, I'm hoping someone can direct me to a more rigorous and precise set of definitions regarding limits of f(x)/g(x) with x approaching infinity where the relative growth rates of f(x) and g(x) are important in determining that limit. Specifically, why one type of function grows faster than another, and what the criteria are that determine that.

Any insight into this would be very appreciated!

Search for de l'Hôpital's rule.
 
  • #3
Math_QED said:
Search for de l'Hôpital's rule.
Thanks. I know l'Hôpital's rule as a tool to find limits, but what I'm after is a rigorous categorization of functions based on their relative growth rate. I can't seem to find one. (unless you're telling me the secret is hidden in a more careful examination of l'Hôpital's rule; maybe in the order of the derivatives?)

If I were to use l'Hôpital's rule on this function, in one iteration I'd end up with
[tex]\lim_{x\rightarrow ∞} \frac{2^x\ln 2}{2x}[/tex]
and if I do it again that 2x will become a 2, but recursively I'll always have the ##2^x## factor somewhere in the numerator. That would tell me it's obviously infinity.

But that doesn't really explain anything I'm after. It's just a technique, at least as far as I have been taught. I'm trying to find something with a little more meat regarding why we know that a particular function will approach infinity faster than another one (and the obvious implications of this regarding limits of quotients of functions where one grows faster than the other).
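For what it's worth, a computer algebra system agrees; the sketch below assumes SymPy is available and just confirms both the limit and the persistence of the ##2^x## factor under differentiation:

```python
# Sketch assuming SymPy is available; confirms the limit and shows that
# differentiating 2**x always leaves a 2**x factor behind, while x**2 is
# annihilated after two derivatives.
import sympy as sp

x = sp.symbols('x')

assert sp.limit(2**x / x**2, x, sp.oo) == sp.oo
assert sp.diff(2**x, x) == 2**x * sp.log(2)   # d/dx 2^x = 2^x ln 2
assert sp.diff(x**2, x, 2) == 2               # second derivative is constant
```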
 
  • #4
Just to move the discussion further, what if my limit was this?

[tex]\lim_{x\rightarrow ∞} \frac{x!}{x^2}[/tex]

In this case, again, we know it's exploding to infinity. But trying l'Hôpital's rule isn't going to get an undergrad far, based on Wolfram's calculation for the derivative of x!. http://www.wolframalpha.com/input/?i=derivative+of+x!

But simply knowing the principle that if f(x) approaches infinity faster than g(x), then the limit of f(x)/g(x) as x approaches infinity is infinity, is enough to see that this one blows up.

Now I'm interested in the characteristics of functions that let us know one grows faster than another.
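A quick numerical look (illustration only, not a proof, with arbitrarily chosen sample points) at how fast ##x!/x^2## runs away:

```python
# Illustration only: x!/x**2 at a few sample points, plus a check that the
# factorial overtakes 2**x as well.
from math import factorial

def fact_over_square(n):
    return factorial(n) / n**2

for n in (5, 10, 20):
    print(n, fact_over_square(n))

assert factorial(20) > 2**20   # n! has left 2^n far behind by n = 20
```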
 
  • #5
You might want to look into asymptotic notation. For example, in analysis it is customary to use "Big Oh" and "Little Oh" notation. One writes, for instance,
$$
f(x) = O(g(x)) \text{ as } x \to \infty
$$
whenever, by definition, ##\tfrac{|f(x)|}{|g(x)|}## remains bounded as ##x \to \infty##. When, instead, the foregoing quotient tends to zero as ##x \to \infty##, we write ##f(x) = o(g(x))##.
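These definitions can be illustrated numerically; the following sketch (sample points chosen arbitrarily) shows ##\log x = o(x)## and ##2x + 1 = O(x)## behaving as the definitions predict:

```python
import math

# log(x)/x shrinks toward 0 as x grows, so log x = o(x).
samples = [math.log(10**k) / 10**k for k in range(1, 7)]
assert all(a > b for a, b in zip(samples, samples[1:]))  # strictly decreasing
assert samples[-1] < 1e-4

# (2x + 1)/x stays bounded, so 2x + 1 = O(x).
big_o_quotients = [(2 * x + 1) / x for x in (10, 100, 1000)]
assert all(q <= 3 for q in big_o_quotients)
```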

Sometimes you can use power series to estimate the relative growth. For example, it is well known that
$$
e^x = 1 + x + \frac{1}{2!}x^2 + \frac{1}{3!}x^3 + \ldots
$$
grows faster than any polynomial ##p(x) = a_0 + a_1 x + \ldots + a_n x^n## (with, say, ##a_n > 0##). To see why, divide numerator and denominator by ##x^n## to get
$$
\begin{align*}
\lim_{x \to \infty}\frac{e^x}{p(x)} &= \lim_{x \to \infty}\frac{x^{-n} + x^{-(n-1)} + \frac{1}{2!}x^{-(n-2)} + \ldots + \frac{1}{n!} + \frac{1}{(n+1)!}x + \ldots }{a_0x^{-n} + a_1x^{-(n-1)} + \ldots + a_n }\\
&= \frac{1}{a_nn!}\lim_{x \to \infty}{\left[1 + \frac{1}{n+1}x + \frac{1}{(n+1)(n+2)}x^2 + \ldots\right]} = \infty,
\end{align*}
$$
since the negative powers of ##x## vanish in the limit, the denominator tends to ##a_n##, and the bracketed series diverges.
Informally speaking, the finitely many powers included in ##p## simply cannot keep up with the infinitude of powers (with positive coefficients) of the exponential.
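Numerically the effect shows up quickly; here is a small sketch with an arbitrarily chosen polynomial ##p(x) = x^5 + 3x^2 + 7## (illustration, not proof):

```python
import math

# Arbitrary sample polynomial for illustration.
def p(x):
    return x**5 + 3 * x**2 + 7

ratios = [math.exp(x) / p(x) for x in (10, 20, 40)]
assert all(a < b for a, b in zip(ratios, ratios[1:]))  # ratio keeps growing
assert ratios[-1] > 1e6                                # already huge by x = 40
```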
 
  • Like
Likes Battlemage!
  • #6
Math_QED said:
Are you aware of the gamma function? Because otherwise, when you deal with a function like ##f: \mathbb{N} \rightarrow \mathbb{R}: x \mapsto x!## you can't use l'Hôpital's rule as this function is not differentiable. In terms of the gamma function, we could write ##x! = \Gamma (x+1)## and the gamma function is differentiable.
I know, that is what I'm saying. In the Wolfram link I posted it shows the gamma function, which I know nothing about. So my point is, l'Hôpital's rule isn't going to tell me much about the nature of one type of function versus another, and why one grows faster than another. Sure, I could use l'Hôpital's rule some of the time to calculate a limit, but why bother when I know, for example, that an exponential function grows faster than a simple polynomial? (or in the last example I posted, it's obvious x! grows way faster than x2) My real question is, why do I know that?

Now the literal answer to that question is, I know it's true because (1) I've been told it's true and (2) I've looked at both functions on my TI-83 and seen it. But this is not in any way a rigorous explanation of the generalities regarding one type of function growing faster than another, and how that relates to limits of f(x)/g(x).

At this point, however, I think I may have bitten off a bit more than I can chew. This might be pretty far above my level.
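The factorial–gamma connection quoted above is easy to check numerically, since Python's standard library happens to ship a gamma function:

```python
import math

# Gamma(n + 1) == n! for non-negative integers n.
for n in range(6):
    assert math.isclose(math.gamma(n + 1), math.factorial(n))

# Unlike the factorial, gamma is also defined between the integers,
# which is what makes differentiation (and l'Hôpital) possible.
print(math.gamma(3.5))   # a non-integer point, approximately 3.323
```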
 
  • #7
Battlemage! said:
At this point, however, I think I may have bitten off a bit more than I can chew. This might be pretty far above my level.
I don't know exactly what your level is, but this topic is usually discussed rigorously in the introductory course on analysis of a single variable taken by first-year students in mathematics and others that have an interest.

Your initial question shows that you may fit well in the audience. (If you consider taking such a course, be sure to look for "analysis" and not just "calculus". There is nothing wrong with the latter, but courses that go by that name usually focus on computations instead of derivations and proofs.)
 
  • Like
Likes Battlemage!
  • #8
Battlemage! said:
I couldn't find any standardized way to determine if one function grows faster than another; no theorem describing this behavior precisely and why.

There is no general theorem that would apply to two arbitrary functions. For an arbitrary function, we don't even know if the function can be expressed using a formula.

People learn to deal with growth ratios of "commonly encountered" functions by building experience with specific examples. The situation is analogous to the problem of determining whether a series converges: there are theorems that give tests for convergence, but there is no single "master theorem" that solves all problems.

If you study theorems dealing with "big O" results then you might find common "themes" used in the proof of those theorems.
 
  • Like
Likes Battlemage! and member 587159
  • #9
Krylov said:
I don't know exactly what your level is, but this topic is usually discussed rigorously in the introductory course on analysis of a single variable taken by first-year students in mathematics and others that have an interest.

Your initial question shows that you may fit well in the audience. (If you consider taking such a course, be sure to look for "analysis" and not just "calculus". There is nothing wrong with the latter, but courses that go by that name usually focus on computations instead of derivations and proofs.)
The introduction to analysis course at my school is an upper division course that doubles as a first year graduate student course, but you are right, it's probably about time I jump to the next level. I have seen things like what you posted above. I was just thinking the topic might be even more sophisticated than that, since really what you posted was just a general example proving that exponential functions always grow faster than polynomials (not that I'm complaining!), rather than some general theorem regarding all types of functions.

However, how many types of functions can there possibly be? If I wanted to just take a look at the rest... You just compared exponentials to polynomials. We would need to compare polynomials to logarithmic functions, and of course factorials to exponentials.

After that, what else is there besides combinations of those?
 
  • #10
Battlemage! said:
The introduction to analysis course at my school is an upper division course that doubles as a first year graduate student course, but you are right, it's probably about time I jump to the next level.
Probably it depends a bit on the school (and the country) what the exact contents of such a course are, but I would just like to point out that you do not immediately need to go for a fully featured "real analysis" course. In particular, for an introduction to analysis you can start at the level of the first chapters of e.g. "The Way of Analysis" by Strichartz, "Principles of Mathematical Analysis" by Rudin or a similar book, but it may indeed be better to also attend lectures and tutorials, as the feedback you get on your work is most important.

Battlemage! said:
I was just thinking the topic might be even more sophisticated than that, since really what you posted was just a general example proving that exponential functions always grow faster than polynomials (not that I'm complaining!), rather than some general theorem regarding all types of functions.

That is correct, but see Stephen Tashi's comment. You are right that things can get a lot more sophisticated than what I posted and, essentially precisely for that reason, a general theorem does not exist. Even when we restrict ourselves to continuous functions on the real line, I know of no such theorem. (In fact, I would not even know how to formulate it.)

Battlemage! said:
However, how many types of functions can there possibly be? If I wanted to just take a look at the rest... You just compared exponentials to polynomials. We would need to compare polynomials to logarithmic functions, and of course factorials to exponentials.

After that, what else is there besides combinations of those?
More than you and I would like, probably. For a discrete example, see the Ackermann function. When you take that function, put its domain in bijection with ##\{0,1,2,\ldots\}## and interpolate linearly in between, you obtain a rather odd specimen on the (non-negative part of the) real line (from the point of view of growth properties) that is continuous and grows faster than anything I usually encounter.
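A naive recursive sketch of the Ackermann function (feasible only for tiny arguments, since the values and the number of calls explode):

```python
import sys

sys.setrecursionlimit(100000)  # the naive recursion gets deep quickly

def ackermann(m, n):
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

assert ackermann(1, 2) == 4    # A(1, n) = n + 2
assert ackermann(2, 3) == 9    # A(2, n) = 2n + 3
assert ackermann(3, 3) == 61   # A(3, n) = 2^(n+3) - 3; m = 4 is already hopeless
```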
 
  • Like
Likes Battlemage!
  • #11
That link led me to Graham's number. Mein. Gott.
 
  • Like
Likes S.G. Janssens
  • #12
A lot of these are done with the Squeeze theorem. For example
##0\leq \frac{n!}{n^n} = \frac{1}{n}\cdot\frac{2}{n}\cdots\frac{n-1}{n}\cdot\frac{n}{n} \leq \frac{1}{n}\cdot 1 \cdots 1\cdot 1 = \frac{1}{n}##
Since ##\frac{1}{n} \to 0##, the quotient is squeezed to ##0##, so ##n^n## grows faster than ##n!##.
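A numeric companion to the squeeze argument (illustration, not proof): the quotient indeed stays below ##1/n## at a few arbitrary sample points:

```python
from math import factorial

def squeeze_quotient(n):
    return factorial(n) / n**n

# The quotient sits inside [0, 1/n], exactly as the squeeze bound says.
for n in (5, 10, 20):
    assert 0 <= squeeze_quotient(n) <= 1 / n
```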
 
  • Like
Likes Battlemage!

1. What does "Lim f(x)/g(x) as x->∞" mean?

This notation represents the limit of the ratio between two functions, f(x) and g(x), as x approaches infinity. It is a way to describe the behavior of the two functions at extremely large values of x.

2. How do you calculate the limit of a function as x approaches infinity?

To calculate the limit of a function as x approaches infinity, you can use algebraic techniques such as factoring and dividing numerator and denominator by the dominant term, apply l'Hôpital's rule to indeterminate forms, or use graphing software to build intuition about the behavior of the function at large values of x. If the limit does not exist, the function does not settle toward any single value as x gets larger.
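For example, a computer algebra system (SymPy is assumed here as the tool of choice) can evaluate such limits directly:

```python
import sympy as sp

x = sp.symbols('x')

assert sp.limit((3 * x**2 + 1) / (x**2 + x), x, sp.oo) == 3  # like powers: ratio of leading coefficients
assert sp.limit(sp.log(x) / x, x, sp.oo) == 0                # log loses to any positive power
assert sp.limit(sp.exp(x) / x**10, x, sp.oo) == sp.oo        # exponential beats any power
```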

3. What is the significance of the limit of a function as x approaches infinity?

The limit of a function as x approaches infinity is important in understanding the overall behavior of the function. It can determine whether the function grows or decays at a certain rate, and can also be used to compare the relative growth rates of different functions.

4. How does the relative growth rate of functions affect their behavior at infinity?

The relative growth rate of functions directly impacts their behavior at infinity. If one function has a higher relative growth rate than another, it will grow or decay faster as x gets larger. This can also determine the end behavior of the functions, whether they approach a finite value or continue to increase/decrease without bound.

5. Can the limit of a function as x approaches infinity be equal to infinity?

Yes, the limit of a function as x approaches infinity can be equal to infinity. This means that the function grows without bound as x gets larger. However, it is also possible for the limit to not exist or approach a finite value, depending on the relative growth rates of the functions involved.
