Does the Existence of the Limit of f'(x) Imply the Limit of f''(x) Is Zero?

terhorst
Problem #34 on the much-discussed http://ftp.ets.org/pub/gre/Math.pdf :

Suppose ##f## is a differentiable function with ##\lim\limits_{x \to \infty} f(x)=K## and ##\lim\limits_{x \to \infty} f'(x)=L## for some finite ##K, L##. Which of the following must be true?
  1. ##L=0##
  2. ##\lim\limits_{x \to \infty} f''(x)=0##
  3. ##K=L##
  4. ##f## is constant.
  5. ##f'## is constant.

The answer is 1. Is this because f might be only ##C^1##? Can you give an example of a function where the limit of the first derivative exists but the limit of the second derivative is not zero? Thanks!
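For context, here is a sketch of the usual Mean Value Theorem argument for why option 1 must hold (my own reasoning, not part of the quoted problem, so double-check it):
$$f(n+1) - f(n) = f'(c_n) \quad \text{for some } c_n \in (n, n+1).$$
As ##n \to \infty## the left-hand side tends to ##K - K = 0##, while ##c_n \to \infty## forces ##f'(c_n) \to L##, so ##L = 0##.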
 
I am a little befuddled by this. If 1. is true, it seems like 2. must also be true.
Let ##g(x)=f'(x)##. We know
$$\lim_{x\rightarrow\infty} g(x) = L = 0,$$
so it should follow that
$$\lim_{x\rightarrow\infty} g'(x) = \lim_{x\rightarrow\infty} f''(x) = 0.$$
 
If the limit of the second derivative exists, then it is zero. But it may not exist, even if the function is ##C^2##. Try ##\sin(x^2)/x^2##.
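Spelling that example out (a quick computation of my own, worth double-checking): with ##f(x) = \sin(x^2)/x^2##,
$$f'(x) = \frac{2\cos(x^2)}{x} - \frac{2\sin(x^2)}{x^3}, \qquad
f''(x) = -4\sin(x^2) - \frac{6\cos(x^2)}{x^2} + \frac{6\sin(x^2)}{x^4}.$$
So ##f(x) \to 0## and ##f'(x) \to 0##, but the ##-4\sin(x^2)## term keeps oscillating, so ##\lim\limits_{x \to \infty} f''(x)## does not exist.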
 