
Homework Help: Limits and derivative: is this proof accurate enough?

  1. Jan 8, 2013 #1
    1. The problem statement, all variables and given/known data

    f is differentiable on ##\mathbb{R^+}## and
    ##\displaystyle \lim_{x \to \infty} (f(x)+f'(x))=0##
    Prove that
    ##\displaystyle \lim_{x \to \infty}f(x)=0##

    3. The attempt at a solution

    I can split the limit in two:
    ##(\displaystyle \lim_{x \to \infty} f(x)+\displaystyle \lim_{x \to \infty} f'(x))=0##
    I consider the second one and say that, by the definition of the derivative, I have:
    ##\displaystyle \lim_{x \to \infty} \displaystyle \lim_{h \to 0} \frac{f(x+h)-f(x)}{h}##
    As f is differentiable, the second limit exists and is 0.
    So I have ##\displaystyle \lim_{x \to \infty} 0 = 0##
    And then, by hypothesis:
    ##\displaystyle \lim_{x \to \infty} (f(x)+0)=0##
    Are these steps logically correct?
    thank you in advance!
     
  3. Jan 8, 2013 #2
    No, you can't do that. You can only do that if you know that both limits exist. So:

    [tex]\lim_{x\rightarrow +\infty} \big(f(x)+g(x)\big) = \lim_{x\rightarrow +\infty} f(x) + \lim_{x\rightarrow +\infty} g(x)[/tex]

    is not true in general, but only if you know that the limits actually exist.

    As a counterexample:

    [tex]0=\lim_{x\rightarrow +\infty} (x-x) \neq \lim_{x\rightarrow +\infty} x + \lim_{x\rightarrow +\infty} (-x)[/tex]

    Here the two limits on the right are ##+\infty## and ##-\infty##, whose sum is undefined, so the split fails even though the left-hand side is 0.
     
  4. Jan 8, 2013 #3
    Thanks, you saved me from a major mistake!
    Maybe I should prove it this way, then:

    By definition of derivative:
    ##f(x)+f'(x)=f(x)+ \displaystyle \lim_{h \to 0} \frac{f(x+h)-f(x)}{h} =##

    ##\displaystyle \lim_{h \to 0} \left[ f(x)-\frac{f(x)}{h}+\frac{f(x+h)}{h} \right] =##
    ##\displaystyle \lim_{h \to 0} \left[ f(x)\left(1-\frac{1}{h}\right)+\frac{f(x+h)}{h} \right]##

    and, by hypothesis:

    ##\displaystyle \lim_{x \to \infty} \lim_{h \to 0} \left[ f(x)\left(1-\frac{1}{h}\right)+\frac{f(x+h)}{h} \right] = 0##

    Say:

    ##\displaystyle \lim_{x \to \infty}f(x)=L## so that the expression above becomes:

    ##\displaystyle \lim_{h \to 0} \left[ L\left(1-\frac{1}{h}\right)+\frac{L}{h} \right] = 0##

    ##\displaystyle \lim_{h \to 0} L = 0 \Rightarrow \lim_{x\to \infty} f(x) = L = 0##
     
  5. Jan 8, 2013 #4
    Now you assume that

    [tex]\lim_{h\rightarrow 0} \lim_{x\rightarrow +\infty} f(x,h) = \lim_{x\rightarrow +\infty}\lim_{h\rightarrow 0} f(x,h).[/tex]

    This is also not true in general.
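
    For instance, with ##f(x,h) = e^{-xh}## for ##x, h > 0##, the two iterated limits disagree:

    [tex]\lim_{h\rightarrow 0^+} \lim_{x\rightarrow +\infty} e^{-xh} = \lim_{h\rightarrow 0^+} 0 = 0 \qquad \text{but} \qquad \lim_{x\rightarrow +\infty} \lim_{h\rightarrow 0^+} e^{-xh} = \lim_{x\rightarrow +\infty} 1 = 1.[/tex]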
     
  6. Jan 8, 2013 #5
    Oh... I've run out of good ideas then. Any hint?
     
  7. Jan 8, 2013 #6
    The intuition is this: if x is very large, then f(x) is very close to ##-f'(x)##. In particular, if f(x) is very large, then f'(x) is very negative. And thus f(x) decreases very fast.

    Try to work with an epsilon-delta definition. What does it mean that f(x) does not tend to 0?
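
    Spelled out for a limit at infinity (with a threshold ##M## in place of a ##\delta##), the hypothesis reads

    [tex]\forall \epsilon > 0 \ \exists M > 0 : x > M \Rightarrow |f(x)+f'(x)| < \epsilon,[/tex]

    and the negation of ##f(x)\to 0##, the statement you would try to contradict, reads

    [tex]\exists \epsilon > 0 : \forall M > 0 \ \exists x > M \text{ such that } |f(x)| \geq \epsilon.[/tex]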
     
  8. Jan 8, 2013 #7
    Can I simply say, then, that:
    ##\forall \epsilon >0 \ \exists M>0## such that ##x>M \Rightarrow |f(x)+f'(x)-0|<\epsilon##
    And so, the function being > 0 because it is defined on ##\mathbb{R^+}##:
    ##0<|f(x)|< \epsilon - f'(x) \Rightarrow |f(x)|< \epsilon##, and by the squeeze rule ##\lim_{x \to \infty} f(x)=0##
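
    For reference, a standard route to the result avoids iterated limits altogether: write ##f(x) = e^x f(x)/e^x## and apply the version of L'Hôpital's rule that only requires the denominator to tend to ##+\infty##:

    [tex]\lim_{x\rightarrow +\infty} f(x) = \lim_{x\rightarrow +\infty} \frac{e^x f(x)}{e^x} = \lim_{x\rightarrow +\infty} \frac{\left(e^x f(x)\right)'}{\left(e^x\right)'} = \lim_{x\rightarrow +\infty} \frac{e^x\left(f(x)+f'(x)\right)}{e^x} = \lim_{x\rightarrow +\infty} \left(f(x)+f'(x)\right) = 0.[/tex]

    The second equality is the L'Hôpital step; it is valid because ##e^x \to +\infty## and the limit of the quotient of derivatives exists (it equals 0 by hypothesis).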
     