Limits and derivatives: is this proof accurate enough?

Felafel

Homework Statement



##f## is differentiable on ##\mathbb{R}^+## and
##\displaystyle \lim_{x \to \infty} (f(x)+f'(x))=0##
Prove that
##\displaystyle \lim_{x \to \infty}f(x)=0##

The Attempt at a Solution



I can split the limit in two:
##\displaystyle \lim_{x \to \infty} f(x)+\displaystyle \lim_{x \to \infty} f'(x)=0##
I consider the second one and say that, by the definition of the derivative, I have:
##\displaystyle \lim_{x \to \infty} \displaystyle \lim_{h \to 0} \frac{f(x+h)-f(x)}{h}##
As ##f## is differentiable, the second limit exists and is 0.
So I have ##\displaystyle \lim_{x \to \infty} 0 =0##
And then, by hypothesis:
##\displaystyle \lim_{x \to \infty} (f(x)+0)=0##
Are these steps logically correct?
Thank you in advance!
 
Felafel said:
I can split the limit in two:
##\displaystyle \lim_{x \to \infty} f(x)+\displaystyle \lim_{x \to \infty} f'(x)=0##

No, you can't do that. You can only do that if you know that both limits exist. So:

##\lim_{x\rightarrow +\infty} \left(f(x)+g(x)\right) = \lim_{x\rightarrow +\infty} f(x) + \lim_{x\rightarrow +\infty} g(x)##

is not true in general, but only if you know that the limits actually exist.

As a counterexample:

##0=\lim_{x\rightarrow +\infty} (x-x) \neq \lim_{x\rightarrow +\infty} x + \lim_{x\rightarrow +\infty} (-x)##
 
micromass said:
No, you can't do that. You can only do that if you know that both limits exist.

Thanks, you saved me from a major mistake!
Maybe I should prove it this way, then:

By the definition of the derivative:
##f(x)+f'(x)=f(x)+ \displaystyle \lim_{h \to 0} \frac{f(x+h)-f(x)}{h}##

##= \displaystyle \lim_{h \to 0} \left( f(x)-\frac{f(x)}{h}+\frac{f(x+h)}{h} \right)##
##= \displaystyle \lim_{h \to 0} \left( f(x)\left(1-\frac{1}{h}\right)+\frac{f(x+h)}{h} \right)##

and, by hypothesis:

##\displaystyle \lim_{x \to \infty} \displaystyle \lim_{h \to 0} \left( f(x)\left(1-\frac{1}{h}\right)+\frac{f(x+h)}{h} \right) = 0##

Say:

##\displaystyle \lim_{x \to \infty}f(x)=L## so that the expression above becomes:

##\displaystyle \lim_{h \to 0} \left( L\left(1-\frac{1}{h}\right)+\frac{L}{h} \right)=0##

##\displaystyle \lim_{h \to 0} L=0## ##\Rightarrow## ##\displaystyle \lim_{x\to \infty }f(x)=L=0##
 
Now you assume that

##\lim_{h\rightarrow 0} \lim_{x\rightarrow +\infty} f(x,h) = \lim_{x\rightarrow +\infty}\lim_{h\rightarrow 0} f(x,h)##.

This is also not true in general.
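For instance, here is a standard counterexample (just for illustration, not tied to this particular problem): take ##f(x,h)=\frac{x|h|}{1+x|h|}## for ##x>0##. For fixed ##h\neq 0## we get ##\lim_{x\rightarrow +\infty} f(x,h)=1##, so ##\lim_{h\rightarrow 0}\lim_{x\rightarrow +\infty} f(x,h)=1##; but for fixed ##x## we get ##\lim_{h\rightarrow 0} f(x,h)=0##, so ##\lim_{x\rightarrow +\infty}\lim_{h\rightarrow 0} f(x,h)=0##.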
 
micromass said:
Now you assume that the two limits can be interchanged. This is also not true in general.

Oh... I've run out of good ideas, then. Any hint?
 
The intuition is this: if x is very large, then f(x) is very close to -f'(x). In particular, if f(x) is very large, then f'(x) is very negative. And thus f(x) decreases very fast.

Try to work with an epsilon-delta definition. What does it mean that f(x) does not tend to 0?
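To make that concrete, the negation is just the standard definition spelled out: ##f(x)\not\to 0## as ##x\to\infty## means that there is some ##\epsilon_0>0## such that for every ##M>0## there exists an ##x>M## with ##|f(x)|\geq \epsilon_0##.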
 
Can I simply say, then, that:
##\forall \epsilon >0## ##\exists M>0## such that ##x>M \Rightarrow |f(x)+f'(x)-0|<\epsilon##
And so, the function being ##>0## because it is defined on ##\mathbb{R}^+##:
##0<|f(x)|< \epsilon - f'(x)## ##\Rightarrow## ##|f(x)|< \epsilon## and by the squeeze theorem ##\lim f(x)=0##
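For reference, one standard way to finish the problem (a sketch only, assuming we may use the version of L'Hôpital's rule that requires only the denominator to tend to ##+\infty##; this is not necessarily the route micromass had in mind): write ##f(x)=\frac{e^x f(x)}{e^x}## and differentiate numerator and denominator:

##\displaystyle \lim_{x \to \infty} f(x) = \lim_{x \to \infty} \frac{e^x f(x)}{e^x} = \lim_{x \to \infty} \frac{\left(e^x f(x)\right)'}{\left(e^x\right)'} = \lim_{x \to \infty} \frac{e^x\left(f(x)+f'(x)\right)}{e^x} = \lim_{x \to \infty} \left(f(x)+f'(x)\right) = 0##

The second equality is justified because ##e^x \to +\infty## and the limit of the quotient of derivatives exists: by hypothesis it equals ##0##.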
 