Limits and derivative: is this proof accurate enough?


Homework Help Overview

The discussion revolves around the behavior of a differentiable function \( f \) defined on \( \mathbb{R^+} \) and the implications of the limit condition \( \lim_{x \to \infty} (f(x) + f'(x)) = 0 \). Participants are tasked with proving that \( \lim_{x \to \infty} f(x) = 0 \) based on this condition.

Discussion Character

  • Conceptual clarification, assumption checking, exploratory

Approaches and Questions Raised

  • Some participants attempt to split the limit into two parts, questioning whether this is valid without confirming the existence of both limits. Others provide counterexamples to illustrate potential pitfalls in this reasoning.

Discussion Status

The discussion is ongoing, with participants exploring various approaches to the proof. Some have suggested using the definition of the derivative and considering the implications of the limit condition, while others have raised concerns about the validity of certain assumptions. There is a recognition of the need for careful reasoning regarding limit operations.

Contextual Notes

Participants are navigating the complexities of limit operations and the implications of differentiability, with some expressing uncertainty about how to proceed given the constraints of the problem. The discussion reflects a mix of attempts to clarify definitions and explore alternative proof strategies.

Felafel

Homework Statement



f is differentiable in ##\mathbb{R^+}## and
##\displaystyle \lim_{x \to \infty} (f(x)+f'(x))=0##
Prove that
##\displaystyle \lim_{x \to \infty}f(x)=0##

The Attempt at a Solution



I can split the limit in two:
##(\displaystyle \lim_{x \to \infty} f(x)+\displaystyle \lim_{x \to \infty} f'(x))=0##
I consider the second one and say that, by the definition of the derivative, I have:
##\displaystyle \lim_{x \to \infty} \displaystyle \lim_{h \to 0} \frac{f(x+h)-f(x)}{h}##
Since f is differentiable, the second limit exists and is 0.
So, I have ##\displaystyle \lim_{x \to \infty} 0 =0##
And then, by hypothesis:
##\displaystyle \lim_{x \to \infty} (f(x)+0)=0##
Are these passages logically correct?
Thank you in advance!
 
Felafel said:

Homework Statement



f is differentiable in ##\mathbb{R^+}## and
##\displaystyle \lim_{x \to \infty} (f(x)+f'(x))=0##
Prove that
##\displaystyle \lim_{x \to \infty}f(x)=0##

The Attempt at a Solution



I can split the limit in two:
##(\displaystyle \lim_{x \to \infty} f(x)+\displaystyle \lim_{x \to \infty} f'(x))=0##

No, you can't do that. You can only do that if you know that both limits exist. So:

[tex]\lim_{x\rightarrow +\infty} (f(x)+g(x)) = \lim_{x\rightarrow +\infty} f(x) + \lim_{x\rightarrow +\infty} g(x)[/tex]

is not true in general, but only if you know that the limits actually exist.

As a counterexample:

[tex]0=\lim_{x\rightarrow +\infty} (x-x) \neq \lim_{x\rightarrow +\infty} x + \lim_{x\rightarrow +\infty} (-x)[/tex]
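The same pitfall can be checked numerically; here is a minimal sketch (the functions are illustrative choices, not from the thread) in which the sum has a limit but neither summand does:

```python
# f and g are illustrative: neither has a finite limit as x -> infinity,
# yet f(x) + g(x) = 1/x -> 0.  Splitting the limit of a sum into a sum
# of limits is therefore only valid when both limits exist.
def f(x):
    return x + 1.0 / x  # diverges to +infinity

def g(x):
    return -x           # diverges to -infinity

for x in [1e2, 1e4, 1e6]:
    print(x, f(x) + g(x))  # the sum tends to 0
```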
 
micromass said:
No, you can't do that. You can only do that if you know that both limits exist. So:

[tex]\lim_{x\rightarrow +\infty} (f(x)+g(x)) = \lim_{x\rightarrow +\infty} f(x) + \lim_{x\rightarrow +\infty} g(x)[/tex]

is not true in general, but only if you know that the limits actually exist.

As a counterexample:

[tex]0=\lim_{x\rightarrow +\infty} (x-x) \neq \lim_{x\rightarrow +\infty} x + \lim_{x\rightarrow +\infty} (-x)[/tex]

Thanks, you saved me from a major mistake!
Maybe I should prove it this way instead:

By definition of derivative:
##f(x)+f'(x)=f(x)+ \displaystyle \lim_{h \to 0} \frac{f(x+h)-f(x)}{h} =##

## \displaystyle \lim_{h \to 0} \left( f(x)-\frac{f(x)}{h}+\frac{f(x+h)}{h} \right) =##
## \displaystyle \lim_{h \to 0} \left( f(x)\left(1-\frac{1}{h}\right)+\frac{f(x+h)}{h} \right)##

and, by hypothesis:

##\displaystyle \lim_{x \to \infty} \displaystyle \lim_{h \to 0} \left( f(x)\left(1-\frac{1}{h}\right)+\frac{f(x+h)}{h} \right) = 0##

Say:

##\displaystyle \lim_{x \to \infty}f(x)=L## so that the expression above becomes:

##\displaystyle \lim_{h \to 0} \left( L\left(1-\frac{1}{h}\right)+\frac{L}{h} \right) = 0##

##\displaystyle \lim_{h \to 0} L=0## ##\Rightarrow## ##\displaystyle \lim_{x\to \infty }f(x)=L=0##
 
Now you assume that

[tex]\lim_{h\rightarrow 0} \lim_{x\rightarrow +\infty} f(x,h) = \lim_{x\rightarrow +\infty}\lim_{h\rightarrow 0} f(x,h).[/tex]

This is also not true in general.
 
micromass said:
Now you assume that

[tex]\lim_{h\rightarrow 0} \lim_{x\rightarrow +\infty} f(x,h) = \lim_{x\rightarrow +\infty}\lim_{h\rightarrow 0} f(x,h).[/tex]

This is also not true in general.

Oh... I've run out of good ideas then. Any hints?
 
The intuition is this: if x is very large, then f(x) is very close to -f'(x). In particular, if f(x) is very large, then f'(x) is very negative, and thus f(x) decreases very fast.

Try to work with an epsilon-delta definition. What does it mean that f(x) does not tend to 0?
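For what it's worth, one standard way to make this intuition rigorous (a sketch not spelled out in the thread) is to write ##f(x)=e^x f(x)/e^x## and apply the ##\infty/\infty## form of L'Hôpital, which only requires the denominator to diverge:

[tex]\lim_{x\rightarrow +\infty} f(x) = \lim_{x\rightarrow +\infty} \frac{e^x f(x)}{e^x} = \lim_{x\rightarrow +\infty} \frac{e^x \left(f(x)+f'(x)\right)}{e^x} = \lim_{x\rightarrow +\infty} \left(f(x)+f'(x)\right) = 0.[/tex]

The middle equality is L'Hôpital's rule; it is legitimate here because ##e^x \to +\infty## and the limit of the quotient of derivatives exists (it is 0 by hypothesis).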
 
Can I simply say, then, that:
##\forall \epsilon >0 \ \exists M>0## such that ##x>M \Rightarrow |f(x)+f'(x)-0|<\epsilon##
And so, the function being >0 because it is defined on ##\mathbb{R^+}##:
##0<|f(x)|< \epsilon - f'(x)## ##\Rightarrow## ##|f(x)|< \epsilon##, and by the squeeze rule ##\lim f(x)=0##?
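As a numerical sanity check of the statement itself, here is a sketch with a hypothetical test function ##f(x)=e^{-x}+1/x## (an illustrative choice, not from the thread) that satisfies the hypothesis, since ##f(x)+f'(x)=1/x-1/x^2 \to 0##:

```python
import math

# Hypothetical test function: f(x) = e^(-x) + 1/x,
# so f'(x) = -e^(-x) - 1/x^2 and f(x) + f'(x) = 1/x - 1/x^2 -> 0.
def f(x):
    return math.exp(-x) + 1.0 / x

def fprime(x):
    return -math.exp(-x) - 1.0 / x**2

for x in [10.0, 100.0, 1000.0]:
    # both the hypothesis f + f' and the conclusion f shrink toward 0
    print(x, f(x) + fprime(x), f(x))
```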
 
