Limit Theorem: Does f'(x) Approach 0 as x Goes to Infinity?

  • Context: Undergrad
  • Thread starter: vincent_vega
  • Tags: Limits

Discussion Overview

The discussion revolves around whether the derivative of a function, f'(x), approaches 0 as x approaches infinity, given that the function itself, f(x), approaches 0. Participants explore the implications of this relationship, referencing the Mean Value Theorem and the conditions under which derivatives are defined.

Discussion Character

  • Debate/contested
  • Technical explanation
  • Conceptual clarification

Main Points Raised

  • Some participants question whether a theorem exists that guarantees f'(x) approaches 0 if f(x) approaches 0 as x goes to infinity.
  • One participant provides an example, f(x) = sin(x^2)/x, to argue that f'(x) does not necessarily approach 0.
  • Another participant suggests using the Mean Value Theorem to analyze the behavior of f'(x) over intervals, but acknowledges that this approach may not be effective.
  • There is a discussion about the assumptions regarding the definition of f'(x) as x approaches infinity, with some arguing that this needs to be explicitly stated.
  • Some participants express uncertainty about the original post's assumptions and suggest that clarification from the original poster (OP) would be beneficial.
  • Concerns are raised about the clarity of the OP's question, particularly regarding the existence and limit of f'(x) as x approaches infinity.

Areas of Agreement / Disagreement

Participants agree that f'(x) need not approach 0, since the counterexample settles the mathematical question, but they do not reach consensus on how to interpret the OP's question or on the conditions under which f'(x) and its limit are defined.

Contextual Notes

Participants highlight the need for clear definitions and assumptions regarding the behavior of f'(x) as x approaches infinity, as well as the implications of the Mean Value Theorem in this context.

vincent_vega
Suppose there is a function f(x), and its limit as x goes to infinity is 0.

Is there a theorem that says its derivative, f'(x), also approaches 0 as x goes to infinity?

Thanks.
 
Probably not, since it is not true. Consider

##f(x)=\frac{\sin(x^2)}{x}##

What is true is that if ##f'(x)## has a limit as ##x\to\infty##, then this limit must be 0.
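A quick numerical check of this counterexample (a sketch in Python; the derivative formula ##f'(x) = 2\cos(x^2) - \sin(x^2)/x^2## follows from the quotient rule):

```python
import math

# Counterexample: f(x) = sin(x^2)/x tends to 0 as x -> infinity,
# but f'(x) = 2*cos(x^2) - sin(x^2)/x^2 keeps oscillating between
# roughly -2 and 2 and never settles down.

def f(x):
    return math.sin(x**2) / x

def fprime(x):
    return 2 * math.cos(x**2) - math.sin(x**2) / x**2

# f(x) shrinks like 1/x, while f'(x) keeps oscillating.
for x in [10.0, 100.0, 1000.0]:
    print(f"x={x:7.1f}  f(x)={f(x):+.6f}  f'(x)={fprime(x):+.6f}")
```

Sampling f' near any large x shows values close to ±2, so ##\lim_{x\to\infty} f'(x)## does not exist even though ##\lim_{x\to\infty} f(x) = 0##.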
 
If f is differentiable on ##(-\infty,\infty)##, use the mean value theorem:

##f(b)-f(a)=f'(c)(b-a)##. Maybe you can partition ##[0,\infty)## into ##[0,1],[1,2],\ldots,[n,n+1],\ldots##

Then:

##f(1)=f(0)+f'(c_0)##; ##f(2)=f(1)+f'(c_1)##; ... ##f(n)=f(n-1)+f'(c_{n-1})##.

Then you can find a closed form for ##f(n)##.
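Telescoping these identities shows where the approach leads, and also where it falls short (a sketch, assuming as above that f is differentiable everywhere):

```latex
% MVT on each [k, k+1] gives some c_k \in (k, k+1) with
f(n) = f(0) + \sum_{k=0}^{n-1} f'(c_k).
% If f(n) \to 0, the series converges (to -f(0)),
% so its terms satisfy f'(c_k) \to 0.
% But this controls f' only at the sample points c_k,
% not on all of [0, \infty), which is why the
% counterexample sin(x^2)/x survives this argument.
```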
 
Bacle2 said:
If f is differentiable on ##(-\infty,\infty)##, use the mean value theorem:
micromass's example shows this isn't going to work.
 
Of course I'm assuming ##f'(x)## is defined as ##x\to\infty##; that is implied in my argument.

Basically, take ##[0,b]##. Then

##f(b)-f(0)=f'(c)\,b##

(where ##c = c(b)## depends on ##b##). If ##f'## is defined everywhere and we let ##b\to\infty##, then the limit of ##f(b)## cannot be 0 unless ##f'(c)## decreases to zero.

The OP's phrase "its derivative approaches 0 as x goes to infinity" seems to me to assume that the derivative is defined as ##x\to\infty##.
 
Bacle2 said:
Of course I'm assuming ##f'(x)## is defined as ##x\to\infty##; that is implied in my argument.
You mean, that the limit is defined? If your argument relies on that then it needs to be stated.
 
From the OP: "its derivative, f'(x), also approaches 0 as x goes to infinity?"

This looks to me like an assumption that ##f'(x)## is defined as ##x\to\infty##.
 
Maybe the OP can clarify the conditions of the problem to eliminate ambiguity?
 
Bacle2 said:
From the OP:" its derivative, f'(x), also approaches 0 as x goes to infinity? "
This looks to me like an assumption that ##f'(x)## is defined as ##x\to\infty##.
No, that's what he's trying to prove. He didn't say "if f' approaches a limit, that limit is 0", so the most reasonable interpretation is that he wants to prove "f' approaches a limit, and that limit is 0".
 
  • #10
It's not clear to me either way. ##f'(x)## is said to exist without any qualification; I see no reason to assume it exists only on a specific subset of the real line, nor reason to assume otherwise. In your interpretation, why didn't the OP say something like "is ##f'(x)## defined, and if so, what is its limit?" He refers to ##f'(x)##, which states that ##f'(x)## exists. It may exist somewhere or everywhere.

The problem is posed sloppily; I think, out of basic manners, the OP should clarify.
 
  • #11
Bacle2 said:
f'(x) is said to exist without any qualification

Then it does not follow that the limit ##\lim_{x\rightarrow\infty}f'(x)## is defined. The statement "f' exists" is limited to ##x \in \mathbb{R}## because the domain of f is the real numbers and not the extended reals. What happens as ##x\rightarrow\infty## is considered a separate condition, and must explicitly be mentioned.
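The claim made earlier in the thread, that if ##\lim_{x\rightarrow\infty}f'(x)## exists at all then it must equal 0, can be sketched with the Mean Value Theorem:

```latex
% For each x there is c_x \in (x, x+1) with
f(x+1) - f(x) = f'(c_x).
% As x \to \infty the left side tends to 0 - 0 = 0,
% and c_x \to \infty.
% If \lim_{x\to\infty} f'(x) = L exists, then
% f'(c_x) \to L, and therefore L = 0.
```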
 