Proving f Is Strictly Increasing Near a Point Where f' Is Continuous and Positive

H2Pendragon

Homework Statement


Suppose f is differentiable on J, c is in J° (the interior of J), and f'(c) > 0. Show that if f' is continuous at c, then f is strictly increasing on some neighborhood of c.


Homework Equations


Strictly increasing: if x < y, then f(x) < f(y).
Continuous at a: for all epsilon > 0 there exists a delta > 0 such that x in D ∩ B(a; delta) implies |f(x) - f(a)| < epsilon.

The Attempt at a Solution


I don't have any attempts to write down here. I'm mainly looking for a push in the right direction. I've been staring at the definitions and just can't see the easiest way to link them.
 
H2Pendragon said:
Continuous at a: for all epsilon > 0 there exists a delta > 0 such that x in D ∩ B(a; delta) implies |f(x) - f(a)| < epsilon.
Yes, and now take a = c and epsilon = f(c)/2 to show that f'(x) > 0 on (c - delta, c + delta).

 
HallsofIvy said:
Yes, and now take a = c and epsilon = f(c)/2 to show that f'(x) > 0 on (c - delta, c + delta).

Thanks for the reply. How does taking epsilon = f(c)/2 get me that f'(x) > 0?

I plugged it in and I end up getting that f'(c) - f(c)/2 < f'(x) < f'(c) + f(c)/2.

I understand, though, that showing f'(x) > 0 solves it. I'm just curious about this middle step. I assume I have to show that f'(c) = f(c)/2?
 
Did you mean to let epsilon = f'(c)/2?

Because that would solve it.

It would then be that f'(c) - f'(c)/2 < f'(x), i.e. f'(c)/2 < f'(x). So f'(x) > 0 for every x in (c - delta, c + delta), and then by the Mean Value Theorem, for any x < y in that interval, f(y) - f(x) = f'(t)(y - x) > 0 for some t between x and y. Thus f is strictly increasing there.

Is this right?
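
For reference, here is a cleaned-up write-up of the whole argument in the forum's LaTeX notation. It just combines the steps above; the Mean Value Theorem supplies the final step, and delta is shrunk so the interval stays inside J.

Since ##f'## is continuous at ##c## and ##f'(c) > 0##, apply the definition of continuity to ##f'## with ##a = c## and ##\varepsilon = f'(c)/2##: there is a ##\delta > 0## (shrunk, if necessary, so that ##(c - \delta, c + \delta) \subseteq J##, which is possible because ##c## is interior to ##J##) with
$$|x - c| < \delta \implies |f'(x) - f'(c)| < \frac{f'(c)}{2} \implies f'(x) > \frac{f'(c)}{2} > 0.$$
Now take any ##x < y## in ##(c - \delta, c + \delta)##. Since ##f## is differentiable on ##J##, the Mean Value Theorem gives a ##\xi \in (x, y)## with
$$f(y) - f(x) = f'(\xi)(y - x) > 0,$$
so ##f(x) < f(y)##, i.e. ##f## is strictly increasing on ##(c - \delta, c + \delta)##.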
 