Suppose f is differentiable on J, c is in J° (the interior of J), and f'(c) > 0. Show that if f' is continuous at c, then f is strictly increasing on some neighborhood of c.
Strictly increasing: if x < y, then f(x) < f(y).
Continuous at a: For all epsilon > 0 there exists a delta > 0 such that x in D intersect B(a; delta) implies |f(x) - f(a)| < epsilon. (Note: it is the intersection with the domain D, not the union, since f must be defined at x.)
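One standard way to link the two definitions is to apply the continuity of f' at c with a specific choice of epsilon. This is a hedged sketch of that step, not a claim about the intended solution; the choice epsilon = f'(c)/2 and the use of the Mean Value Theorem are assumptions about one possible approach:

```latex
% Continuity of f' at c with epsilon = f'(c)/2 gives a delta > 0 such that
x \in J \cap B(c;\delta) \implies |f'(x) - f'(c)| < \tfrac{f'(c)}{2}.
% On this neighborhood f' stays positive:
f'(x) > f'(c) - \tfrac{f'(c)}{2} = \tfrac{f'(c)}{2} > 0.
% Then for any x < y in B(c;\delta), the Mean Value Theorem yields t \in (x,y) with
f(y) - f(x) = f'(t)\,(y - x) > 0,
% which is exactly the strictly-increasing condition on B(c;\delta).
```

The key idea is that a strict inequality like f'(c) > 0 leaves room: continuity lets you keep f' bounded away from zero nearby, and the Mean Value Theorem converts positive derivative into strict increase.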
The Attempt at a Solution
I don't have an attempt to write down here; I'm mainly looking for a push in the right direction. I've been staring at the definitions and can't see the easiest way to link them.