1. The problem statement, all variables and given/known data

Suppose f is differentiable on J, c is in J° (the interior of J), and f'(c) > 0. Show that if f' is continuous at c, then f is strictly increasing on some neighborhood of c.

2. Relevant equations

Strictly increasing: if x < y, then f(x) < f(y).

Continuous at a: for every epsilon > 0 there exists a delta > 0 such that x in D ∩ B(a; delta) implies |f(x) - f(a)| < epsilon.

3. The attempt at a solution

I don't have any attempts to write down here. I'm mainly looking for a push in the right direction. I've been staring at the definitions and can't see the easiest way to link them.
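One possible push in the right direction (a sketch only, and it assumes the mean value theorem is available): instantiate the continuity definition for f' at c with a specific epsilon, namely epsilon = f'(c)/2, so that f' stays strictly positive on a whole delta-interval around c.

```latex
% Continuity of f' at c with \epsilon = f'(c)/2 gives some \delta > 0:
|x - c| < \delta \;\implies\; |f'(x) - f'(c)| < \tfrac{f'(c)}{2}.

% Hence for every x in (c - \delta, c + \delta):
f'(x) > f'(c) - \tfrac{f'(c)}{2} = \tfrac{f'(c)}{2} > 0.

% Now take any c - \delta < x < y < c + \delta. By the mean value theorem
% there is some t \in (x, y) with
f(y) - f(x) = f'(t)\,(y - x) > 0,

% so f(x) < f(y), i.e. f is strictly increasing on (c - \delta, c + \delta).
```

The key link between the two definitions is choosing epsilon small enough (any epsilon with 0 < epsilon < f'(c) works) that positivity of f'(c) transfers to a neighborhood; the mean value theorem then converts "f' > 0 on an interval" into "strictly increasing on that interval."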