1. The problem statement, all variables and given/known data

If f is differentiable on an interval I and f' > 0 throughout I, except possibly at a single point where f' >= 0, then f is strictly increasing on I.

2. Relevant equations

3. The attempt at a solution

OK, here is what I have. Suppose first that f'(x) > 0 on all of I. Let a and b be two points in the interval with a < b. By the Mean Value Theorem there is some c in (a, b) with

f'(c) = (f(b) - f(a))/(b - a)

But f'(x) > 0 for all x in (a, b), so

(f(b) - f(a))/(b - a) > 0

and since b - a > 0 it follows that f(b) > f(a).
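The MVT step above, written out as a sketch in LaTeX (here c denotes the intermediate point the theorem provides):

```latex
\text{By the Mean Value Theorem, for } a < b \text{ in } I
\text{ there exists } c \in (a,b) \text{ such that}
\[
  f'(c) = \frac{f(b) - f(a)}{b - a}.
\]
\text{Since } f'(c) > 0 \text{ and } b - a > 0,
\[
  f(b) - f(a) = f'(c)\,(b - a) > 0,
\]
\text{so } f(b) > f(a).
```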

As you can see, I have proved that f is strictly increasing when f' > 0 on the whole interval, but I'm not sure what to do at the point where f' = 0. Any help would be much appreciated, as I have been told my proof is not fully correct.
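For intuition about the exceptional point, the standard example is f(x) = x^3: its derivative 3x^2 is positive everywhere except at x = 0, where it vanishes, yet the function is still strictly increasing. This is a numerical sanity check of that example, not a proof:

```python
# Sanity check (not a proof): f(x) = x^3 satisfies f'(x) = 3x^2 > 0
# everywhere except f'(0) = 0, yet f is strictly increasing on [-1, 1].
def f(x):
    return x ** 3

def f_prime(x):
    return 3 * x ** 2

# Sample points across [-1, 1], including the exceptional point x = 0.
xs = [i / 100 for i in range(-100, 101)]

# The derivative is >= 0 everywhere and vanishes only at x = 0.
assert all(f_prime(x) >= 0 for x in xs)
assert [x for x in xs if f_prime(x) == 0] == [0.0]

# The sampled function values are strictly increasing, even across 0.
assert all(f(a) < f(b) for a, b in zip(xs, xs[1:]))
print("x^3 is strictly increasing on [-1, 1] despite f'(0) = 0")
```

A common way to use this in the proof: split the interval at the exceptional point c, apply your MVT argument on each side, and use continuity of f at c to join the two pieces.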

**Physics Forums | Science Articles, Homework Help, Discussion**


# Homework Help: Analysis. Please check
