Homework Help: Use EVT and Fermat's Theorem to prove there is a c such that f'(c)=0

1. Dec 4, 2011

NWeid1

1. The problem statement, all variables and given/known data
If f is differentiable on the interval [a,b] and f'(a)<0<f'(b), prove that there is a c with a<c<b for which f'(c)=0. (Hint: Use the Extreme Value Theorem and Fermat's Theorem.)

2. Relevant equations

3. The attempt at a solution
I feel like this should be an IVT problem, but I was having trouble with it, so my teacher hinted at using the EVT and Fermat's Theorem. We haven't really used them much, though, so I'm stuck :|

2. Dec 5, 2011

HallsofIvy

There are a number of theorems called "Fermat's theorem". I expect you mean the one that says that if f'(x) is not 0, then f(x) cannot be an extreme value at x. The EVT says that a continuous function attains both a maximum and a minimum value on a closed, bounded interval, and Fermat's theorem says that if such an extremum is not at an endpoint, and the derivative exists there, then the derivative must be 0. Since you are told that f is differentiable on this interval, all you need to show is that at least one of the max and min cannot occur at an endpoint.
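Concretely, the endpoint step the hint is pointing at can be sketched like this (filling in the details of why neither endpoint can be the minimum):

```latex
Since $f'(a) < 0$, we have
\[
  \lim_{x \to a^+} \frac{f(x) - f(a)}{x - a} < 0,
\]
so $f(x) < f(a)$ for $x$ slightly to the right of $a$; hence $a$ is not a
minimum of $f$ on $[a,b]$. Similarly, $f'(b) > 0$ gives $f(x) < f(b)$ for $x$
slightly to the left of $b$, so $b$ is not a minimum either. By the EVT, $f$
(differentiable, hence continuous) attains its minimum at some $c \in [a,b]$,
and by the above $a < c < b$. Fermat's theorem then gives $f'(c) = 0$.
```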

There is, by the way, another theorem that says that, while the derivative of a function is not necessarily continuous (which is why you cannot just apply the IVT directly), it is still true that f'(x) takes on every value between f'(a) and f'(b) on the interval [a, b]. That would make this problem very easy, but I suspect you have not had that theorem yet.
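For reference, that intermediate-value property of derivatives (Darboux's theorem) can be stated as follows, and the problem falls out at once:

```latex
\textbf{Darboux's theorem.} If $f$ is differentiable on $[a,b]$ and $y$ lies
strictly between $f'(a)$ and $f'(b)$, then there exists $c \in (a,b)$ with
$f'(c) = y$. Taking $y = 0$, the hypothesis $f'(a) < 0 < f'(b)$ immediately
yields a $c$ with $f'(c) = 0$.
```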