Use EVT and Fermat's Theorem to prove there is a c such that f'(c)=0

Summary
To prove that there exists a point c in (a, b) where f'(c) = 0, apply the Extreme Value Theorem (EVT) and Fermat's Theorem. Since f is differentiable on [a, b], it is continuous there, so EVT guarantees that f attains both a maximum and a minimum on [a, b]. By Fermat's Theorem, if an extremum occurs at a point that is not an endpoint, the derivative there must be zero. Because f'(a) < 0, f takes values smaller than f(a) just to the right of a, and because f'(b) > 0, f takes values smaller than f(b) just to the left of b; hence the minimum cannot occur at either endpoint. The minimum therefore occurs at some c in (a, b), and Fermat's Theorem gives f'(c) = 0.
NWeid1

Homework Statement


If f is differentiable on the interval [a,b] and f'(a)<0<f'(b), prove that there is a c with a<c<b for which f'(c)=0. (Hint: Use the Extreme Value Theorem and Fermat's Theorem.)


Homework Equations





The Attempt at a Solution


I feel like this should be an IVT problem, but I was having trouble with it, so my teacher hinted at using the EVT and Fermat's Theorem. We haven't really used them much, though, so I'm stuck :|
 
There are a number of "Fermat's theorems". I expect you mean the one that says that if f'(x) is not 0, then f cannot have an extreme value at x. The EVT says that a continuous function attains both a maximum and a minimum on a closed, bounded interval, and Fermat's theorem says that if such an extremum is not at an endpoint, and the derivative exists there, then that derivative must be 0. Since you are told that f is differentiable on this interval, all you need to show is that at least one of the maximum and minimum cannot be at an endpoint.
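A sketch of that last step, using nothing beyond the definition of the derivative: since
$$f'(a) = \lim_{h \to 0^+} \frac{f(a+h) - f(a)}{h} < 0,$$
the difference quotient is negative for all sufficiently small ##h > 0##, so ##f(a+h) < f(a)## for some such ##h##, and ##a## cannot give the minimum of ##f## on ##[a,b]##. Likewise,
$$f'(b) = \lim_{h \to 0^-} \frac{f(b+h) - f(b)}{h} > 0$$
means the quotient is positive for all sufficiently small ##h < 0##; since the denominator is negative there, ##f(b+h) < f(b)##, so ##b## cannot give the minimum either. By the EVT (differentiability implies continuity), the minimum exists on ##[a,b]## and must therefore occur at some ##c \in (a,b)##, and Fermat's theorem gives ##f'(c) = 0##.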

There is, by the way, another theorem (Darboux's theorem) which says that, while the derivative of a function is not necessarily continuous (which is why you cannot just apply the IVT to f'), f' still takes on every value between f'(a) and f'(b) on the interval [a, b]. That would make this problem immediate, but I suspect you have not had that theorem yet.
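For reference, Darboux's theorem stated formally:
$$f \text{ differentiable on } [a,b],\quad y \text{ between } f'(a) \text{ and } f'(b) \;\implies\; f'(c) = y \text{ for some } c \in (a,b).$$
Taking ##y = 0## with ##f'(a) < 0 < f'(b)## produces the desired ##c## in one line.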
 
