Use the EVT and Fermat's Theorem to prove there is a c such that f'(c) = 0

Thread starter: NWeid1
SUMMARY

The discussion focuses on proving the existence of a point \( c \) in the interval \( (a, b) \) such that \( f'(c) = 0 \), using the Extreme Value Theorem (EVT) and Fermat's Theorem. The EVT states that a continuous function on a closed interval attains both a maximum and a minimum value, while Fermat's Theorem asserts that at an interior extremum where the derivative exists, the derivative must be zero. The key conclusion is that since \( f \) is differentiable on \( [a, b] \) and \( f'(a) < 0 < f'(b) \), at least one extremum must occur at an interior point \( c \), where Fermat's Theorem gives \( f'(c) = 0 \).

PREREQUISITES
  • Understanding of the Extreme Value Theorem (EVT)
  • Familiarity with Fermat's Theorem regarding derivatives
  • Knowledge of differentiable functions and their properties
  • Basic concepts of calculus, particularly the Mean Value Theorem
NEXT STEPS
  • Study the formal proof of the Extreme Value Theorem (EVT)
  • Review Fermat's Theorem and its implications for critical points
  • Explore the Mean Value Theorem and its relationship to EVT
  • Practice problems involving differentiable functions and their extrema
USEFUL FOR

Students studying calculus, particularly those focusing on the properties of differentiable functions and the application of theorems related to extrema. This discussion is beneficial for anyone preparing for advanced mathematics or calculus examinations.

NWeid1

Homework Statement


If f is differentiable on the interval [a,b] and f'(a)<0<f'(b), prove that there is a c with a<c<b for which f'(c)=0. (Hint: Use the Extreme Value Theorem and Fermat's Theorem.)


Homework Equations





The Attempt at a Solution


I feel like this should be an IVT problem, but I was having trouble with it, so my teacher hinted at using the EVT and Fermat's Theorem. We haven't really used them much, though, so I'm stuck :|
 
There are a number of "Fermat's theorem"s. I expect you mean the one that says that if f'(x) is not 0, then f(x) cannot be an extreme value at x. The EVT says that a continuous function has both max and min values on a closed, bounded interval, and Fermat's theorem says that if such a point is not an endpoint, and the derivative exists there, then the derivative must be 0. Since you are told that f is differentiable on this interval, all you need to show is that at least one of the max and min cannot be at an endpoint.
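That endpoint step can be filled in like this (a sketch, using only the difference-quotient definition of the derivative). By the EVT, f attains its minimum on [a, b] at some point c.

```latex
% Minimum is not at a: the difference quotient is eventually negative.
f'(a) = \lim_{h \to 0^{+}} \frac{f(a+h) - f(a)}{h} < 0
\;\Longrightarrow\; f(a+h) < f(a) \text{ for all sufficiently small } h > 0.

% Minimum is not at b: the quotient is positive, but h < 0, so the numerator is negative.
f'(b) = \lim_{h \to 0^{-}} \frac{f(b+h) - f(b)}{h} > 0
\;\Longrightarrow\; f(b+h) < f(b) \text{ for all sufficiently small } h < 0.
```

So the minimum occurs at some \( c \in (a, b) \), and since f is differentiable there, Fermat's Theorem gives \( f'(c) = 0 \).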

There is, by the way, another theorem (Darboux's theorem) that says that, while the derivative of a function is not necessarily continuous (which is why you cannot just immediately apply the IVT to f'), it is still true that f'(x) takes on every value between f'(a) and f'(b) on the interval [a, b]. That would make this very easy, but I suspect you have not had that theorem yet.
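A concrete function (my own example, not from the thread) shows the conclusion in action: f(x) = x² − x on [0, 1] satisfies the hypotheses, and the minimum indeed lands at an interior point where f' vanishes.

```python
# Illustration of the EVT + Fermat argument on f(x) = x^2 - x over [0, 1].
# f'(x) = 2x - 1, so f'(0) = -1 < 0 < 1 = f'(1), matching the hypotheses.

def f(x):
    return x * x - x

def fprime(x):
    return 2 * x - 1

# Sample f on a fine grid and locate its minimum numerically.
n = 10_000
xs = [i / n for i in range(n + 1)]
c = min(xs, key=f)

print(fprime(0.0), fprime(1.0))  # -1.0 at a = 0, 1.0 at b = 1
print(c, fprime(c))              # minimum at c = 0.5, where f'(c) = 0.0
```

The minimum is not at either endpoint precisely because the derivative's signs force smaller values just inside the interval, which is the whole content of the proof.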
 
