Proof of f(b)<g(b) Using the Mean Value Theorem


Homework Help Overview

The discussion revolves around proving that f(b) < g(b) using the Mean Value Theorem, given that f and g are continuous and differentiable functions on the interval [a, b], with f(a) = g(a) and f'(x) < g'(x) for a < x < b.

Discussion Character

  • Exploratory, assumption-checking

Approaches and Questions Raised

  • Participants discuss applying the Mean Value Theorem to the function h = f - g and explore the implications of the derivative conditions. There is a focus on the relationship between f(b) and g(b) based on the established inequalities and the continuity of the functions.

Discussion Status

Some participants have confirmed the application of the Mean Value Theorem and the resulting inequality. Others are questioning the assumptions regarding the interval and the implications of the inequality b - a > 0, leading to further exploration of the conditions under which the theorem applies.

Contextual Notes

Participants are navigating the implications of the Mean Value Theorem and the conditions of continuity and differentiability, while also addressing potential misunderstandings about the interval endpoints and their implications for the inequality.

tempneff
1. Suppose that f and g are continuous on [a,b] and differentiable on (a,b). Suppose also that f(a)=g(a) and f '(x)<g '(x) for a<x<b. Prove that f(b)<g(b). [Hint: Apply the Mean Value Theorem to the function h=f-g].
2. [f(b) - f(a)] / (b - a) = f'(c)
3.
I know:
If h(x) = f(x) - g(x) then
h(a) = f(a) - g(a) = 0 and
h is continuous on [a,b] differentiable on (a,b) so Mean Value Theorem applies
[h(b) - h(a)] / (b - a) = h'(c) for some c in (a,b). Therefore
[(f(b) - g(b)) - 0] / (b - a) = h'(c) and
h'(c)=f'(c) - g'(c) which is < 0 because f '(x)<g '(x) for a<x<b.
So [(f(b) - g(b)) - 0] / (b - a) < 0
Now...
I believe that f(b) - g(b) < 0 but I can't prove it. Any tips
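The work above can be sanity-checked numerically. As a hypothetical example (not part of the proof), take f(x) = sin(x) and g(x) = x on [a, b] = [0, 1]: then f(0) = g(0) = 0 and f'(x) = cos(x) < 1 = g'(x) for 0 < x < 1, so the hypotheses hold and the conclusion f(1) < g(1) should follow.

```python
import math

# Hypothetical example satisfying the hypotheses on [a, b] = [0, 1]:
# f(x) = sin(x), g(x) = x, so f(0) = g(0) = 0 and
# f'(x) = cos(x) < 1 = g'(x) for 0 < x < 1.
a, b = 0.0, 1.0
f = math.sin
g = lambda x: x

# The MVT quotient for h = f - g over [a, b]:
# [h(b) - h(a)] / (b - a), which the argument says must be negative.
h_quotient = ((f(b) - g(b)) - (f(a) - g(a))) / (b - a)

print(h_quotient < 0)  # the quotient is negative, as derived above
print(f(b) < g(b))     # and indeed f(b) < g(b), i.e. sin(1) < 1
```

This only illustrates one instance, of course; the proof itself has to come from the sign argument in the thread.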
 
Everything is correct. So you've got

[f(b) - g(b)] / (b - a) < 0

Now you need that f(b)-g(b)<0. You know that b-a>0...
 
I know I am missing something very simple, but how can I say that b - a > 0 just because a < x < b??
What if b = -1 and a = -2? Then b - a < 0?? I know it is something so insignificant...
 
If b=-1 and a=-2, then b-a=-1-(-2)=-1+2=1>0.
If b>a, then it is always true that b-a>0. Just add -a to both sides...
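Written out, the final step the hint is pointing at goes like this (a sketch, combining the two facts already established in the thread):

```latex
\frac{f(b)-g(b)}{b-a} < 0 \quad\text{and}\quad b-a > 0
\;\Longrightarrow\;
f(b)-g(b) = \underbrace{\frac{f(b)-g(b)}{b-a}}_{<\,0}\cdot\underbrace{(b-a)}_{>\,0} < 0
\;\Longrightarrow\; f(b) < g(b).
```

Multiplying a negative quantity by a positive one keeps the sign negative, which is exactly the conclusion f(b) < g(b).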
 
I am a moron.
 