Mean Value Theorem: Proof & Claim

SUMMARY

The discussion centers on the Mean Value Theorem (MVT) and its use in showing that a three-times-differentiable function ##f## with ##f''' = 0## on an open interval ##I## must have the form ##f(x) = ax^2 + bx + c##. The proof begins by establishing that there are constants ##a## and ##b## such that ##f'(x) = ax + b## for all ##x \in I##. The participants explore the implications of this result, including how fixing a point ##x_0## simplifies the argument and the fact that if ##f' = g'##, then ##f## and ##g## differ by a constant. The final proof confirms that ##f(x) = ax^2 + bx + c## holds under the stated conditions.

PREREQUISITES
  • Understanding of the Mean Value Theorem (MVT)
  • Knowledge of derivatives and their properties
  • Familiarity with polynomial functions and their forms
  • Basic concepts of integration and constant functions
NEXT STEPS
  • Study the implications of the Mean Value Theorem in calculus
  • Learn about the relationship between derivatives and integrals
  • Explore polynomial function properties and their applications
  • Investigate the concept of constant functions in the context of differentiation
USEFUL FOR

Students of calculus, readers interested in real analysis, and educators teaching the Mean Value Theorem and its consequences.

fishturtle1
Homework Statement
a) Suppose ##f## is twice differentiable on an open interval ##I## and ##f''(x) = 0## for all ##x \in I##. Show ##f## has the form ##f(x) = ax + b## for suitable constants ##a## and ##b##.

b) Suppose ##f## is three times differentiable on an open interval ##I## and ##f''' = 0## on ##I##. What form does ##f## have? Prove your claim.
Relevant Equations
Mean Value theorem: Let ##f## be a continuous function on ##[a, b]## that is differentiable on ##(a, b)##. Then there exists [at least one] ##x## in ##(a, b)## such that:

$$f'(x) = \frac{f(a) - f(b)}{a - b}$$


Theorem: Let ##f## be a differentiable function on ##(a,b)## such that ##f'(x) = 0## for all ##x \in (a,b)##. Then ##f## is a constant function on ##(a,b)##.
a) Proof: Since ##f'' = (f')' = 0## on ##I##, the theorem above applied to ##f'## gives an ##a \in \mathbb{R}## such that ##f'(x) = a## for all ##x \in I##. Let ##x, y \in I## with ##x \neq y##. Then, by the Mean Value Theorem,

$$a = \frac{f(x) - f(y)}{x - y}$$

This can be rewritten as ##f(x) = ax - ay + f(y)##. Now, let ##g(y) = -ay + f(y)##. Then ##g'(y) = \lim_{t \rightarrow y} \frac{g(t) - g(y)}{t - y} = \lim_{t \rightarrow y}\frac{-at + f(t) + ay - f(y)}{t - y} = \lim_{t\rightarrow y}\frac{-at + ay}{t - y} + \lim_{t\rightarrow y}\frac{f(t) - f(y)}{t - y} = -a + a = 0##. So ##g## is constant on ##I##, i.e. there exists ##b \in \mathbb{R}## such that ##g(y) = b## for all ##y \in I##. We can conclude ##f(x) = ax + b##. ##\square##
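
(Not part of the proof, just a quick numerical sanity check with a concrete ##f## of my own choosing, ##f(x) = 3x + 5##, so that ##f'' = 0## and the argument should produce ##a = 3## and ##b = 5##:)

```python
# Sanity check of the part a) argument with f(x) = 3x + 5 (so f'' = 0).
def f(x):
    return 3 * x + 5

a = 3  # the constant value of f' guaranteed by the zero-derivative theorem

# g(y) = -a*y + f(y) should be constant on I, and that constant should be b = 5.
for y in [0.5, 2.0, 7.3, 9.9]:
    print(y, -a * y + f(y))  # prints 5.0 every time

# Then f(x) = a*x + b for every x:
b = 5
print(all(abs(f(x) - (a * x + b)) < 1e-12 for x in [0.1, 1.0, 4.2, 8.8]))  # True
```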

b) I claim that there are constants ##a, b, c \in \mathbb{R}## such that for all ##x \in I##: ##f(x) = ax^2 + bx + c##

Proof: Since ##(f')'' = f''' = 0## on ##I##, part a) applied to ##f'## gives ##a, b \in \mathbb{R}## such that ##f'(x) = ax + b## for all ##x \in I##. Let ##x, y \in I## with ##x \neq y##. By the MVT there exists ##t \in I## such that $$at + b = \frac{f(x) - f(y)}{x - y}$$

This can be rewritten as ##f(x) = (at + b)x - y(at + b) + f(y)##. Let ##g(y) = -y(at + b) + f(y)##. Then ##g'(y) = -(at + b) + (ay + b) = a(y - t)##.

Did I use the MVT incorrectly?
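
(For reference, a symbolic check of the derivative computation above, as a SymPy sketch that substitutes ##f'(y) = ay + b## from part a). It confirms ##g'(y) = a(y - t)##, which is not identically zero, so this particular ##g## is indeed not constant:)

```python
import sympy as sp

y, t, a, b = sp.symbols('y t a b')
F = sp.Function('f')

# g(y) = -y*(a*t + b) + f(y), as defined in the attempt above
g = -y * (a * t + b) + F(y)

# Differentiate, then substitute f'(y) = a*y + b (known from part a)
gp = sp.diff(g, y).subs(sp.Derivative(F(y), y), a * y + b)
print(sp.factor(gp))  # a*(y - t), i.e. not identically zero
```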
 
A couple of points. It might have been simpler to take some fixed ##x_0 \in I##, rather than a second variable ##y##.

I can't see the conclusion to part b).
 
PeroK said:
A couple of points. It might have been simpler to take some fixed ##x_0 \in I##, rather than a second variable ##y##.

I can't see the conclusion to part b).
Thanks for the reply.

Where would I be able to fix ##x_0##? It seems that for the MVT, we choose ##x, y \in I## and are then guaranteed a ##c \in I## such that ##f'(c) = \frac{f(x) - f(y)}{x - y}##?

Also, for part b) I meant that I'm not sure how to continue. It seems ##g'(y) \neq 0##, so ##g(y)## is not constant (and I was expecting ##g(y)## to be constant). Also, I'm not sure what to do with ##(at + b)x = atx + bx##. It seems it would be easier if ##t = x##, but I'm not sure how to get that.

I thought to use definition of derivative: ##\lim_{y \rightarrow x} \frac{f(y) - f(x)}{y - x} = f'(x) = ax + b## but haven't made progress with this either.
 
I would have taken ##x_0 \in I## and then, from the MVT, shown that for all ##x \in I## with ##x \neq x_0##:

##a = \frac{f(x) - f(x_0)}{x - x_0}##
 
PeroK said:
I would have taken ##x_0 \in I## and then, from the MVT, shown that for all ##x \in I## with ##x \neq x_0##:

##a = \frac{f(x) - f(x_0)}{x - x_0}##
Oh, OK, so it would be something like this? Fix ##x_0 \in I## and let ##x## be any other element of ##I##. By the MVT,
$$a = \frac{f(x) - f(x_0)}{x - x_0}$$
$$a(x - x_0) + f(x_0) = f(x)$$
$$ax - ax_0 + f(x_0) = f(x)$$
$$ax + b = f(x)$$
where ##b = -ax_0 + f(x_0)## is a constant, since ##a##, ##x_0##, and ##f(x_0)## are constants. ##\square##

So this solves my problem with ##g## not being constant in part b.
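
(Just as a sanity check, not part of the argument: with the same concrete ##f(x) = 3x + 5## as before and an arbitrary choice of ##x_0##, the formulas above recover ##a = 3## and ##b = 5## for every ##x##:)

```python
# Sanity check of the fixed-x0 version with f(x) = 3x + 5 and x0 = 2.
def f(x):
    return 3 * x + 5

x0 = 2.0
for x in [0.5, 1.0, 4.2, 8.8]:            # any x != x0 in I
    a = (f(x) - f(x0)) / (x - x0)          # the MVT quotient; equals 3 each time
    b = -a * x0 + f(x0)                    # the constant b from the post above
    print(a, b, abs(f(x) - (a * x + b)) < 1e-12)  # 3.0 5.0 True
```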
 
fishturtle1 said:
Oh, OK, so it would be something like this? Fix ##x_0 \in I## and let ##x## be any other element of ##I##. By the MVT,
$$a = \frac{f(x) - f(x_0)}{x - x_0}$$
$$a(x - x_0) + f(x_0) = f(x)$$
$$ax - ax_0 + f(x_0) = f(x)$$
$$ax + b = f(x)$$
where ##b = -ax_0 + f(x_0)## is a constant, since ##a##, ##x_0##, and ##f(x_0)## are constants. ##\square##

So this solves my problem with ##g## not being constant in part b.

To finish off part b) you might have to use more than the MVT. I don't immediately see a way to complete it using the same technique as you used for part a).
 
Hint. If ##f'(x) = g'(x)##, what can you say about ##(f-g)(x)##?
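
(A tiny concrete instance of the hint, with two functions chosen purely for illustration:)

```python
import sympy as sp

x = sp.symbols('x')
f = x**2 + 3*x + 7   # two functions with the same derivative
g = x**2 + 3*x

print(sp.diff(f - g, x))  # 0 -> (f - g)' = 0 on the interval
print(f - g)              # 7 -> f and g differ by a constant
```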
 
PeroK said:
Hint. If ##f'(x) = g'(x)##, what can you say about ##(f-g)(x)##?
We can say ##(f-g)(x)## is constant.
 
fishturtle1 said:
We can say ##(f-g)(x)## is constant.
Okay. So, if you could find a single function for which ##f'''(x) = 0##, then you can find them all?

I'm going offline now.
 
Here is how I would approach the problem:

Prove the following claim:

Let ##f,g: I \to \mathbb{R}## be differentiable functions where ##I## is an interval. If ##f'=g'## on ##I##, then there is a constant ##c## such that ##f=g+c##.

Proof: We have ##(f-g)'=0## and by your claim ##f-g## is constant. ##\quad \square##

Alright, let's see how this helps.

First, you have ##0'=0=f''=(f')'##, so by the claim there is a constant ##a## such that ##f'=a##. Note that for all ##x\in I## we have ##(ax)'= a =f'(x)##, so applying the claim a second time, you get that there is a constant ##b## such that ##ax+b=f(x)## for all ##x\in I##.

You can do the other question by applying this claim three times. Let me know what you get.

EDIT: basically, the exercise is a good preparation for what you will see soon: indefinite integration.
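
(For comparison, and certainly not needed for the exercise: SymPy's ODE solver gives the same forms for these two equations, which is the indefinite-integration point of view mentioned in the edit. A sketch, assuming SymPy is available:)

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')

# Part a): f'' = 0 on an interval
print(sp.dsolve(sp.Eq(f(x).diff(x, 2), 0), f(x)))
# Eq(f(x), C1 + C2*x)            -> f(x) = ax + b

# Part b): f''' = 0 on an interval
print(sp.dsolve(sp.Eq(f(x).diff(x, 3), 0), f(x)))
# Eq(f(x), C1 + C2*x + C3*x**2)  -> f(x) = ax^2 + bx + c
```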
 
PeroK said:
Okay. So, if you could find a single function for which ##f'''(x) = 0##, then you can find them all?
What I said was imprecise. I should have said that if you can find a function for which ##f'(x) = ax + b##, then you can find them all.
 
Math_QED said:
Here is how I would approach the problem:

Prove the following claim:

Let ##f,g: I \to \mathbb{R}## be differentiable functions where ##I## is an interval. If ##f'=g'## on ##I##, then there is a constant ##c## such that ##f=g+c##.

Proof: We have ##(f-g)'=0## and by your claim ##f-g## is constant. ##\quad \square##

Alright, let's see how this helps.

First, you have ##0'=0=f''=(f')'##, so by the claim there is a constant ##a## such that ##f'=a##. Note that for all ##x\in I## we have ##(ax)'= a =f'(x)##, so applying the claim a second time, you get that there is a constant ##b## such that ##ax+b=f(x)## for all ##x\in I##.

You can do the other question by applying this claim three times. Let me know what you get.

EDIT: basically, the exercise is a good preparation for what you will see soon: indefinite integration.

Proof: We have ##(f'')' = f''' = 0 = 0'##, so by the claim there exists a constant, which I write as ##2a## for convenience, such that ##f'' = 2a##. Next, ##(f')' = f'' = 2a = (2ax)'##, so by the claim there exists ##b \in \mathbb{R}## such that ##f' = 2ax + b##. Finally, ##f' = 2ax + b = (ax^2 + bx)'##, so by the claim there exists ##c \in \mathbb{R}## such that ##f - (ax^2 + bx) = c##. This can be rewritten as ##f(x) = ax^2 + bx + c##. ##\square##
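
(A quick symbolic check of the identities used in the proof, as a SymPy sketch:)

```python
import sympy as sp

x, a, b, c = sp.symbols('x a b c')

print(sp.diff(2*a*x, x))                # 2*a        -> matches f'' = 2a
print(sp.diff(a*x**2 + b*x, x))         # 2*a*x + b  -> matches f' = 2ax + b
print(sp.diff(a*x**2 + b*x + c, x, 3))  # 0          -> the claimed form satisfies f''' = 0
```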
 
