Proof of (f'(b)+f'(a))/2+f(a) = f(b) for |b-a| = 1

SUMMARY

The discussion centers on proving the relation (f'(b)+f'(a))/2 + f(a) = f(b) for the case |b-a| = 1. The underlying problem asks for f(3) given that f'(x) = 2x and f(2) = 3, which leads to f(x) = x^2 - 1 and so f(3) = 8. The class's shortcut of averaging the endpoint derivatives happens to give the right answer here, but the relation does not hold in general: it fails for non-quadratic functions such as y = x^3. It can, however, be validated for all quadratic functions of the form f(x) = ax^2 + bx + c.

PREREQUISITES
  • Understanding of calculus, specifically derivatives and integrals
  • Familiarity with quadratic functions and their properties
  • Knowledge of the Mean Value Theorem in calculus
  • Ability to compute antiderivatives
NEXT STEPS
  • Study the Mean Value Theorem and its applications in calculus
  • Explore the properties of quadratic functions and their derivatives
  • Learn how to compute antiderivatives for various functions
  • Investigate the behavior of polynomial functions beyond quadratics, such as cubic functions
USEFUL FOR

Students and educators in calculus, mathematicians interested in function properties, and anyone looking to deepen their understanding of derivatives and integrals in relation to function behavior.

computerex
We were given a problem:
"The slope of f(x) at point x is twice the x value. f(2) = 3. Find f(3)."

I did this the conventional way:

[tex]f(x) = \int 2x\,dx = x^2 + c[/tex]

Solving with the initial condition f(2) = 3 gives me f(x) = x^2 - 1. So f(3) = 8.
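
Explicitly, the initial condition fixes the constant:

[tex]f(2) = 2^2 + c = 3 \implies c = -1[/tex]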

My class did it a different way: they found f'(3) and f'(2), took the average, and added that to the initial value 3 to get 8.
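
In numbers, since f'(x) = 2x, their computation was:

[tex]\frac{f'(3)+f'(2)}{2} + f(2) = \frac{6+4}{2} + 3 = 8[/tex]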

So can someone prove:
[tex]\frac{f'(b)+f'(a)}{2} + f(a) = f(b)[/tex] given |b - a| = 1?
 
Mentallic
Well, no, because it's not true in general. Try it for, say, y = x^3.
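
For instance, with f(x) = x^3, a = 0, and b = 1 (so |b - a| = 1):

[tex]\frac{f'(1)+f'(0)}{2} + f(0) = \frac{3+0}{2} + 0 = \frac{3}{2} \neq 1 = f(1)[/tex]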

It can be proven for all quadratics, though. Use the fact that if |b - a| = 1, then, taking b = a + 1, your two function values are of the form f(a) and f(b) = f(a + 1). (Strictly, the relation needs b = a + 1; with b = a - 1 the sign of the averaged-derivative term flips.)

So just prove for all quadratics f(x) = ax^2 + bx + c:

[tex]f'(x+1)+f'(x)=2(f(x+1)-f(x))[/tex]
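
Carrying the algebra through (a quick sketch; here a, b, c are the quadratic's coefficients, not the endpoints):

[tex]f'(x+1) + f'(x) = \bigl(2a(x+1) + b\bigr) + \bigl(2ax + b\bigr) = 4ax + 2a + 2b[/tex]

[tex]2\bigl(f(x+1) - f(x)\bigr) = 2\bigl(a(2x+1) + b\bigr) = 4ax + 2a + 2b[/tex]

Dividing by 2 and rearranging gives (f'(x+1) + f'(x))/2 + f(x) = f(x+1), which is the claimed relation with endpoints x and x + 1.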
 
Mentallic said:
Well, no, because it's not true in general. Try it for, say, y = x^3.

It can be proven for all quadratics, though. Use the fact that if |b - a| = 1, then, taking b = a + 1, your two function values are of the form f(a) and f(b) = f(a + 1). (Strictly, the relation needs b = a + 1; with b = a - 1 the sign of the averaged-derivative term flips.)

So just prove for all quadratics f(x) = ax^2 + bx + c:

[tex]f'(x+1)+f'(x)=2(f(x+1)-f(x))[/tex]

That's what I thought, but I wasn't sure.

I would go the way you did, OP. I would just find the antiderivative and go from there, which is exactly what you did.
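
For anyone who wants a quick numerical sanity check of both claims above, here is a minimal sketch in Python (the helper name `check` is mine, not from the thread):

[code]
# Evaluate both sides of (f'(b) + f'(a))/2 + f(a) = f(b) with b = a + 1.
def check(f, fprime, a):
    b = a + 1
    lhs = (fprime(b) + fprime(a)) / 2 + f(a)
    return lhs, f(b)

# Quadratic from the thread: f(x) = x^2 - 1, f'(x) = 2x. Both sides give 8.
print(check(lambda x: x**2 - 1, lambda x: 2 * x, 2))  # (8.0, 8)

# Cubic counterexample: f(x) = x^3, f'(x) = 3x^2. Sides differ (1.5 vs 1).
print(check(lambda x: x**3, lambda x: 3 * x**2, 0))   # (1.5, 1)
[/code]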
 
