Understanding the Proof: Integrals and Constant Functions Explained

  • Thread starter: Miike012
  • Tags: Proof
SUMMARY

The discussion centers on the relationship between integrals and constant functions, specifically the equation y' = f'(x) and its implications. It establishes that if g(x) is another integral of y' = f'(x), then g(x) can differ from f(x) by at most a constant. The proof hinges on the fact that if w' = f'(x) - g'(x) = 0, then w must be a constant function, leading to the conclusion that g(x) = f(x) + constant. This confirms that if two functions have equal derivatives, they differ only by a constant.
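In symbols (my restatement of the result being discussed, which is a standard corollary of the Mean Value Theorem):

\text{If } f'(x) = g'(x) \text{ for every } x \text{ in an interval } I, \text{ then there is a constant } C \text{ such that } g(x) = f(x) + C \text{ on } I.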

PREREQUISITES
  • Understanding of basic calculus concepts, specifically derivatives and integrals.
  • Familiarity with the Fundamental Theorem of Calculus.
  • Knowledge of function notation and properties of constant functions.
  • Ability to manipulate and solve equations involving derivatives.
NEXT STEPS
  • Study the Fundamental Theorem of Calculus in detail.
  • Explore the concept of antiderivatives and their properties.
  • Learn about the implications of constant differences in calculus.
  • Investigate examples of functions that differ by a constant and their graphical representations.
USEFUL FOR

Students of calculus, mathematics educators, and anyone seeking to deepen their understanding of integrals and the behavior of functions related to their derivatives.

Miike012

Homework Statement


If y' = f'(x), then y = f(x) is one integral of this equation.

Let g(x) be any other integral of y' = f'(x)
That is, g'(x) = f'(x).

Now show that g(x) can differ from f(x) by at most a constant.

Now let w = f(x) - g(x), so that w' = f'(x) - g'(x) = 0.
Then w, as a function of x, must be a constant.
Hence g(x) = f(x) + constant.
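A quick concrete illustration of this argument (my own example, not from the textbook):

f(x) = x^2 + 1, \quad g(x) = x^2 - 4 \;\Rightarrow\; f'(x) = g'(x) = 2x,
w = f(x) - g(x) = 5, \quad w' = 0, \quad \text{so } g(x) = f(x) - 5, \text{ a constant difference.}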

Are they saying that because w' = f'(x) - g'(x) = 0, i.e. w' = 0, the slope of the curve w is always zero, so its graph is a line parallel to the x-axis, and that is what makes w a constant?

My real concern is: how did they assume that g'(x) equals f'(x)? Because if g(x) ≠ f(x), then wouldn't the difference w' be something greater than a constant?
 
I am not 100% sure of this, but here are my two cents.
Consider this: if f(x) = x + a (with a a real number), then f'(x) = 1. Now it is given that f'(x) = g'(x), so g'(x) = 1 as well (or whatever the derivative of the given function happens to be). Seeing that \int f'(x)\,dx = f(x) + c, and the same applies for g'(x), there is always just a single constant difference, i.e.

f(x) = x + 1 \Rightarrow f'(x) = 1 = g'(x) \Leftarrow g(x) = x + 4, or whatever.
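Spelling that last step out (my reading of the argument above), with each indefinite integral carrying its own constant:

\int f'(x)\,dx = f(x) + C_1, \qquad \int g'(x)\,dx = g(x) + C_2,
\text{and since } f'(x) = g'(x), \text{ both integrals give the same family of functions, so } g(x) = f(x) + (C_1 - C_2).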
 
y' = f'(x) and y = f(x)
They state: let g(x) be any other integral of y' = f'(x)
This means g'(x) = y' = f'(x)
so g(x) = \int y'\,dx
g(x) = \int f'(x)\,dx
g(x) = f(x) + constant

To clarify your proof:

assign a new function w(x) such that its derivative w'(x) = f'(x) - g'(x)
but f'(x) = g'(x) [stated above]
so w'(x) = 0
Now since w(x) is in general a function of x, and its derivative w'(x) = 0, this must mean that in this case w(x) = constant
Hence (integrating both sides and gathering up all constants)
constant = f(x) - g(x)
or f(x) = g(x) + constant
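For completeness, the step "w'(x) = 0 implies w(x) is constant" is usually justified with the Mean Value Theorem (this is the standard argument, not spelled out in the thread):

\text{For any two points } a < b \text{ in the interval, the MVT gives some } c \in (a, b) \text{ with}
w(b) - w(a) = w'(c)(b - a) = 0 \cdot (b - a) = 0,
\text{so } w(a) = w(b) \text{ for every such pair, i.e. } w \text{ is constant on the interval.}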
 
