Understanding the Proof: Integrals and Constant Functions Explained

  • Thread starter: Miike012
  • Tags: Proof

Summary
The discussion centers on the relationship between two functions, f(x) and g(x), which are both integrals of the same derivative, f'(x). It is established that if g'(x) = f'(x), then g(x) can only differ from f(x) by a constant, as shown through the function w(x) defined as w'(x) = f'(x) - g'(x). Since w'(x) equals zero, it indicates that w(x) is a constant function. The participants clarify that this constant difference arises from the nature of integration, where the integral of a derivative yields the original function plus a constant. The proof effectively demonstrates that any two functions with the same derivative must differ by a constant value.
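In symbols, the thread's conclusion can be restated compactly as follows (a reference formulation, not part of the original posts): if f'(x) = g'(x) for every x in an interval, then w(x) = f(x) - g(x) satisfies w'(x) = 0 there, so w(x) = C for some constant C, and hence g(x) = f(x) - C, i.e. f and g differ only by a constant.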
Miike012

Homework Statement


y' = f'(x)
then y = f(x)

Let g(x) be any other integral of y' = f'(x)
That is, g'(x) = f'(x).

Now show that g(x) can differ from f(x) by at most a constant.

Now let w(x) = f(x) - g(x), so that w'(x) = f'(x) - g'(x) = 0.
Then w, as a function of x, must be a constant.
Hence g(x) = f(x) + constant.

Are they saying that because
w' = f'(x) - g'(x) = 0,
i.e. w' = 0,
the slope of the curve w(x) is zero everywhere, so its graph is a line parallel to the x-axis, and that is what makes it a constant?

My real concern is: how did they get to assume that g'(x) equals f'(x)? Because if g(x) ≠ f(x), then wouldn't the difference w' be more than just a constant?
 
I am not 100% sure of this, but here are my 2 cents.
Consider this: let f(x) = x + a (a being a real number); then f'(x) = 1. Now it is given that f'(x) = g'(x), so g'(x) = 1 as well (and the same reasoning works for the derivative of any other function). Seeing that \int f'(x) dx = f(x) + c, and the same applies to g'(x), there is always just a constant difference, i.e.

f(x) = x + 1 \Rightarrow f'(x) = 1 = g'(x) \Leftarrow g(x) = x + 4, or whatever.
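To make that example concrete, here is a quick symbolic check (my own sketch using SymPy, not from the original reply) that two functions with the same derivative differ only by a constant:

Code:
import sympy as sp

x = sp.symbols('x')

# Two different "integrals" of the same derivative, as in the example above.
f = x + 1
g = x + 4

# Both have derivative 1 ...
assert sp.diff(f, x) == 1 and sp.diff(g, x) == 1

# ... so their difference w(x) = f(x) - g(x) has zero derivative
# and simplifies to a constant (here -3).
w = sp.simplify(f - g)
print(w)              # -3
print(sp.diff(w, x))  # 0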
 
y' = f'(x) and y = f(x)
They state: let g(x) be any other integral of y' = f'(x)
This means g'(x) = y' = f'(x)
so g(x) = int(y')dx
g(x) = int(f'(x))dx
g(x) = f(x) + constant

To clarify your proof:

assign a new function w(x) such that its derivative is w'(x) = f'(x) - g'(x)
but f'(x) = g'(x) [stated above]
so w'(x) = 0
Now, since w(x) is in general a function of x and its derivative w'(x) = 0, in this case w(x) must be a constant.
Hence (integrating both sides and gathering up all constants)
constant = f(x) - g(x)
or f(x) = g(x) + constant
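
For completeness, here is the standard justification of the step from w'(x) = 0 to w(x) = constant, which the posts above assert but do not spell out. It uses the Mean Value Theorem:

If w'(x) = 0 for all x in an interval, then for any two points a < b in that interval the Mean Value Theorem gives some c between a and b with

w(b) - w(a) = w'(c)(b - a) = 0,

so w(a) = w(b). Since this holds for every pair of points, w is constant on the interval. Applied to w(x) = f(x) - g(x), this gives f(x) - g(x) = constant, which is exactly the claim.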
 