# Help me understand this proof.

## Homework Statement

y' = f'(x)
then y = f(x)

Let g(x) be any other integral of y' = f'(x)
That is, g'(x) = f'(x).

Now show that g(x) can differ from f(x) by at most a constant.

Then let w = f(x) - g(x), so that w' = f'(x) - g'(x) = 0.
Then w as a function of x must be w = constant.
Hence g(x) = f(x) + constant.
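The unstated justification for the step "w' = 0 implies w is constant" is the Mean Value Theorem; a sketch of that argument:

```latex
\text{Let } w(x) = f(x) - g(x), \text{ so } w'(x) = 0 \text{ for every } x.
\text{For any two points } a < b, \text{ the Mean Value Theorem gives some } c \in (a, b) \text{ with}
w(b) - w(a) = w'(c)\,(b - a) = 0,
\text{so } w(b) = w(a). \text{ Since } a \text{ and } b \text{ were arbitrary, } w \text{ is constant.}
```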

Are they saying that because
w' = f'(x) - g'(x) = 0,
i.e.
w' = 0,
then since the slope of the curve w is zero everywhere, its graph is a line parallel to the x-axis... and that makes it a constant?

My real concern is: how did they assume that g'(x) equals f'(x)? Because if g(x) ≠ f(x), then wouldn't the difference w be more than just a constant...???

I am not 100% sure of this, but here are my 2 cents.
Consider this: f(x) = x + a (a being a real number), then f'(x) = 1. Now it is given that f'(x) = g'(x), so g'(x) = 1 (or whatever the derivative of the given function is). Now,
seeing that $$\int f'(x)\,dx = f(x) + c$$, and the same applies for g'(x), there is always just a single constant of difference, i.e.

f(x) = x + 1 $$\Rightarrow$$ f'(x) = 1 = g'(x) $$\Leftarrow$$ g(x) = x + 4, or whatever.
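A quick symbolic check of this example (a minimal sketch using SymPy; the specific functions x + 1 and x + 4 are just the ones above):

```python
import sympy as sp

x = sp.symbols('x')
f = x + 1
g = x + 4

# Both functions are integrals of the same y' = 1:
assert sp.diff(f, x) == sp.diff(g, x) == 1

# and their difference is the constant 3, independent of x:
print(sp.simplify(g - f))  # -> 3
```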

y' = f'(x) and y = f(x).
They state: let g(x) be any other integral of y' = f'(x).
This means g'(x) = y' = f'(x),
so $$g(x) = \int y'\,dx = \int f'(x)\,dx = f(x) + \text{constant}.$$
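The same conclusion can be seen numerically: integrating a single f'(x) with two different integration constants gives two "integrals" whose difference is the same constant at every x. (The choice f'(x) = 2x below is an assumed example, not from the thread.)

```python
import numpy as np

x = np.linspace(0.0, 2.0, 2001)
fprime = 2 * x  # derivative of f(x) = x**2 (plus any constant)

# Trapezoid-rule cumulative integral: approximates int_0^x f'(t) dt.
dx = x[1] - x[0]
antideriv = np.concatenate(([0.0], np.cumsum((fprime[:-1] + fprime[1:]) / 2 * dx)))

f = antideriv + 1.0  # one integral of f'
g = antideriv + 4.0  # "any other" integral of f'

diff = g - f
print(diff.min(), diff.max())  # both are (up to rounding) 3.0: the gap is constant everywhere
```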