
## Homework Statement

Suppose y' = f'(x);

then y = f(x).

Let g(x) be any other integral of y' = f'(x)

That is, g'(x) = f'(x).

Now show that g(x) can differ from f(x) by at most a constant.

Then let w = f(x) - g(x), so that w' = f'(x) - g'(x) = 0.

Then w, as a function of x, must be a constant: w = constant.

Hence g(x) = f(x) + constant.
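
As a side note, the step from w' = 0 to w = constant is the one the book glosses over; the usual justification (my own addition, not part of the book's argument) uses the Mean Value Theorem:

```latex
% If w'(x) = 0 on an interval, pick any two points a < b in that interval.
% The Mean Value Theorem gives some c \in (a, b) with
%   w(b) - w(a) = w'(c)\,(b - a) = 0 \cdot (b - a) = 0,
% so w(b) = w(a) for every pair a, b; hence w is constant on the interval.
\[
  w(b) - w(a) = w'(c)\,(b - a) = 0 \cdot (b - a) = 0
  \quad\Longrightarrow\quad w(b) = w(a).
\]
```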

Are they saying that because

w' = f'(x) - g'(x) = 0,

i.e.

w' = 0,

then, because the slope of the curve is zero everywhere, it must be a line parallel to the x-axis... and that is what makes it a constant?

My real concern is: how did they get to assume that g'(x) equals f'(x)? Because if g(x) =/= f(x), then wouldn't the difference w be something more than just a constant...???
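
To see the claim concretely, here is a quick numerical sanity check (my own made-up pair of functions, not from the book): f(x) = x² + 3 and g(x) = x² − 5 have the same derivative 2x, and their difference is the constant 8 at every x.

```python
# Two antiderivatives of the same derivative differ only by a constant.
# f and g below both have derivative 2x; f(x) - g(x) = 8 everywhere.

def f(x):
    return x**2 + 3

def g(x):
    return x**2 - 5

def deriv(func, x, h=1e-6):
    # Central-difference approximation of func'(x).
    return (func(x + h) - func(x - h)) / (2 * h)

for x in [-2.0, 0.0, 1.5, 10.0]:
    assert abs(deriv(f, x) - deriv(g, x)) < 1e-4  # same slope at every x
    assert f(x) - g(x) == 8                       # difference is constant

print("f and g differ by the constant", f(0) - g(0))
```

So g'(x) = f'(x) is not an extra assumption; it is just the statement that g is another antiderivative of the same function, and the check above shows the difference collapses to a single constant.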