Primitives: proof based on theorems for differentiation

c.teixeira
Hi there!

If one wants to prove this identity for the indefinite integral:

##\int[f(x)+g(x)]\,dx = \int f(x)\,dx + \int g(x)\,dx##

Would this be appropriate? Define:

##A(x) = \int[f(x)+g(x)]\,dx;##
##B(x) = \int f(x)\,dx;##
##C(x) = \int g(x)\,dx.##

And since a primitive of a function is another function whose derivative is the original function:

##A'(x) = f(x) + g(x);##
##B'(x) + C'(x) = (B + C)'(x) = f(x) + g(x).##

By the Mean Value Theorem this would imply ##A(x) = B(x) + C(x) + K##.
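Spelled out (filling in the step, assuming the standard corollary of the Mean Value Theorem that a function with zero derivative on an interval is constant):

```latex
\bigl(A - (B+C)\bigr)'(x) = A'(x) - \bigl(B'(x) + C'(x)\bigr)
  = \bigl(f(x) + g(x)\bigr) - \bigl(f(x) + g(x)\bigr) = 0,
```

so ##A - (B+C)## is constant, i.e. ##A(x) = B(x) + C(x) + K## for some constant ##K##.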

Is this an appropriate proof?

If so, how do we know that ##K = 0##? There is no ##K## in ##\int[f(x)+g(x)]\,dx = \int f(x)\,dx + \int g(x)\,dx##.

Regards,
 
c.teixeira said:
If so, how do we know that ##K = 0##?

##K## is not zero in general. The correct approach is to define ##A(x) + k_A = \int [ f(x)+g(x) ]\, dx## for some constant ##k_A##, and similarly ##k_B## and ##k_C##; when you differentiate, the constants vanish. You want to show ##k_A = k_B + k_C##.
 
pwsnafu said:
##K## is not zero in general. The correct approach is to define ##A(x) + k_A = \int [ f(x)+g(x) ]\, dx## for some constant ##k_A##, and similarly ##k_B## and ##k_C##; when you differentiate, the constants vanish. You want to show ##k_A = k_B + k_C##.

So, how can I prove that? And why would I want to show that ##k_A = k_B + k_C##?

I don't understand. If my explanation is correct, and ##K## is generally not ##0##, as you said, wouldn't that make it ##\int[f(x) + g(x)]\,dx = \int f(x)\,dx + \int g(x)\,dx + K## instead?

What I understand from the above expression is that it is valid for any constant ##K##, whether ##0## or not. So basically the ##K## doesn't make much difference, whether it is ##0## or otherwise. Am I right or wrong?

Thank you,
 
Also,

I believe that setting ##A(x) + K_{A} = \int[f(x)+g(x)]\,dx## and then showing that ##K_{A} = K_{B} + K_{C}## is the same thing as defining ##A(x) = \int[f(x)+g(x)]\,dx## and then showing that ##K = 0##.

Spivak says that concern for these constants is merely an annoyance, but not knowing exactly why ##\int[f(x)+g(x)]\,dx = \int f(x)\,dx + \int g(x)\,dx## holds without the constant ##C## is bothering me.

Regards,
 
c.teixeira said:
Spivak says that concern for these constants is merely an annoyance,

Introductory texts trivialise this because students are not interested in technicalities, but it's not a mere annoyance. Consider two variables: then ##\int f(x,y) \, dx = F(x,y) + g(y)## for some non-constant function ##g##. So it's not trivial.
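For a concrete instance (an illustrative choice, not one from the thread): take ##f(x,y) = 2xy##; then

```latex
\int 2xy \,dx = x^2 y + g(y),
\qquad\text{since}\qquad
\frac{\partial}{\partial x}\bigl(x^2 y + g(y)\bigr) = 2xy
\quad\text{for any function } g(y).
```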

Here's the rundown: the derivative is a linear operator ##D : C^1(U) \rightarrow \mathbb{R}^U## for some open set ##U \subset \mathbb{R}##. However, it is not injective, because if two functions differ by a constant they have the same derivative. It is not surjective, because some functions have no antiderivative.

Let ##I(U)## be the set of integrable functions on ##U##. I'm not going to explain what this space looks like, just accept it exists. The indefinite integral is then an operator ##\int \, \cdot \, dx : I(U) \rightarrow C^1(U)/\sim## where the equivalence relation is given by ##f \sim g \iff f - g## is constant on ##U##.

In this space ##A = A + k## for any constant ##k##, so notice that equality in this space is not the same equality you were working with. If you have done modular arithmetic, this is easy: you are working modulo constants. Then the proof is trivial, because any constant is equivalent to zero:
##k_A \equiv k_B + k_C \equiv 0 \pmod{\text{const}}##
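In other words, writing ##[F]## for the equivalence class of ##F## (notation added for illustration), the point is that

```latex
[F] = [F + k] \quad\text{for every constant } k,
\qquad\text{so e.g.}\qquad
\int 3x^2\,dx = [x^3] = [x^3 + 5].
```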
 
c.teixeira said:
Hi there!

If one wants to prove this identity for the indefinite integral:

##\int[f(x)+g(x)]\,dx = \int f(x)\,dx + \int g(x)\,dx##

Would this be appropriate? Define:

##A(x) = \int[f(x)+g(x)]\,dx;##
##B(x) = \int f(x)\,dx;##
##C(x) = \int g(x)\,dx.##

And since a primitive of a function is another function whose derivative is the original function:

##A'(x) = f(x) + g(x);##
##B'(x) + C'(x) = (B + C)'(x) = f(x) + g(x).##

By the Mean Value Theorem this would imply ##A(x) = B(x) + C(x) + K##.

Is this an appropriate proof?

If so, how do we know that ##K = 0##? There is no ##K## in ##\int[f(x)+g(x)]\,dx = \int f(x)\,dx + \int g(x)\,dx##.

Regards,
Yes, there is a "k" in ##\int[f(x)+g(x)]\,dx = \int f(x)\,dx + \int g(x)\,dx##. In fact, there is a "constant of integration" in each of the three integrals in that equation.
 
HallsofIvy said:
Yes, there is a "k" in ##\int[f(x)+g(x)]\,dx = \int f(x)\,dx + \int g(x)\,dx##. In fact, there is a "constant of integration" in each of the three integrals in that equation.

So my little "proof" is correct, and "rigorously" we should have ##\int[f(x)+g(x)]\,dx = \int f(x)\,dx + \int g(x)\,dx + C##?
Although, the constant ##C## makes no difference when it comes to definite integrals.

So can I take it that Spivak didn't write the ##C## because it has no consequence for definite integrals, and because he states one page earlier in the book that the constants are merely an annoyance?
 
No, you don't need the "C". The constant of integration is already contained in each of the integrals. When you write ##\int 3x^2\, dx = x^3 + C##, you do not need a "C" on the left because it is part of the integral. You do need it on the right because there is no longer an integral.
 
HallsofIvy said:
No, you don't need the "C". The constant of integration is already contained in each of the integrals. When you write ##\int 3x^2\, dx = x^3 + C##, you do not need a "C" on the left because it is part of the integral. You do need it on the right because there is no longer an integral.

Let:

##f(x) = 3x^{2};##
##g(x) = 2x.##

##\int[3x^{2} + 2x]\, dx = x^{3} + x^{2} + C_{1},##
##\int 3x^{2}\,dx = x^{3} + C_{2},##
##\int 2x\,dx = x^{2} + C_{3}.##

Then ##\int[3x^{2} + 2x]\, dx = \int 3x^{2}\,dx + \int 2x\,dx## only if ##C_{1} = C_{2} + C_{3}##, right?

So how exactly do you prove that? Using pwsnafu's advice?
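As a quick sanity check of the concrete example above, here is a sketch using the third-party SymPy library (an assumption on my part, not something from the thread). Note that SymPy returns one particular antiderivative and omits the integration constant altogether, i.e. it effectively picks ##C = 0## on every side:

```python
# Verify the sum rule for f(x) = 3x^2, g(x) = 2x using SymPy.
# SymPy's integrate() drops the constant of integration, so the
# two sides agree as functions (the choice C1 = C2 + C3 = 0 is implicit).
import sympy as sp

x = sp.symbols('x')

lhs = sp.integrate(3*x**2 + 2*x, x)                   # x**3 + x**2
rhs = sp.integrate(3*x**2, x) + sp.integrate(2*x, x)  # x**3 + x**2

print(lhs, rhs, sp.simplify(lhs - rhs))
```

The difference simplifies to zero, which is exactly the statement "equal up to a constant, with the constant here being zero by SymPy's convention".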
 