Does the Constant of Integration Play a Role in Integration by Summation?

nobahar
Hello!
I was pondering over the relationship between differentiation and integration, and I arrived at the question: does the constant of integration play any role in integration when I'm not interested in an antiderivative?

I think the answer is no, it doesn't play a role...

If I am integrating a function f(x), then I multiply it by an infinitesimally small increase in x and sum together an infinite number of these 'small areas'. I know the function, and I know its value at every point. But if what I want is an antiderivative, then I cannot tell whether the original function involved a constant or not before it was differentiated, and so when integrating the constant must be included...
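
In symbols, what I have in mind is something like the Riemann sum,

\int_a^b f(x) \, dx = \lim_{n \to \infty} \sum_{i=1}^{n} f(x_i) \, \Delta x, \qquad \Delta x = \frac{b-a}{n},

which only ever uses values of f(x) itself and never mentions an antiderivative or a constant.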

The reason I say this is that integration by summation 'appears' to leave no ambiguity over the values of f(x). But:

\int dy = \int \frac{dy}{dx} \, dx



does... Since dy/dx here has eliminated the constant, it is forever lost!
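
For example, if y = F(x) + C for any constant C, then

\frac{dy}{dx} = F'(x),

which is the same for every value of C, so integrating dy/dx can only recover F(x) up to an arbitrary constant.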

I don't know if I've completely misunderstood! But I feel perhaps there's an important point which I'm not getting, and it's throwing me off course quite a bit.

I would REALLY appreciate some help, any little points that you feel may guide me back in the right direction, or clear up some misunderstandings.
My main goal is to understand the relationship between integration and differentiation.

Thanks in advance, nobahar.
 
Do you mean a definite integral? If you need to find a definite integral
\int_a^b f(x) dx

any antiderivative of f(x) will give you the answer. Try it with a constant C added to the antiderivative and you'll see that the constants cancel.
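
For example, writing F for any antiderivative of f and C for an arbitrary constant,

\int_a^b f(x) \, dx = \Bigl[ F(x) + C \Bigr]_a^b = \bigl( F(b) + C \bigr) - \bigl( F(a) + C \bigr) = F(b) - F(a),

so the value of C never affects a definite integral.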
 
Many thanks for the reply, Bohrok.
I was thinking more in terms of integration by summation; I just can't see where the constant of integration comes into this process.
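
A small numerical sketch of what I mean (the function and interval are just examples I picked): approximating the integral of f(x) = 2x over [1, 3] by a sum uses only values of f, while evaluating any antiderivative x^2 + C at the endpoints gives the same answer for every C.

# Approximate the definite integral of f(x) = 2x over [1, 3] with a
# left Riemann sum: only values of f appear, no constant of integration.
def riemann_sum(f, a, b, n=100_000):
    dx = (b - a) / n                 # width of each small strip
    return sum(f(a + i * dx) * dx for i in range(n))

f = lambda x: 2 * x                  # the integrand
F = lambda x, C=0: x**2 + C          # any antiderivative, with constant C

print(riemann_sum(f, 1, 3))          # ~8.0 (the summation process)
print(F(3, C=5) - F(1, C=5))         # 8.0 exactly; the constant C cancels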
 