# Why do we use anti-derivatives to find the values of definite integrals?

by tahayassen
Tags: antiderivatives, definite, integrals, values
 P: 273 It seems like we calculate integration by doing the reverse of derivation. Differentiation is basically just using short-cuts for differentiation by first principles (e.g. power rule). If integration by first principles is the Riemann sum, then why don't we use short-cuts of the Riemann sum to do integration? Why do we instead use reverse derivation? Bear in mind, my grasp of Riemann sums is extremely weak.
P: 1,338
 Quote by tahayassen It seems like we calculate integration by doing the reverse of derivation. Differentiation is basically just using short-cuts for differentiation by first principles (e.g. power rule). If integration by first principles is the Riemann sum, then why don't we use short-cuts of the Riemann sum to do integration? Why do we instead use reverse derivation? Bear in mind, my grasp of Riemann sums is extremely weak.
I find the point of the Riemann sum is to prove a function is integrable. It's nice that they can also give us the value of the integral on that particular interval as n → ∞.

Antiderivatives and methods of finding them merely speed up the long process of integration, which we would otherwise have to do with a Riemann sum every time (may I note how time-consuming that is).
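To make that "speed up" point concrete, here is a small Python sketch (the integrand x², the interval [0, 1], and the helper name `riemann_sum` are my own illustrative choices, not from the thread). It compares a right-endpoint Riemann sum against the exact value 1/3 that the antiderivative x³/3 hands us immediately:

```python
# A right-endpoint Riemann sum for the integral of x² over [0, 1],
# compared with the exact value F(1) - F(0) = 1/3 given by the
# antiderivative F(x) = x³/3.

def riemann_sum(f, a, b, n):
    """Right-endpoint Riemann sum of f over [a, b] with n equal subintervals."""
    dx = (b - a) / n
    return sum(f(a + i * dx) for i in range(1, n + 1)) * dx

f = lambda x: x ** 2
exact = 1 / 3  # from the antiderivative x³/3

for n in (10, 100, 1000):
    approx = riemann_sum(f, 0, 1, n)
    print(n, approx, abs(approx - exact))  # the error shrinks as n grows
```

The sum only approaches 1/3 as n grows, whereas the antiderivative gives the exact value in one step.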
 P: 784 Why can't we just say that anti-derivatives are the 'short-cut' for Riemann summing?
Mentor
P: 21,067

## Why do we use anti-derivatives to find the values of definite integrals?

 Quote by tahayassen It seems like we calculate integration by doing the reverse of derivation.
We say differentiation, not derivation.
 Quote by tahayassen Differentiation is basically just using short-cuts for differentiation by first principles (e.g. power rule).
No, it's not. Differentiation is the action of finding the derivative of some function by any means. Differentiation includes doing so by the limit definition (first principle) or by such shortcuts as the power rule, product rule, quotient rule, etc.
 Quote by tahayassen If integration by first principles is the Riemann sum, then why don't we use short-cuts of the Riemann sum to do integration? Why do we instead use reverse derivation?
There's some confusion here between indefinite integrals and definite integrals. For definite integrals, one way of evaluating them is by using a Riemann sum.

For indefinite integrals, the idea is to find a function whose derivative is the function in the indefinite integral - i.e., finding the antiderivative of the function in the integral.

 Quote by Zondrina I find the point of the Riemann sum is to prove a function is integrable. It's nice that they can also give us the value of the integral on that particular interval as n → ∞. Antiderivatives and methods of finding them merely speed up the long process of integration which we would otherwise have to do with a Riemann sum every time.
What would the Riemann sum look like for this (indefinite) integral?
$$\int e^x~dx$$
 P: 273 I think I didn't ask my question properly, so I've made a mindmap to make my question more clear. Attached Thumbnails
 Mentor P: 21,067 Your mind map is not very useful, IMO, at least beyond the 2nd row.

Under Differentiation you have the limit definition of the derivative and the various rules that are specific applications of the limit definition (constant multiple rule, sum and difference rules, product rule, quotient rule, etc.). In other words, these techniques would be leaves that come off the limit definition.

Riemann sums are the basic way of evaluating definite integrals. They have nothing to do with indefinite integrals. To evaluate an indefinite integral, you are essentially using the differentiation rules backwards. Every time we develop a new differentiation formula, we get an integral formula for free. For example, there is the sum rule: d/dx(f(x) + g(x)) = f'(x) + g'(x), which implies that ## \int ( f'(x) + g'(x) )dx = f(x) + g(x) + \text{an arbitrary constant}##. This formula is usually presented as the sum rule for integration: ## \int ( f(x) + g(x) )dx = F(x) + G(x) + \text{an arbitrary constant}##, where F'(x) = f(x) and G'(x) = g(x). IOW, F and G are antiderivatives of f and g, respectively.

The technique of substitution in integrals is nothing more than the differentiation chain rule, working backwards.

The Fundamental Theorem of Calculus isn't part of differentiation - what it does is show
1. that differentiation and antidifferentiation are essentially inverse processes:
$$\frac{d}{dx} \int_a^x f(t)~dt = f(x)$$
2. how to evaluate a definite integral using the antiderivative of the integrand:
$$\int_a^b f(t)~dt = F(b) - F(a)$$
where F is an antiderivative of f (i.e., F' = f).
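As a concrete instance of point 2 (the integrand here is my own example, not from the post above): with f(x) = x² and antiderivative F(x) = x³/3,

$$\int_0^1 x^2~dx = F(1) - F(0) = \frac{1^3}{3} - \frac{0^3}{3} = \frac{1}{3}.$$

The only fact needed was the power rule for derivatives, read backwards.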
P: 305
 Quote by tahayassen I think I didn't ask my question properly, so I've made a mindmap to make my question more clear.

Can I ask what you used to draw it?
P: 273
Mark44, I understand everything in your post. I think my issue is that I'm having trouble communicating my question due to my poor English and because I might have used the wrong terms by accident.

I'll try an alternate approach. I'll list the assumptions I've made, and maybe you can tell me if any are incorrect. Thank you, by the way, for taking the time to reply.
1. Differentiation by using the limit definition is essentially taking the limit of Δy/Δx where both Δy and Δx approach zero
2. A rigorous proof of the product rule involves using the limit definition. Therefore, the product rule is a shortcut of differentiation by first principles.
3. Therefore, you can differentiate a function either by: 1) limit definition 2) shortcuts for the limit definition such as the product rule.
4. Integration by using the limit of the Riemann sum is essentially multiplying an infinite sum times an infinitesimally small value
5. Integration using the limit of the Riemann sum will always give you the definite integral.
6. As far as integration using the limit of the Riemann sum is concerned, the indefinite integral is unnecessary.
7. Finding the definite integral using the limit of the Riemann sum is limited and not very powerful.
8. A better method is to find the indefinite integral (or anti-derivative) and use the indefinite integral to calculate the definite integral using the fundamental theorem of calculus.
9. Therefore, you can find the definite integral of a function either by: 1) limit of the Riemann sum 2) finding the anti-derivative and then using fundamental theorem of calculus to find the definite integral using the anti-derivative.
10. Anti-derivative is reverse differentiation.
11. Both differentiation and integration can be done using first principles. But differentiation has a shortcut for its first principles while integration's "shortcut" relies on reversing differentiation (with the use of anti-derivative).
12. The reason why integration does not have a shortcut is because multiplying an infinite sum times an infinitesimally small value is not as easy (or is not as predictable?) as taking the limit of Δy/Δx where both Δy and Δx approach zero.
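For comparison, here is what assumption 5 looks like when carried out from first principles on a concrete integrand (my own example, not from the thread). With a uniform partition of [0, b] into n subintervals and right endpoints,

$$\int_0^b x~dx = \lim_{n\to\infty}\sum_{i=1}^{n}\frac{ib}{n}\cdot\frac{b}{n} = \lim_{n\to\infty}\frac{b^2}{n^2}\cdot\frac{n(n+1)}{2} = \frac{b^2}{2},$$

which matches F(b) - F(0) for the antiderivative F(x) = x²/2. Even this easy case needs a closed form for $\sum_{i=1}^{n} i$; most integrands admit no such closed form, which is essentially why there is no general term-by-term shortcut for Riemann sums.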

 Quote by rollingstein I liked your Mind Map. Can I ask what you used to draw it?
http://www.text2mindmap.com/ :P
 P: 428 I think your separation of the two is superficial. We could just as easily (actually it would probably be much harder, but certainly possible) define the Riemann integral where the upper limit is a variable, and the $\Delta x_i$ is a function of this upper limit. I suspect that this would be terribly difficult to work with, but from this we could define a derivative as an anti-integral. Then we would find some rules that work with integrals so that people didn't have to go back to square one every time. But then somebody out there would be asking why we don't do it the other way around.

I think you might have been getting at this with one of your mindmap comments, but the Riemann integral is not usually described as multiplying an infinite sum by an infinitesimal amount (although I'm pretty sure it can be - I don't find it particularly helpful). I think it is much better thought of as an infinite sum of a function value times an infinitesimal amount. I'm not saying you were wrong, but I've found a lot of physics students have trouble by thinking about it the way you stated it.
Mentor
P: 21,067
 Quote by tahayassen Mark44, I understand everything in your post. I think my issue is that I'm having trouble communicating my question due to my poor English and because I might have used the wrong terms by accident.
Your English is very good, and better than that of some native speakers I have seen. I agree that you might have used some terms incorrectly, but that comes with the territory when you are learning something new.
 Quote by tahayassen I'll try an alternate approach. I'll list the assumptions I've made, and maybe you can tell me if any are incorrect. Thank you, by the way, for taking the time to reply. 1. Differentiation by using the limit definition is essentially taking the limit of Δy/Δx where both Δy and Δx approach zero. 2. A rigorous proof of the product rule involves using the limit definition. Therefore, the product rule is a shortcut of differentiation by first principles. 3. Therefore, you can differentiate a function either by: 1) limit definition 2) shortcuts for the limit definition such as the product rule. 4. Integration by using the limit of the Riemann sum is essentially multiplying an infinite sum times an infinitesimally small value.
That's (above) pretty much it if your partition divides the interval into subintervals of the same size. It's possible to divide the interval into variable-length subintervals, with smaller subintervals where the function you are integrating increases or decreases rapidly, and larger subintervals where the function is relatively flat.

In this case you would be adding up an infinite number of terms, where each term is the product of a function value and the width of the subinterval.
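A variable-width partition like the one described above can be sketched in Python (the integrand 1/x, the geometric partition, and the function name `nonuniform_riemann` are my own illustrative choices, not from the thread). The geometric spacing puts the smallest subintervals near x = 1, where 1/x changes fastest:

```python
import math

def nonuniform_riemann(f, points):
    """Left-endpoint Riemann sum of f over an arbitrary partition."""
    return sum(f(points[i]) * (points[i + 1] - points[i])
               for i in range(len(points) - 1))

# Geometric partition 1 = x_0 < x_1 < ... < x_n = 2 with x_i = 2**(i/n):
# subinterval widths grow with x, so sampling is densest near x = 1.
n = 1000
partition = [2 ** (i / n) for i in range(n + 1)]

approx = nonuniform_riemann(lambda x: 1 / x, partition)
print(approx, math.log(2))  # the sum approaches ln 2, the integral of 1/x over [1, 2]
```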
 Quote by tahayassen 5. Integration using the limit of the Riemann sum will always give you the definite integral. 6. As far as integration using the limit of the Riemann sum is concerned, the indefinite integral is unnecessary. 7. Finding the definite integral using the limit of the Riemann sum is limited and not very powerful.
I would say that it is laborious, but very powerful. Differentiation is straightforward, but there are many functions that are very difficult or impossible to integrate analytically to get a "nice" answer. In cases like this, the alternative is to use numerical methods to carry out the integration, and these techniques are variations of the Riemann sum technique you've learned.
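As a concrete example (my own illustration; the integrand and helper name are not from the thread): exp(-x²) has no elementary antiderivative, so its definite integrals are evaluated numerically. The midpoint rule below is just a Riemann sum that samples each subinterval at its center, and Python's math.erf supplies a reference value to check against:

```python
import math

def midpoint_rule(f, a, b, n):
    """Midpoint-rule Riemann sum of f over [a, b] with n equal subintervals."""
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx

# Integral of exp(-x²) over [0, 1]: no elementary antiderivative exists,
# so a Riemann-sum-style method is the practical route.
approx = midpoint_rule(lambda x: math.exp(-x * x), 0.0, 1.0, 1000)
reference = math.sqrt(math.pi) / 2 * math.erf(1.0)  # known special-function value
print(approx, reference)
```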
 Quote by tahayassen A better method is to find the indefinite integral (or anti-derivative) and use the indefinite integral to calculate the definite integral using the fundamental theorem of calculus.
This is better only if the function you're integrating has a "nice" antiderivative, and there's no guarantee that this is true.
 Quote by tahayassen 9. Therefore, you can find the definite integral of a function either by: 1) limit of the Riemann sum 2) finding the anti-derivative and then using the fundamental theorem of calculus to find the definite integral using the anti-derivative. 10. Anti-derivative is reverse differentiation. 11. Both differentiation and integration can be done using first principles. But differentiation has a shortcut for its first principles while integration's "shortcut" relies on reversing differentiation (with the use of anti-derivative). 12. The reason why integration does not have a shortcut is because multiplying an infinite sum times an infinitesimally small value is not as easy (or is not as predictable?) as taking the limit of Δy/Δx where both Δy and Δx approach zero.
P: 3
 Quote by Mark44 The Fundamental Theorem of Calculus... 1. that differentiation and antidifferentiation are essentially inverse processes $$\frac{d}{dx} \int_a^x f(t)~dt = f(x)$$ 2. how to evaluate a definite integral using the antiderivative of the integrand. $$\int_a^b f(t)dt = F(b) - F(a)$$ where F is an antiderivative of f (i.e., F' = f).
I think that this is where the correct theory comes to an end.
$$\text{This integral:}~~\int F'(x)dx=F(x)+C~~\text{begins the wrong theory.}$$
Nobody proved it and it doesn't make sense.

$$\text{It would be correct to go on such way:}~~F(x)=\int \limits_{x_0}^x f(t)~dt,~~F(x_0)=0;~~~~~C=\int\limits_0^C~dt.$$

$$\text{If}~~f(t)=\frac{1}{t}:~~~F(x)=\int\limits_1^x f(t)~dt=\ln(x)=\ln\left(\frac{x}{1}\right);~~~\int \limits_{t_1}^{t_2}\frac{dt}{t}=\ln\left(\frac{t_2}{t_1}\right);$$

$$F(b)-F(a)=\int\limits_{a}^{b}\frac{dx}{x}=\ln\left(\frac{b}{a}\right);~~~~\int \frac{dx}{x}=\ln|x|+C~~\text{is wrong!}$$

Because:

$$y=F(x),~~ u=F(x)+C,~~u(x,p)=F(x)+F(p)_{p=F^{(-1)}(C)};$$

$$f(x)=F'(x)=\frac{dF(x)}{dx};$$
$$f(x)=\frac {\partial u(x,p)}{\partial x}=\frac{\partial (F(x)+C)}{\partial x};$$
$$\int\limits_{x_0}^x f(t)\partial t=F(x)+C.$$
----------------------------------------------------------------------
"The conditions imposed on functions, become a source of difficulties which will manage to be avoided only by means of new researches about the principles of integral calculus"

Thomas Joannes Stieltjes. ...
 P: 428 $\int f(x)dx$ means the antiderivative, so Mark is correct. It doesn't have to be proved, it is a definition.
P: 3
 Quote by DrewD $\int f(x)dx$ means the antiderivative, so Mark is correct. It doesn't have to be proved, it is a definition.
$$\frac{d}{dx} \int_a^x f(t)~dt = f(x);$$

$$dx \cdot\frac{d}{dx} \int_a^x f(t)~dt = f(x)\cdot dx;$$

$$\int d \int_a^x f(t)~dt =\int f(x)~dx;$$

$$\int_a^x f(t)~dt =\int f(x)~dx;$$

$$\int f(x)~dx= F(x)-F(a)!$$
P: 784
 Quote by mishin050 $$\frac{d}{dx} \int_a^x f(t)~dt = f(x);$$ $$dx \cdot\frac{d}{dx} \int_a^x f(t)~dt = f(x)\cdot dx;$$ $$\int d \int_a^x f(t)~dt =\int f(x)~dx;$$ $$\int_a^x f(t)~dt =\int f(x)~dx;$$ $$\int f(x)~dx= F(x)-F(a)!$$
This doesn't work, not the least of the reasons being that you are treating ##\frac{d}{dx}## as a fraction.
P: 3
 Quote by Vorde This doesn't work, not the least of the reasons being that you are treating ##\frac{d}{dx}## as a fraction.
##\displaystyle\frac{d}{dx}=\lim_{\Delta x\to 0}\frac{\Delta}{\Delta x}##
P: 779
 Quote by mishin050 ##\displaystyle\frac{d}{dx}=\lim_{\Delta x\to 0}\frac{\Delta}{\Delta x}##
The right hand side is not mathematically well-defined, and furthermore that is not the definition of d/dx. As Vorde said, you can't treat d/dx as a fraction.
P: 784
 Quote by pwsnafu The right hand side is not mathematically well-defined, and furthermore that is not the definition of d/dx. As Vorde said, you can't treat d/dx as a fraction.
And just to add on to this, ##\frac{d}{dx}## is just notation for a 'derivative operator', i.e. ##\frac{d}{dx}## is saying "take the derivative of this:".

##\frac{dy}{dx}##, which is not a derivative operator but in fact a derivative itself, can't even be considered a fraction rigorously, though for simplicity's sake you can often 'fake' it being a fraction as long as you are careful.
Mentor
P: 21,067
 Quote by mishin050 ##\displaystyle\frac{d}{dx}=\lim_{\Delta x\to 0}\frac{\Delta}{\Delta x}##
 Quote by pwsnafu The right hand side is not mathematically well-defined, and furthermore that not the definition of d/dx. As Vorde said, you can't treat d/dx as a fraction.
And the right side isn't defined, either, as Δ by itself doesn't mean anything in this context.
