Why do we use antiderivatives to find the values of definite integrals? 
#1
Jan 24, 2013, 04:33 PM

P: 273

It seems like we calculate integrals by doing the reverse of differentiation. Differentiation in practice is basically just using shortcuts for differentiation from first principles (e.g. the power rule). If integration from first principles is the Riemann sum, then why don't we use shortcuts of the Riemann sum to do integration? Why do we instead use reverse differentiation?
Bear in mind, my understanding of Riemann sums is extremely weak.
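To make the contrast concrete, here is a small illustrative sketch in Python (the function ##x^2## and the interval [0, 1] are arbitrary choices, not from the thread): the same definite integral computed by a brute-force Riemann sum and by the antiderivative shortcut.

```python
# Compare a Riemann sum with the antiderivative shortcut for
# the definite integral of f(x) = x**2 on [0, 1].
# Exact value: F(1) - F(0) = 1/3, where F(x) = x**3 / 3.

def riemann_sum(f, a, b, n):
    """Left-endpoint Riemann sum with n equal subintervals."""
    dx = (b - a) / n
    return sum(f(a + i * dx) for i in range(n)) * dx

f = lambda x: x ** 2
F = lambda x: x ** 3 / 3          # an antiderivative of f

approx = riemann_sum(f, 0.0, 1.0, 100_000)
exact = F(1.0) - F(0.0)           # Fundamental Theorem of Calculus

print(approx)   # close to 0.3333..., but only after 100,000 terms
print(exact)    # 0.3333... in a single subtraction
```

The Riemann sum needs a huge number of terms just to get a few correct digits; the antiderivative hands over the exact value in one subtraction.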


#2
Jan 24, 2013, 05:13 PM

P: 1,611

Antiderivatives, and methods of finding them, merely speed up the long process of integration, which we would otherwise have to do with a Riemann sum every time (and may I note how time-consuming that is).
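To illustrate just how slow that is (a small sketch; ##e^x## on [0, 1] is an arbitrary choice): the left Riemann sum's error shrinks only like 1/n, while the antiderivative gives the exact value at once.

```python
# How many subintervals does a plain left Riemann sum need?
# Target: the integral of e**x on [0, 1], whose antiderivative e**x
# gives the exact answer e - 1 in one line.
import math

def left_sum(f, a, b, n):
    """Left-endpoint Riemann sum with n equal subintervals."""
    dx = (b - a) / n
    return sum(f(a + i * dx) for i in range(n)) * dx

exact = math.e - 1.0                      # from the antiderivative
for n in (10, 100, 1000, 10_000):
    err = abs(left_sum(math.exp, 0.0, 1.0, n) - exact)
    print(n, err)                         # error shrinks roughly like 1/n
```

Ten thousand function evaluations still leave an error around the fourth decimal place; the antiderivative is exact immediately.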


#3
Jan 24, 2013, 05:26 PM

P: 784

Why can't we just say that antiderivatives are the 'shortcut' for Riemann summing?



#4
Jan 24, 2013, 06:57 PM

Mentor
P: 21,409

Why do we use antiderivatives to find the values of definite integrals?
For indefinite integrals, the idea is to find a function whose derivative is the function in the indefinite integral; i.e., finding the antiderivative of the function in the integral. For example, $$ \int e^x~dx = e^x + C,$$ since the derivative of ##e^x## is ##e^x##.
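A quick way to check an antiderivative like this (an illustrative aside, using the SymPy library) is to integrate symbolically and then differentiate the result to recover the integrand; note that SymPy returns one antiderivative and omits the arbitrary constant ##+ C##:

```python
# Illustrative check with SymPy: integrate e**x, then differentiate back.
# (SymPy returns one antiderivative and omits the "+ C".)
import sympy as sp

x = sp.Symbol("x")
F = sp.integrate(sp.exp(x), x)       # an antiderivative of e**x
print(F)                             # exp(x)
print(sp.diff(F, x) == sp.exp(x))    # True: differentiating recovers e**x
```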


#5
Jan 26, 2013, 11:31 AM

P: 273

I think I didn't ask my question properly, so I've made a mind map to make it clearer.



#6
Jan 26, 2013, 01:22 PM

Mentor
P: 21,409

Your mind map is not very useful, IMO, at least beyond the 2nd row.
Under Differentiation you have the limit definition of the derivative and the various rules that are specific applications of the limit definition (constant multiple rule, sum and difference rules, product rule, quotient rule, etc.). In other words, these techniques would be leaves that come off the limit definition.

Riemann sums are the basic way of evaluating definite integrals. They have nothing to do with indefinite integrals. To evaluate an indefinite integral, you are essentially using the differentiation rules backwards. Every time we develop a new differentiation formula, we get an integral formula for free. For example, there is the sum rule: d/dx(f(x) + g(x)) = f'(x) + g'(x), which implies that ## \int ( f'(x) + g'(x) )\,dx = f(x) + g(x) + \text{an arbitrary constant}.## This formula is usually presented as the sum rule for integration: ## \int ( f(x) + g(x) )\,dx = F(x) + G(x) + \text{an arbitrary constant},## where F'(x) = f(x) and G'(x) = g(x). IOW, F and G are antiderivatives of f and g, respectively. The technique of substitution in integrals is nothing more than the differentiation chain rule, working backwards.

The Fundamental Theorem of Calculus isn't part of differentiation. What it does is show:
1. that differentiation and antidifferentiation are essentially inverse processes: $$ \frac{d}{dx} \int_a^x f(t)~dt = f(x)$$
2. how to evaluate a definite integral using the antiderivative of the integrand: $$\int_a^b f(t)~dt = F(b) - F(a),$$ where F is an antiderivative of f (i.e., F' = f).
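Part 1 of the FTC can be checked numerically (a sketch only; f = cos and a = 0 are arbitrary choices): approximate the area function ##A(x) = \int_0^x f(t)\,dt## with a Riemann sum, then differentiate A with a central difference and compare against f.

```python
# Numerical sketch of FTC part 1: A(x) = integral from 0 to x of cos(t),
# so A(x) should behave like sin(x) and A'(x) should return cos(x).
import math

def A(x, n=10_000):
    """Left Riemann-sum approximation of the area function for f = cos on [0, x]."""
    dx = x / n
    return sum(math.cos(i * dx) for i in range(n)) * dx

h = 1e-5
x0 = 1.2
derivative = (A(x0 + h) - A(x0 - h)) / (2 * h)   # central difference
print(derivative)        # close to cos(1.2), about 0.3624
```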


#7
Jan 26, 2013, 02:03 PM

PF Gold
P: 330

I liked your Mind Map. Can I ask what you used to draw it? 


#8
Jan 26, 2013, 02:09 PM

P: 273

Mark44, I understand everything in your post. I think my issue is that I'm having trouble communicating my question due to my poor English, and because I might have accidentally used the wrong terms.
I'll try an alternate approach: I'll list the assumptions I've made, and maybe you can tell me if any are incorrect. Thank you, by the way, for taking the time to reply.



#9
Jan 26, 2013, 04:29 PM

P: 461

I think your separation of the two is superficial. We could just as easily (actually it would probably be much harder, but certainly possible) define the Riemann integral where the upper limit is a variable, and the [itex]\Delta x_i[/itex] is a function of this upper limit. I suspect that this would be terribly difficult to work with, but from this we could define a derivative as an anti-integral. Then we would find some rules that work with integrals so that people didn't have to go back to square one every time. But then somebody out there would be asking why we don't do it the other way around.
I think you might have been getting at this with one of your mind map comments, but the Riemann integral is not usually described by multiplying an infinite sum by an infinitesimal amount (although I'm pretty sure it can be; I don't find that particularly helpful). I think it is much better thought of as an infinite sum of a function times an infinitesimal amount. I'm not saying you were wrong, but I've found that a lot of physics students have trouble when thinking about it the way you stated it.
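That thought experiment can be sketched numerically (illustrative Python; f(t) = t is an arbitrary choice): build the area function directly from Riemann sums with a variable upper limit, then recover f as its rate of change, i.e. going in the "anti-integral" direction.

```python
# Sketch of the thought experiment above: build the area function
# A(x) = integral from 0 to x of f directly from Riemann sums, then
# recover f as the "anti-integral" (the rate of change of A).
# Here f(t) = t, so area(x) should approach x**2 / 2,
# and the recovered values should be close to f(x) = x itself.

def area(x, n=5_000):
    dx = x / n
    return sum((i * dx) * dx for i in range(n))   # left Riemann sum for f(t) = t

for x in (0.5, 1.0, 2.0):
    h = 1e-4
    recovered = (area(x + h) - area(x - h)) / (2 * h)
    print(x, recovered)    # each recovered value is close to x itself
```

This works, but every single value of the area function costs thousands of additions, which is the practical reason the subject is built in the other direction.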


#10
Jan 26, 2013, 05:46 PM

Mentor
P: 21,409




#11
Jan 26, 2013, 06:24 PM

P: 3

$$\text{This integral:}~~\int F'(x)\,dx=F(x)+C~~\text{begins the wrong theory.}$$ Nobody proved it and it doesn't make sense. $$\text{It would be correct to proceed this way:}~~F(x)=\int \limits_{x_0}^x f(t)\,dt,~~F(x_0)=0;~~~~~C=\int\limits_0^C dt.$$ $$\text{If}~~f(t)=\frac{1}{t}:~~~F(x)=\int\limits_1^x f(t)\,dt=\ln(x)=\ln\left(\frac{x}{1}\right);~~~\int \limits_{t_1}^{t_2}\frac{dt}{t}=\ln\left(\frac{t_2}{t_1}\right);$$ $$F(b)-F(a)=\int\limits_{a}^{b}\frac{dx}{x}=\ln\left(\frac{b}{a}\right);~~~~\int \frac{dx}{x}=\ln x+C ~~\text{is wrong!}$$ Because: $$y=F(x),~~ u=F(x)+C,~~u(x,p)=F(x)+F(p),~~p=F^{-1}(C);$$ $$f(x)=F'(x)=\frac{dF(x)}{dx};$$ $$f(x)=\frac {\partial u(x,p)}{\partial x}=\frac{\partial (F(x)+C)}{\partial x};$$ $$\int\limits_{x_0}^x f(t)\,\partial t=F(x)+C.$$ "The conditions imposed on functions become a source of difficulties which will manage to be avoided only by means of new researches about the principles of integral calculus" (Thomas Joannes Stieltjes).


#12
Jan 27, 2013, 09:11 AM

P: 461

[itex]\int f(x)dx[/itex] means the antiderivative, so Mark is correct. It doesn't have to be proved; it is a definition.



#13
Jan 28, 2013, 12:34 PM

P: 3

$$ dx \cdot\frac{d}{dx} \int_a^x f(t)~dt = f(x)\cdot dx;$$ $$\int d \int_a^x f(t)~dt =\int f(x)~dx;$$ $$ \int_a^x f(t)~dt =\int f(x)~dx;$$ $$\int f(x)~dx= F(x)-F(a)!$$


#14
Jan 28, 2013, 01:04 PM

P: 784




#15
Jan 28, 2013, 04:37 PM

P: 3




#16
Jan 28, 2013, 05:08 PM

Sci Advisor
P: 839




#17
Jan 28, 2013, 05:25 PM

P: 784

##\frac{dy}{dx}##, which is not a derivative operator but in fact a derivative itself, can't be considered a fraction rigorously, though for simplicity's sake you can often 'fake' its being a fraction as long as you are careful.


#18
Jan 28, 2013, 06:06 PM

Mentor
P: 21,409



