# Difference between integral and antiderivative?

Hello,

I still don't really understand what an antiderivative is, besides its ability to "undo" derivatives, its relation to integrals, and what the difference between the two even is. It would also be great to know how to visualize an antiderivative. I've tried looking further into the fundamental theorem of calculus, but something isn't really clicking. Can someone try explaining this to me, please?

Thank you so much!

makemoneyu

Ssnow
Gold Member
Algebraically, the antiderivative is the ''inverse'' of differentiation. Let ##f(x)## be a continuous function; we say that ##F(x)## is an antiderivative (or primitive) of ##f## if ##F'(x)=f(x)## (assuming that ##F## is differentiable). For example, if ##f(x)=x##, then an antiderivative is ##F(x)=\frac{x^{2}}{2}##, because ##F'(x)=x=f(x)##. Note that once you find one antiderivative ##F(x)##, you have infinitely many, because ##F(x)+c## (where ##c## is any constant, e.g. 2, 3, ...) is also an antiderivative; in fact ##(F(x)+c)'=F'(x)=f(x)##. The family of functions ##F(x)+c## is also called the indefinite integral of ##f(x)## with respect to ##x##, and the usual notation is ##\int f(x)\,dx=F(x)+c##.
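If it helps to see this numerically, here is a quick sketch (my own example, using the same ##f(x)=x## and ##F(x)=\frac{x^2}{2}## as above): a central-difference approximation to ##F'## recovers ##f##, and the constant ##c## drops out.

```python
def f(x):
    return x

def F(x, c=0.0):
    # F(x) = x^2/2 + c; every choice of c is an antiderivative of f.
    return x ** 2 / 2 + c

def derivative(g, x, h=1e-6):
    # Central-difference approximation to g'(x).
    return (g(x + h) - g(x - h)) / (2 * h)

for c in [0.0, 2.0, 3.0]:
    for x in [0.5, 1.0, 2.0]:
        # (F + c)' = f no matter which constant c we pick.
        assert abs(derivative(lambda t: F(t, c), x) - f(x)) < 1e-6
```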

Ah, interesting! This clarifies a whole lot. Thanks for simplifying it for me!!

mathwonk
Homework Helper
So, yes: an antiderivative of a function f is another function F such that F' = f. But there is no guarantee such a function exists! E.g. the greatest integer function does not have an antiderivative in that sense, because every derivative has the intermediate value property, and the greatest integer function jumps at the integers.
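Here's a small numerical illustration of that failure (my own sketch, not from the post): the natural candidate for an antiderivative of the greatest integer function is its area function ##A(x) = \int_0^x \lfloor t \rfloor \, dt##, but the one-sided difference quotients of ##A## disagree at ##x = 1##, exactly where the floor function jumps.

```python
import math

def area_of_floor(x):
    # Midpoint Riemann-sum approximation to the integral of floor on [0, x].
    n = 100000
    dx = x / n
    return sum(math.floor((k + 0.5) * dx) * dx for k in range(n))

h = 1e-3
left = (area_of_floor(1.0) - area_of_floor(1.0 - h)) / h    # slope from the left, ~0
right = (area_of_floor(1.0 + h) - area_of_floor(1.0)) / h   # slope from the right, ~1
# left != right, so A is not differentiable at x = 1:
# no differentiable F can satisfy F'(x) = floor(x) across the jump.
```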

So having said what an antiderivative means, the next question is to know which functions have them, and then how to find them when they exist. The fundamental theorem gives a partial answer to this question. I.e. the FTC says that every continuous function f has antiderivatives, and that in fact the integral of f, i.e. its area function, is one antiderivative of f. The integral, of course, is defined as a limit of Riemann sums.
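To see the FTC in action numerically (my own example, using ##f(x) = x##): the area function ##A(x)##, computed as a Riemann sum, matches the antiderivative ##\frac{x^2}{2}##, and its difference quotient recovers ##f## itself.

```python
def f(x):
    return x

def area(x, n=10000):
    # Midpoint Riemann sum for the integral of f on [0, x].
    dx = x / n
    return sum(f((k + 0.5) * dx) * dx for k in range(n))

# The area function agrees with the antiderivative x^2/2 ...
assert abs(area(2.0) - 2.0 ** 2 / 2) < 1e-6

# ... and its difference quotient recovers f (the FTC): A'(x) = f(x).
h = 1e-4
assert abs((area(1.0 + h) - area(1.0)) / h - f(1.0)) < 1e-2
```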


The idea is to find a way of evaluating the signed area under the graph of some function, ##y=f(x)##.

This area can be evaluated approximately as the sum of the areas of ##N## small rectangles that just fit inside the area. We can make these rectangles thinner to get a better approximation to the real area, and we can do this by making ##N## bigger. If we calculate a closed form expression for the area, ##A(N)##, in terms of ##N## (which may be tricky), we can take the limit ##\displaystyle A = \lim_{N \to \infty} A(N)## to find the true area, ##A##.
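As a concrete instance of this limiting process (an example of my choosing, not from the post): for ##f(x) = x^2## on ##[0,1]##, the right-endpoint rectangle sum has the closed form ##A(N) = \frac{(N+1)(2N+1)}{6N^2}##, which tends to the true area ##\frac{1}{3}## as ##N \to \infty##.

```python
def A(N):
    # Sum of N right-endpoint rectangles under y = x^2 on [0, 1].
    dx = 1.0 / N
    return sum(((k + 1) * dx) ** 2 * dx for k in range(N))

# A(N) decreases toward the true area 1/3 as the rectangles get thinner.
for N in [10, 100, 1000]:
    print(N, A(N))
```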

That approach is the basis for the earliest formal theory of integration, called Riemann integration. To make it more rigorous, you bound the area with rectangles giving areas larger and smaller than ##A##, then show that for an integrable function these two bounds tend to the same limit (there are a couple of other technicalities to deal with, but that's the general idea). ##A(N)## is called a Riemann sum, and the value of its limit is called an integral. We write this as ##\int f(x) \ dx##, where the ##\int## symbol comes from the Latin "summa", meaning "sum".
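The bracketing idea can be sketched numerically too (my example: ##f(x) = x^2## on ##[0,1]##, which is increasing, so left endpoints give the lower sum and right endpoints the upper sum). Both pinch the true area ##\frac{1}{3}## between them, with a gap that shrinks like ##\frac{1}{N}##.

```python
def lower(N):
    # Lower Riemann sum: left endpoints of an increasing function.
    dx = 1.0 / N
    return sum((k * dx) ** 2 * dx for k in range(N))

def upper(N):
    # Upper Riemann sum: right endpoints of an increasing function.
    dx = 1.0 / N
    return sum(((k + 1) * dx) ** 2 * dx for k in range(N))

# lower(N) <= 1/3 <= upper(N), and the gap is exactly 1/N here.
for N in [10, 1000]:
    print(N, lower(N), upper(N))
```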

It is tricky to find areas in this way though, as finding the closed form expression ##A(N)## is hard. Newton and Leibniz found a better way. They noticed the following: if you take a graph ##y=f(x)## and draw a graph of the associated area function ##A(x)## (where the height of ##A(x)## gives the area swept out over ##[0,x]##), then the slope/gradient/steepness of ##A(x)## is simply the value of ##f(x)##, i.e. that:

$$\frac{dA}{dx} = f(x)$$

This is intuitively "obvious" - we form ##A(x)## by imagining a vertical line sweeping out area from ##x=0## and moving in the +ve x direction at a constant rate. When this line is at a place where ##f(x)## is large, a small movement increases ##A(x)## by a large amount (i.e. ##dA/dx## is large), and when ##f(x)## is small, a small movement increases ##A(x)## by a small amount (i.e. ##dA/dx## is small). In particular, if, say, ##f(a) = 2f(b)##, then sweeping out a small amount of area near ##x=a## increases ##A(x)## by twice the amount compared to the same action near ##x=b## - so we must at least have:

$$\frac{dA}{dx} \propto f(x)$$
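You can check ##\frac{dA}{dx} = f(x)## numerically for a non-trivial function (my example: ##f(x) = \sin x##): build ##A(x)## by "sweeping" a Riemann sum out from 0, then verify that its difference quotient tracks ##f(x)## itself.

```python
import math

def f(x):
    return math.sin(x)

def A(x, n=20000):
    # Midpoint Riemann sum: the area swept out over [0, x].
    dx = x / n
    return sum(f((k + 0.5) * dx) * dx for k in range(n))

# The slope of the area function equals the height of f at each point.
h = 1e-4
for x in [0.5, 1.5, 2.5]:
    slope = (A(x + h) - A(x - h)) / (2 * h)
    assert abs(slope - f(x)) < 1e-3
```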

You need to embed this idea in your mind, by visualising it. Try starting with the piecewise function $$f(x) = \begin{cases} 1 & x \in [0,1] \\ 0 & x \in (1, 2] \\ 2 & x \in (2,3] \end{cases}$$ and imagining collecting area with a line sweeping out from 0 towards +ve x. What happens to the gradient of ##A(x)## in the various regions of the domain of ##f(x)##?
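Here's a numerical version of that exercise (a sketch under the piecewise definition above): sweep out ##A(x)## and watch its slope change between the three regions - it should be 1, then 0, then 2.

```python
def f(x):
    # The piecewise function suggested above (domain (0, 3]).
    if 0 <= x <= 1:
        return 1.0
    if 1 < x <= 2:
        return 0.0
    return 2.0

def A(x, n=30000):
    # Midpoint Riemann sum: the area swept out over [0, x].
    dx = x / n
    return sum(f((k + 0.5) * dx) * dx for k in range(n))

# The gradient of A in each region matches the height of f there:
# steep where f = 2, gentler where f = 1, flat where f = 0.
h = 0.01
for x, expected_slope in [(0.5, 1.0), (1.5, 0.0), (2.5, 2.0)]:
    slope = (A(x + h) - A(x - h)) / (2 * h)
    assert abs(slope - expected_slope) < 0.05
```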

So once you know this fact, you can find areas by anti-differentiating a function. E.g. for ##y=x^2##, we have:

$$\frac{dA}{dx} = x^2 \Rightarrow A = \frac{x^3}{3} +C$$
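To check that the antiderivative really does reproduce the swept-out area (my example: ##f(x) = x^2## over ##[1,2]##, with ##F(x) = \frac{x^3}{3}##), compare a Riemann sum against ##F(2) - F(1) = \frac{7}{3}##.

```python
def riemann(a, b, n=100000):
    # Midpoint Riemann sum for x^2 on [a, b].
    dx = (b - a) / n
    return sum((a + (k + 0.5) * dx) ** 2 * dx for k in range(n))

def F(x):
    # An antiderivative of x^2.
    return x ** 3 / 3

# The laboriously-summed area matches F(b) - F(a).
assert abs(riemann(1.0, 2.0) - (F(2.0) - F(1.0))) < 1e-6
```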

This is much easier than computing the Riemann sums and finding the limit. However, to find areas this way, we need to be able to anti-differentiate. We can only do this by knowing which functions differentiate to what, though - there's no general algorithm to anti-differentiate, and indeed some functions (e.g. ##e^{x^2}##) don't have anti-derivatives expressible in closed form in terms of elementary functions (though you can resort to Taylor series here, if necessary, and integrate them term-by-term).
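The term-by-term idea can be sketched concretely (my own illustration; the post only mentions the technique): since ##e^{x^2} = \sum_n \frac{x^{2n}}{n!}##, an antiderivative with ##F(0)=0## is ##\sum_n \frac{x^{2n+1}}{(2n+1)\,n!}##. Comparing the truncated series at ##x=1## against a direct Riemann sum shows they agree.

```python
import math

def series_antideriv(x, terms=20):
    # Term-by-term antiderivative of e^(x^2), from its Taylor series.
    return sum(x ** (2 * n + 1) / ((2 * n + 1) * math.factorial(n))
               for n in range(terms))

def riemann(x, n=200000):
    # Direct midpoint Riemann sum for e^(t^2) on [0, x].
    dx = x / n
    return sum(math.exp(((k + 0.5) * dx) ** 2) * dx for k in range(n))

# The truncated series and the brute-force area agree closely at x = 1.
assert abs(series_antideriv(1.0) - riemann(1.0)) < 1e-5
```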

Note also that areas below the x-axis count as negative, since ##f(x) < 0## there. So an integral and an area in the usual sense do not necessarily have the same value (e.g. you can see that ##\int_0^{2\pi} \sin x \, dx = 0##, since there is as much +ve area as -ve area, but you'd usually want the total area to be some non-zero value).
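A quick numerical sketch of that last point: the signed integral of ##\sin x## over ##[0, 2\pi]## is (approximately) 0 because the +ve and -ve areas cancel, while the total unsigned area ##\int_0^{2\pi} |\sin x|\,dx## is 4.

```python
import math

def riemann(g, a, b, n=100000):
    # Midpoint Riemann sum for g on [a, b].
    dx = (b - a) / n
    return sum(g(a + (k + 0.5) * dx) * dx for k in range(n))

signed = riemann(math.sin, 0.0, 2 * math.pi)                    # ~0: areas cancel
total = riemann(lambda t: abs(math.sin(t)), 0.0, 2 * math.pi)   # ~4: unsigned area
```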