Fundamental theorem of calculus (something isn't right)

Summary
The discussion centers on the Fundamental Theorem of Calculus and its application to continuous functions. It highlights that while the theorem states that if a function f is continuous on an interval, it has an antiderivative F, the continuity of f is crucial for the theorem to hold. A specific example involving f(x) = 1/x and its integral reveals that F(x) = ln|x| is the correct antiderivative, not ln(x), because of the discontinuity at x = 0. Additionally, the example of f(x) = |2x-3| illustrates that while f is continuous, a piecewise-assembled antiderivative can be discontinuous if the constants of integration are not matched across the pieces, leading to confusion about the theorem's validity. Ultimately, the discussion emphasizes the importance of ensuring that f is continuous on the specified interval so that its antiderivative F exists and the theorem applies correctly.
  • #31
Nope. Anti-differentiation is the reverse process of differentiation but integration is another thing entirely. The fundamental theorem of calculus links integration and anti-differentiation but they are by no means the same thing.
 
  • #32
jgens said:
Nope. Anti-differentiation is the reverse process of differentiation but integration is another thing entirely. The fundamental theorem of calculus links integration and anti-differentiation but they are by no means the same thing.

Sorry for going out of the boundaries of this thread, but could you possibly tell me what's the difference?
 
  • #33
njama said:
Sorry for going out of the boundaries of this thread, but could you possibly tell me what's the difference?

Alrighty, in order to avoid any confusion with the integral sign, I'm going to introduce notation that is definitely nonstandard, so don't use it anywhere else. Let D be the differential operator so that D(f) = f' and let D^{-1} denote an anti-differentiation operator where D^{-1}(D(f)) = D^{-1}(f') = f. The integral sign \int will be used solely for the purpose of integration in this example.

With the preliminary information out of the way, I suppose that I'll start with anti-differentiation. As the name suggests, this process is simply the reverse of differentiation. Hence, if D(f) = f' then D^{-1}(f') = f, and if D^{-1}(f) = F then D(F) = f. Hopefully you can understand this concept fairly easily.

Now, integration on the other hand, is an entirely different process. Suppose that we have the function f which is defined on the closed interval [a,x]. If P = \{t_0, \dots, t_n\} is a partition of [a,x] such that a = t_0 < t_1 < \dots < t_{n-1} < t_n = x then we define the Riemann sum of f for the partition P by R(f,P) = \sum_{i=1}^{n}f(x_i)(t_i - t_{i-1}) where t_{i-1} < x_i < t_i. With this information, the integral of f is given by

\lim_{||P|| \to 0}R(f,P) = \int_a^xf

Note that D^{-1} and \int have two different definitions. Alright, we're almost done now. What the Fundamental Theorem of Calculus states is that if f is continuous and we define the function F by F = \int_a^xf, then F is differentiable and in particular D(F) = f. Hopefully someone else here can give you a better explanation of everything, but this is the best that I can do with the limited time that I have. Good luck!
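The two definitions can also be compared numerically. The following is my own sketch (not from the thread; f(x) = x² and the interval [0, 1] are arbitrary illustrative choices): the Riemann sums R(f, P) converge, as the partition is refined, to the same number the FTC produces from an antiderivative F.

```python
# Sketch: approximate the integral of f(x) = x**2 over [0, 1] by left-endpoint
# Riemann sums and compare with F(1) - F(0), where F(x) = x**3 / 3 is an
# antiderivative of f. Both f and the interval are illustrative choices.

def riemann_sum(f, a, b, n):
    """Left-endpoint Riemann sum of f over [a, b] with n equal subintervals."""
    dx = (b - a) / n
    return sum(f(a + i * dx) * dx for i in range(n))

f = lambda x: x ** 2
F = lambda x: x ** 3 / 3              # an antiderivative: D(F) = f

approx = riemann_sum(f, 0.0, 1.0, 100_000)
exact = F(1.0) - F(0.0)               # = 1/3, via the FTC

print(abs(approx - exact) < 1e-4)     # the Riemann sums converge to the integral
```

The integral is defined by the limit of sums; the antiderivative only enters as a shortcut for evaluating that limit, which is exactly what the FTC licenses.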
 
  • #34
A problem with that idea is that "D^{-1}" is not well defined. Given f(x), there may exist an infinite number of functions F such that DF = f.
 
  • #35
Halls, I realize that it's poor notation (hence my note about it early on), but it eliminates the need for the integral sign when talking about anti-derivatives. I figured that this might be useful for a student who was struggling with the difference between anti-differentiation and integration. I suppose that I could have given my explanation without introducing some sort of notation for anti-differentiation, though. If you have the time to write up an explanation for the OP, I'm sure that he/she will appreciate it, since you'll do a far better job explaining it than I ever could.
 
  • #36
Thanks for the explanation, jgens. I partially understand what you are saying, but I can't really see the big difference. Suppose [-1,x] is the interval on which I need to find the area, and let A(x) be that area. But it can be seen that \int_{-1}^{x}f(t)\,dt = A(x), or A'(x) = f(x). I also checked Wikipedia.

Here is what I found:

wikipedia said:
In calculus, an antiderivative, primitive or indefinite integral[1] of a function f is a function F whose derivative is equal to f, i.e., F ′ = f. The process of solving for antiderivatives is antidifferentiation (or indefinite integration)
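The area-function picture in the post above can be checked numerically for a continuous f. This is a sketch of my own (f(t) = t², the point x₀ = 0.7, and the numerical schemes are all illustrative choices, not anything from the thread): differentiating A(x) = ∫₋₁ˣ f recovers f(x), just as the FTC promises when f is continuous.

```python
import math

# Sketch: for the continuous f(t) = t**2, the area function A(x) over [-1, x]
# satisfies A'(x) = f(x). Here A is approximated by a midpoint Riemann sum
# and A' by a central difference; both are numerical stand-ins.

f = lambda t: t ** 2

def A(x, n=20_000):
    """Midpoint Riemann sum approximating the integral of f over [-1, x]."""
    dx = (x + 1.0) / n
    return math.fsum(f(-1.0 + (i + 0.5) * dx) * dx for i in range(n))

x0, h = 0.7, 1e-4
A_prime = (A(x0 + h) - A(x0 - h)) / (2 * h)   # central difference for A'(x0)

print(abs(A_prime - f(x0)) < 1e-4)            # A'(x0) matches f(x0) = 0.49
```

This is the case where integration and anti-differentiation agree; the disagreement only appears once f fails to be continuous, as the next posts show.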
 
  • #37
If you want a simple illustration of how they're different, consider the function f defined by f(x) = 0 if x = 1 and f(x) = 1 otherwise. Now suppose that I want to evaluate \int_0^xf. It's a trivial exercise to show that \int_0^xf = x. Now if integration were the inverse process of differentiation we would have that (x)' = 1 = f(x), which clearly isn't true at x = 1. Does this make sense?
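A quick numerical check of this example (my own sketch; the interval [0, 2], the sample count, and the midpoint sampling are arbitrary choices): Riemann sums for this f converge to the length of the interval, because the sample points of a partition essentially never land exactly on x = 1.

```python
import math

# Sketch: f(x) = 0 at x = 1 and f(x) = 1 elsewhere, as in the post above.
# Midpoint Riemann sums over [0, 2] never sample the single zero for this n,
# so the sums give 2 -- the same value as for the constant function 1.

def f(x):
    return 0.0 if x == 1.0 else 1.0

def midpoint_sum(g, a, b, n):
    """Midpoint Riemann sum of g over [a, b] with n equal subintervals."""
    dx = (b - a) / n
    return math.fsum(g(a + (i + 0.5) * dx) * dx for i in range(n))

integral = midpoint_sum(f, 0.0, 2.0, 1000)   # no midpoint equals 1 for n = 1000
print(abs(integral - 2.0) < 1e-9)            # the lone zero doesn't matter
```

Even if one sample point did hit x = 1, it would contribute only a single term of size Δx, which vanishes as the partition is refined — which is why the integral cannot "see" the value f(1) = 0.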
 
  • #38
If x = 1:
\int_0^x f(t)\,dt = \int_0^x 0\,dt = 0\left(\int_0^x 1\,dt\right) + C = C
If x \neq 1:
\int_0^x 1\,dt = (t + C_1)\big|_0^x = x - 0 = x
It's still valid:
F(x)=\begin{cases} C, & x = 1\\ x, & x \neq 1 \end{cases}

f(x)=F'(x)=\begin{cases} 0, & x = 1\\ 1, & x \neq 1 \end{cases}

Edit: This function does not satisfy the theorem; it's not continuous, so it fails to work anyway.
 
  • #39
njama said:
If x = 1:
\int_0^x f(t)\,dt = \int_0^x 0\,dt = 0\left(\int_0^x 1\,dt\right) + C = C
If x \neq 1:
\int_0^x 1\,dt = (t + C_1)\big|_0^x = x - 0 = x
It's still valid:
F(x)=\begin{cases} C, & x = 1\\ x, & x \neq 1 \end{cases}

f(x)=F'(x)=\begin{cases} 0, & x = 1\\ 1, & x \neq 1 \end{cases}

Edit: This function does not satisfy the theorem; it's not continuous, so it fails to work anyway.

You cannot just swap f(x) for zero, because the integral "doesn't ask" the function for its value at a certain point, but for its values over an interval. The integral therefore passes through "many" points at which f(x) = 1, bumps into a single zero, and moves on. (This is a point where I recommend reading about integrals again and practicing them, because what you did in the first lines ("if x=1 blahblah") shows you don't completely understand how to work with them and what they mean.)
But we know that changing a function at a finite number of points doesn't change its integral, so as far as the integral "knows", it is actually integrating only f(x) = 1.

Anti-differentiation, if studied as an operator, must consider all the information in f(x), while integration loses these little things that make up f(x).
 
  • #40
njama said:
If x = 1:
\int_0^x f(t)\,dt = \int_0^x 0\,dt = 0\left(\int_0^x 1\,dt\right) + C = C
If x \neq 1:
\int_0^x 1\,dt = (t + C_1)\big|_0^x = x - 0 = x
It's still valid:
F(x)=\begin{cases} C, & x = 1\\ x, & x \neq 1 \end{cases}

f(x)=F'(x)=\begin{cases} 0, & x = 1\\ 1, & x \neq 1 \end{cases}

Edit: This function does not satisfy the theorem; it's not continuous, so it fails to work anyway.

No, no, no! This isn't right in the slightest. You can't just integrate using the FTC if f isn't continuous. You need to go back to the definition of the integral, like the one that I provided, and not what you mistakenly keep taking as the definition of the integral.

Anyway, once you actually do the exercise correctly, you will find that \int_0^xf = x; hence, even though f isn't continuous, its integral function is. Now, if integration and anti-differentiation were the same thing, differentiating \int_0^xf would give f(x) = 1 for every x, which is clearly false since f(1) = 0. Therefore, anti-differentiation and integration are not the exact same thing.
 
  • #41
OK. You defined the function like this:
[Attached image: the function f with f(x) = 0 if x = 1 and f(x) = 1 otherwise]


And you told me to find:

<br /> \int_0^bf<br />

Now, if the function is continuous on the interval [0,b] then it is integrable on [0,b].

But if the function is bounded and has finitely many discontinuities on [0,b], it is still integrable.

So, we can find
<br /> \int_0^bf<br />

Let's consider any partition of [0,b]. Then either some x_{k}^{*} = 1 or none does.
If none does, then
\sum_{k=1}^{n}f(x_{k}^{*})\Delta x_{k} = \sum_{k=1}^{n}\Delta x_{k} = b
else
\sum_{k=1}^{n}f(x_{k}^{*})\Delta x_{k} = -\Delta x_{j} + \sum_{k=1}^{n}\Delta x_{k} = b - \Delta x_{j}
where the j-th subinterval is the one containing x = 1,

which means that the difference between the Riemann sum and b is at most \max_k \Delta x_{k}. BUT, since this bound approaches zero as \max \Delta x_{k} \rightarrow 0, it follows that:

\int_{0}^{b}f(x)dx = b

But the Fundamental Theorem of Calculus, Part 2, states that "If f is continuous on an interval [0,b] then f has an antiderivative on [0,b]. In particular, if a is any number in [0,b], then the function F defined by:
F(x)=\int_{a}^{x}f(t)dt
is an antiderivative of f on [0,b]; that is, F'(x)=f(x) for each x in [0,b], or in alternative notation:
\frac{d}{dx}\left [ \int_{a}^{x}f(t)dt \right ] = f(x)"

So f must be continuous for us to conclude that f has an antiderivative. If f is not continuous, anything can follow.
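Both sides of this disagreement can be seen numerically. The following is my own sketch (the sample count and difference step are arbitrary choices): the integral function F(x) = ∫₀ˣ f equals x, so F'(1) = 1, while f(1) = 0 — exactly the failure of F' = f at the point where f is discontinuous.

```python
import math

# Sketch: f(x) = 0 at x = 1, f(x) = 1 otherwise. The integral function
# F(x) = integral of f over [0, x] equals x, so F'(1) = 1 even though
# f(1) = 0: F' differs from f at the one discontinuity of f.

def f(x):
    return 0.0 if x == 1.0 else 1.0

def F(x, n=100_000):
    """Midpoint Riemann sum approximating the integral of f over [0, x]."""
    dx = x / n
    return math.fsum(f((i + 0.5) * dx) * dx for i in range(n))

h = 1e-3
F_prime_at_1 = (F(1.0 + h) - F(1.0 - h)) / (2 * h)   # central difference

print(abs(F_prime_at_1 - 1.0) < 1e-6)   # F'(1) is 1 ...
print(f(1.0))                           # ... but f(1) = 0.0
```

So F is a perfectly good differentiable function here; it just isn't an antiderivative of f, which is the distinction the whole thread turns on.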
 
  • #42
njama said:
So f must be continuous for us to conclude that f has an antiderivative.

Except for the fact that f clearly isn't continuous, so the FTC isn't applicable here. The point of this exercise was to illustrate that if F is the function defined by F = \int_0^xf, then F' is not necessarily equal to f, as you asserted in post #30 of this thread.
 
