Don't understand why "an indefinite integral is valid only on an interval"
Hi, I'm using Stewart's Calculus. In the section on indefinite integrals, it says:
"Recall from Theorem 4.10.1 that the most general antiderivative on a given interval is
obtained by adding a constant to a particular antiderivative. We adopt the convention that
when a formula for a general indefinite integral is given, it is valid only on an interval. Thus, we write
[tex]\int \frac{1}{x^2}\,dx = -\frac{1}{x} + C[/tex]
with the understanding that it is valid on the interval (0, [tex]\infty[/tex]) or on the interval (-[tex]\infty[/tex], 0). This is true despite the fact that the general antiderivative of the function f(x) = 1/x^2, x [tex]\neq[/tex] 0, is:
[tex]F(x) = \begin{cases} -\frac{1}{x} + C_1 & \text{if } x < 0 \\ -\frac{1}{x} + C_2 & \text{if } x > 0 \end{cases}[/tex]"
Well, I don't understand this convention. I don't know if it's something obvious and I'm overcomplicating it, like "there could be points where the indefinite integral isn't defined", or whether there's a subtler point behind it: maybe they mean that a formula which holds on any interval, no matter how small, counts as a valid indefinite integral of the function.
Please help me understand this. It doesn't seem to affect the rest of the topics much, but I don't like to skip things I don't understand.
Thank you, and excuse me if my English isn't very clear.
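For what it's worth, here is a small symbolic sketch (using sympy; the symbols C1 and C2 match the book's arbitrary constants) checking that both branches of the piecewise F(x) differentiate back to 1/x^2, independently of the constants:

```python
# Symbolic check that both pieces of the piecewise F(x) quoted above
# are antiderivatives of f(x) = 1/x^2 on their respective intervals.
# (Illustrative sketch; C1 and C2 are the book's arbitrary constants.)
import sympy as sp

x, C1, C2 = sp.symbols('x C1 C2')

f = 1 / x**2
F_neg = -1 / x + C1   # branch intended for x < 0
F_pos = -1 / x + C2   # branch intended for x > 0

# Differentiating either branch recovers f, whatever C1 and C2 are,
# so the two constants are completely independent of each other.
assert sp.simplify(sp.diff(F_neg, x) - f) == 0
assert sp.simplify(sp.diff(F_pos, x) - f) == 0
```

Since the domain of f splits into two disjoint intervals, nothing links the constant on one side to the constant on the other; the single "+ C" formula is only meaningful once a specific interval has been fixed, which is exactly what the book's convention does.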