Derivative of a definite integral?

Terrell
Consider x in the interval [a, b].
Would it be correct to say that the derivative of a definite integral F(x) is f(x), because as dx approaches zero in (x + dx), the width of ALL "imaginary rectangles" becomes so small that each one closely resembles a line segment whose height approximates f(x)? Therefore, the change in the area under the curve depends on the height f(x) and on dx (which is infinitesimally small)?

The different notations used in several videos I watched seem to have confused me and made me doubt my own understanding of a seemingly simple concept.
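For reference, the intuition above is essentially the first part of the fundamental theorem of calculus. A minimal sketch, assuming ##f## is continuous on ##[a,b]## and writing ##F(x)=\int_a^x f(t)\,dt##:
$$F'(x)=\lim_{\Delta x\to 0}\frac{F(x+\Delta x)-F(x)}{\Delta x}=\lim_{\Delta x\to 0}\frac{1}{\Delta x}\int_x^{x+\Delta x} f(t)\,dt=f(x),$$
since for small ##\Delta x## the last integral is approximately the area of a thin rectangle of width ##\Delta x## and height ##f(x)##.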
 
Derivative of a definite integral? The definite integral calculates an oriented area. This is a constant. The derivative of a constant equals zero. Therefore, the derivative of a definite integral is zero.
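For concreteness (an illustrative example, not from the thread): with both limits fixed, the integral is just a number, so
$$\frac{d}{dx}\int_0^1 t^2\,dt=\frac{d}{dx}\left(\frac{1}{3}\right)=0.$$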
 
Math_QED said:
Derivative of a definite integral? The definite integral calculates an oriented area. This is a constant. The derivative of a constant equals zero. Therefore, the derivative of a definite integral is zero.
Sorry, just integral, not definite integral.
 
Math_QED said:
Derivative of a definite integral? The definite integral calculates an oriented area. This is a constant. The derivative of a constant equals zero. Therefore, the derivative of a definite integral is zero.
Not necessarily.

You have to apply the Leibniz Rule:

https://en.wikipedia.org/wiki/Leibniz_integral_rule
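For reference, the general form of the rule (assuming ##f##, ##a##, and ##b## are sufficiently smooth) is
$$\frac{d}{dx}\int_{a(x)}^{b(x)} f(x,t)\,dt = f\bigl(x,b(x)\bigr)\,b'(x)-f\bigl(x,a(x)\bigr)\,a'(x)+\int_{a(x)}^{b(x)}\frac{\partial f}{\partial x}(x,t)\,dt,$$
which reduces to ##\frac{d}{dx}\int_a^x f(t)\,dt=f(x)## when the lower limit is constant, the upper limit is ##x##, and ##f## does not depend on ##x## directly.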
 
Math_QED said:
Derivative of a definite integral? The definite integral calculates an oriented area. This is a constant. The derivative of a constant equals zero. Therefore, the derivative of a definite integral is zero.

Sorry, I made a mistake. If we simply discard the limits of integration, then what happens? That is, does $$F(x)=\int f(x)\,dx$$ imply $$F'(x)=\int f'(x)\,dx$$?
 
Ahmed Mehedi said:
Sorry, I made a mistake. If we simply discard the limits of integration, then what happens? That is, does $$F(x)=\int f(x)\,dx$$ imply $$F'(x)=\int f'(x)\,dx$$?
No. By definition, ##\int f(x)\,dx## is a function ##F(x)## satisfying ##F'(x) = f(x)##. So rather than ##\int f'(x)\,dx##, you simply get ##F'(x) = f(x)##.

What you wrote is not entirely false, but then you have to assume that ##f## is differentiable, which need not be the case. You also have to take into account the annoying constants of integration.
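To spell out the constants issue with a concrete example (my own, not from the thread): take ##f(x)=x^2##. Then
$$F(x)=\int x^2\,dx=\frac{x^3}{3}+C,\qquad F'(x)=x^2=f(x),$$
while
$$\int f'(x)\,dx=\int 2x\,dx=x^2+C',$$
which agrees with ##F'(x)## only up to the constant ##C'##.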
 
Math_QED said:
No. By definition, ##\int f(x)\,dx## is a function ##F(x)## satisfying ##F'(x) = f(x)##. So rather than ##\int f'(x)\,dx##, you simply get ##F'(x) = f(x)##.

What you wrote is not entirely false, but then you have to assume that ##f## is differentiable, which need not be the case. You also have to take into account the annoying constants of integration.

Can't we just differentiate both sides of the first line and pass the differentiation operator inside the integration operator?
 
Math_QED said:
No. By definition, ##\int f(x)\,dx## is a function ##F(x)## satisfying ##F'(x) = f(x)##. So rather than ##\int f'(x)\,dx##, you simply get ##F'(x) = f(x)##.

What you wrote is not entirely false, but then you have to assume that ##f## is differentiable, which need not be the case. You also have to take into account the annoying constants of integration.

I got your answer. But why do you assume that ##f## is not differentiable?
 
Ahmed Mehedi said:
I got your answer. But why do you assume that ##f## is not differentiable?

##F## can be a primitive function of ##f## without ##f## being differentiable. For example, take ##f(x)=|x|##. This is not differentiable at ##0##, but by the fundamental theorem of calculus there exists a primitive ##F## of ##f##.
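Explicitly (a worked-out version of the example above), one such primitive is
$$F(x)=\frac{x\,|x|}{2}=\begin{cases}x^2/2, & x\ge 0,\\ -x^2/2, & x<0,\end{cases}$$
and ##F'(x)=|x|## for every ##x##, including ##x=0##, even though ##|x|## itself is not differentiable at ##0##.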
 