Integration by Substitution Using Infinite Sums

Conductivity
I have seen ProofWiki's proof, which can be found here: https://proofwiki.org/wiki/Integration_by_Substitution
However, sometimes we have problems where the integrand is ##f(g(x))## times ##g'(x)## times ##dx##; we use substitution and it works, but the proof doesn't cover this case.

I was wondering if you can prove why it works through infinite sums, for example
## f(g(x))\, g'(x)\, dx + f(g(x+dx))\, g'(x+dx)\, dx + \dots ##
If I can change that to
## f(y)\, dy + f(y+dy)\, dy + \dots ## where you set ##y = g(x)##,
it would satisfy me. But is it possible to prove it using this?
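For concreteness (picking ##g(x) = x^2## and ##f(y) = \cos y## just as an illustration), the two sums I have in mind would be
## \cos(x^2)\, 2x\, dx + \cos((x+dx)^2)\, 2(x+dx)\, dx + \dots ##
and, with ##y = x^2##,
## \cos(y)\, dy + \cos(y+dy)\, dy + \dots ##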
 
Substitution works because the chain rule for derivatives works:

## [f(g(x))]'=f'(g(x))g'(x) ##

If you integrate both sides, you obtain

## \int [f(g(x))]' dx=\int f'(g(x))g'(x) dx ##

This is the same as ##\int f'(g(x))g'(x)\, dx = f(g(x)) + c##, because the integral is the antiderivative, so ##\int [f(g(x))]'\, dx = f(g(x)) + c##. Formally, this can be done by substituting the function ##g(x) = t## in the integral ##\int f'(g(x))g'(x)\, dx##; differentiating gives ##g'(x)\, dx = dt## (we do a substitution on the differentials), so ##\int f'(g(x))g'(x)\, dx = \int f'(t)\, dt = f(t) + c##, and returning to ##g## we have ##\int f'(g(x))g'(x)\, dx = f(g(x)) + c##.
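For example (choosing ##f'(t) = \cos t## and ##g(x) = x^2## just as an illustration): to compute ##\int \cos(x^2)\, 2x\, dx##, set ##t = x^2##, so ##dt = 2x\, dx## and
## \int \cos(x^2)\, 2x\, dx = \int \cos t\, dt = \sin t + c = \sin(x^2) + c. ##
Differentiating back, ##[\sin(x^2)]' = \cos(x^2)\, 2x##, which is exactly the chain rule above.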

Ssnow
 
Ssnow said:
Substitution works because the chain rule for derivatives works:

## [f(g(x))]'=f'(g(x))g'(x) ##
...
Oh, can I think of it like this?
## [f(g(x))]'=f'(g(x))g'(x) ##
This is just ##\frac{d}{dx} f(g(x))##. So if I make a variable called ##u = g(x)##, then ##du/dx = g'(x)##; flipping this over gives ##dx/du = 1/g'(x)##, and we can multiply it by ##\frac{d}{dx} f(g(x))## to get ##\frac{d}{du} f(g(x))## by the chain rule, and then we integrate with respect to ##u##.
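For instance (taking ##u = x^2## just as an illustration, to check the bookkeeping): ##\frac{d}{dx} f(x^2) = f'(x^2)\, 2x## and ##du/dx = 2x##, so multiplying by ##dx/du = \frac{1}{2x}## gives ##\frac{d}{du} f(x^2) = f'(x^2) = f'(u)##, and integrating with respect to ##u## gives ##f(u) + c = f(x^2) + c##.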

What bothers me is: why is ##dx## substitutable, rather than just a notation to let us know what we are integrating with respect to?
I know that it refers to something infinitesimal, and the integral just means the infinite sum of ##f(x)## multiplied by ##dx##, which would make sense in cases like these:
## \frac{du}{dx} = \frac{1}{u} ##
## u\, du = 1\, dx ##
## \int u\, du = \int 1\, dx ##
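(Carrying that out, for what it's worth: ##\frac{u^2}{2} = x + C##, which you can check by differentiating both sides with respect to ##x## and using ##du/dx = 1/u##.)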

Now I can apply the same logic to the substitution rule. I can assign a variable ##u## and say ##dx = du/g'(x)##, then substitute it and integrate to get the infinite sum ##\int f(u)\, du##. But I still prefer the first argument.
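Written out, the formal manipulation I mean is
## \int f(g(x))\, g'(x)\, dx = \int f(u)\, g'(x)\, \frac{du}{g'(x)} = \int f(u)\, du, ##
where ##u = g(x)##.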

Excuse me if I am talking nonsense; I'm just getting started with integration.
 
Conductivity said:
Oh, can I think of it like this?
## [f(g(x))]'=f'(g(x))g'(x) ##
This is just ##\frac{d}{dx} f(g(x))##. So if I make a variable called ##u = g(x)##, then ##du/dx = g'(x)##; flipping this over gives ##dx/du = 1/g'(x)##, and we can multiply it by ##\frac{d}{dx} f(g(x))## to get ##\frac{d}{du} f(g(x))## by the chain rule, and then we integrate with respect to ##u##.

Yes.

Conductivity said:
Why is ##dx## substitutable, rather than just a notation to let us know what we are integrating with respect to?

I don't know if this can help, but the fact is that the presence of the derivative inside the integral permits you to substitute "the differential", that is, the expression ##g'(x)dx## (which you can see as an "incremental quantity"), with a much simpler one such as ##du##. As a consequence, your integral will be simpler than the first (this is similar to the integration by parts rule).
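For example (choosing ##g(x) = x^2## only as an illustration): in ##\int e^{x^2}\, 2x\, dx##, the differential ##2x\, dx## becomes ##du## with ##u = x^2##, and the integral simplifies to ##\int e^{u}\, du = e^{u} + c = e^{x^2} + c##.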

Conductivity said:
Excuse me if I am talking nonsense; I'm just getting started with integration.

Not at all, what you say is perfectly reasonable :smile:.

Ssnow
 