Dominated Convergence theorem

steven187
hello all

I have been researching the dominated convergence theorem so that I can use it to prove the relationship between the gamma function and the Riemann zeta function. What I need is a simple version of the dominated convergence theorem that lets me swap the summation sign and the integral sign in order to prove the relationship. After doing some research I have only found information relating to measure theory, and I don't understand how that relates to how I am going to use it. Does anybody have any ideas?
 
Are you trying to prove something like \sum_{n=1}^{\infty}\int_{0}^{\infty}x^{s-1}e^{-nx}dx=\int_{0}^{\infty}x^{s-1}/(e^x-1)dx?

In this case your sequence of functions is bounded in absolute value by x^{\sigma-1}/(e^x-1), where \sigma is the real part of s, and your sequence of functions converges uniformly on compact sets, so you should be able to apply even the strictest versions of dominated convergence (of course, we're also assuming \sigma>1 here).

Or are you having trouble finding a statement for the Riemann integral? If a function is Riemann integrable, it is Lebesgue integrable and these integrals are equal, so you can translate a Lebesgue theorem to a Riemann one (provided you meet all the hypotheses, of course). In any case, the following is sufficient here (this is from Rudin but should be in most analysis texts in some form):

If g,\ f_n are Riemann integrable on [t,T] for all 0<t<T<\infty, |f_n|\le g, f_n converges to f uniformly on compact subsets, and \int_{0}^{\infty}g(x)dx exists and is finite, then \lim_{n\rightarrow\infty}\int_{0}^{\infty}f_n(x)dx=\int_{0}^{\infty}f(x)dx.
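The conclusion of the theorem can be checked numerically in the special case s=2 (an assumption chosen for illustration, since then x^{s-1}=x and the term-wise integrals \int_{0}^{\infty}xe^{-nx}dx evaluate exactly to 1/n^2, so both sides should equal \zeta(2)=\pi^2/6). A minimal sketch in Python, using a plain trapezoid rule:

```python
import math

# Assumption: s = 2, so the integrand on the right is x/(e^x - 1).
# Left side:  sum over n of  integral of x e^{-nx} dx  =  sum of 1/n^2.
# Both should approach pi^2/6 = zeta(2) * Gamma(2).

def integrate(f, a, b, steps=100_000):
    """Composite trapezoid rule on [a, b]."""
    h = (b - a) / steps
    total = 0.5 * (f(a) + f(b))
    for i in range(1, steps):
        total += f(a + i * h)
    return total * h

# Right-hand side: x/(e^x - 1); near x = 0 the integrand tends to 1,
# so we start just above 0, and cut off at 40 where e^{-x} is negligible.
rhs = integrate(lambda x: x / math.expm1(x), 1e-9, 40.0)

# Left-hand side: each term-wise integral is exactly 1/n^2.
lhs = sum(1.0 / n**2 for n in range(1, 50_000))

print(rhs, lhs, math.pi**2 / 6)
```

Both printed values agree with \pi^2/6 \approx 1.6449 to within the truncation error of the sum and the quadrature.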
 
Well, yes, I am trying to prove this:

\sum_{n=1}^{\infty}\int_{0}^{\infty}x^{s-1}e^{-nx}dx=\int_{0}^{\infty}x^{s-1}/(e^x-1)dx

What I am trying to do is find a statement of the dominated convergence theorem that allows me to interchange the summation sign and the integral sign. Are there certain conditions that have to be met to do such a step, or is it just a statement that needs to be cited in the proof? I think an explanation in simple terms of what this theorem means and how it relates to proving this problem would help. As for \sigma, I didn't understand it: if we only take the real part of s, how do we know that it would be bounded in absolute value?
 
steven187 said:
What I am trying to do is find a statement of the dominated convergence theorem that allows me to interchange the summation sign and the integral sign.

Then look at the statement I provided in my last post.

steven187 said:
Are there certain conditions that have to be met to do such a step, or is it just a statement that needs to be cited in the proof?

If you want to apply any theorem, there are always conditions that have to be met first.

steven187 said:
I think an explanation in simple terms of what this theorem means and how it relates to proving this problem would help.

Well, it gives sufficient conditions under which the limit of the integrals of a sequence of functions equals the integral of the limit of those functions, i.e. the swapping of the limit with the integral sign. For this problem your infinite sum is hiding the limit: the f_n in the theorem is just the nth partial sum, f_n(x)=\sum_{k=1}^{n}x^{s-1}e^{-kx}, f is the limit of these sums, x^{s-1}/(e^x-1), and here we can take g to be x^{\sigma-1}/(e^x-1) by the triangle inequality and what follows below. We have uniform convergence on compact sets when the real part of s is greater than 1. So the theorem applies, and the swapping of the infinite sum and the integral is justified.
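The two hypotheses used above, domination |f_n|\le g and pointwise convergence f_n\to f, can be spot-checked numerically. A sketch, again assuming s=2 for concreteness so that f_n(x)=\sum_{k=1}^{n}xe^{-kx} and g(x)=x/(e^x-1):

```python
import math

# Assumption: s = 2. The nth partial sum is a finite geometric sum,
#   f_n(x) = sum_{k=1}^{n} x e^{-kx},
# and the dominating function is g(x) = x/(e^x - 1), which is also
# the pointwise limit of f_n as n -> infinity.

def f_n(x, n):
    return sum(x * math.exp(-k * x) for k in range(1, n + 1))

def g(x):
    return x / math.expm1(x)

for x in [0.1, 0.5, 1.0, 2.0, 5.0]:
    # Domination: |f_n(x)| <= g(x) for every n (here n = 50).
    assert f_n(x, 50) <= g(x) + 1e-12
    # Pointwise convergence: f_n(x) -> g(x) as n grows.
    assert abs(f_n(x, 500) - g(x)) < 1e-9

print("domination and convergence hold at the sample points")
```

The domination is exact here because the partial geometric sums increase monotonically to 1/(e^x-1); the tolerance terms only guard against floating-point rounding.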

steven187 said:
As for \sigma, I didn't understand it: if we only take the real part of s, how do we know that it would be bounded in absolute value?

You are familiar with complex exponents, right? If s=\sigma+it where \sigma and t are real, then x^{s-1}=x^{\sigma-1}x^{it}=x^{\sigma-1}e^{it\log{x}}=x^{\sigma-1}(\cos{(t\log{x})}+i\sin{(t\log{x})}), so |x^{s-1}|=|x^{\sigma-1}|=x^{\sigma-1}. This shouldn't be new to you. (Remember x>0 here.)
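The identity |x^{s-1}|=x^{\sigma-1} is easy to check directly, since Python supports raising a positive real to a complex power. A quick sketch (the particular value of s below is arbitrary; any s with \sigma>1 works in the proof):

```python
# Check |x^{s-1}| = x^{sigma-1} for x > 0, where sigma = Re(s).
# The value of s here is an arbitrary example, not special to the proof.
s = 2.5 + 3.0j
sigma = s.real

for x in [0.3, 1.0, 4.7]:
    modulus = abs(x ** (s - 1))   # |x^{s-1}|, complex power then modulus
    expected = x ** (sigma - 1)   # x^{sigma-1}, purely real
    assert abs(modulus - expected) < 1e-12

print("modulus check passed")
```

The x^{it} factor has modulus 1, so only the real exponent \sigma-1 contributes to the absolute value, which is exactly why the dominating function g only involves \sigma.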
 