## Dominated Convergence Theorem

Hello all,

I have been reading about the dominated convergence theorem so that I can use it to prove the relationship between the gamma function and the Riemann zeta function. What I need is a simple version of the dominated convergence theorem that would let me swap the summation sign and the integral sign in order to prove the relationship. After doing some research I have only found statements in terms of measure theory, and I don't understand how those relate to how I'm going to use it. Does anybody have any ideas?
Recognitions: Homework Help, Science Advisor

Are you trying to prove something like $\sum_{n=1}^{\infty}\int_{0}^{\infty}x^{s-1}e^{-nx}\,dx=\int_{0}^{\infty}x^{s-1}/(e^x-1)\,dx?$ In this case your sequence of functions is bounded in absolute value by $$x^{\sigma-1}/(e^x-1)$$, where $$\sigma$$ is the real part of s, and your sequence of functions is uniformly convergent on compact sets, so you should be able to apply even the strictest versions of dominated convergence (of course we're also assuming $$\sigma>1$$ here).

Or are you having trouble finding a statement for the Riemann integral? If a function is Riemann integrable, it is Lebesgue integrable and the two integrals are equal, so you can translate a Lebesgue theorem into a Riemann one (provided you meet all the hypotheses, of course). In any case, the following is sufficient here (this is from Rudin but should be in most analysis texts in some form): if $$g,\ f_n$$ are Riemann integrable on $[t,T]$ for all $0<t<T<\infty$, $|f_n|\le g$, $f_n\to f$ uniformly on compact subsets of $(0,\infty)$, and $\int_0^\infty g(x)\,dx$ converges, then $\int_0^\infty f_n(x)\,dx\to\int_0^\infty f(x)\,dx$.

 Well, yes, I'm trying to prove $\sum_{n=1}^{\infty}\int_{0}^{\infty}x^{s-1}e^{-nx}\,dx=\int_{0}^{\infty}x^{s-1}/(e^x-1)\,dx.$ What I'm trying to do is find a statement of the dominated convergence theorem that allows me to interchange the summation sign and the integral sign. Are there certain conditions that have to be met to take such a step, or is it just a statement that needs to be quoted in the proof? I think an explanation in simple terms of what this theorem means and how it relates to proving this problem would help. As for $$\sigma$$, I didn't understand it: if we only take the real part of s, how do we know the sequence would be bounded in absolute value?
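As a numerical sanity check of the interchange (not a proof), one can compare the two sides at a real test point. The choices below — s = 2, truncating the integral at x = 50, and the simple Simpson's-rule helper — are illustrative assumptions, not part of the argument; for s = 2 each inner integral is $\Gamma(2)/n^2 = 1/n^2$, so both sides should be near $\pi^2/6$.

```python
# Sanity check (stdlib only) of
#   sum_{n>=1} integral_0^inf x^{s-1} e^{-nx} dx = integral_0^inf x^{s-1}/(e^x - 1) dx
# at the arbitrary test point s = 2, where each inner integral is Gamma(2)/n^2 = 1/n^2.
import math

def integrand(x: float) -> float:
    # x/(e^x - 1), with the removable singularity at 0 filled in by its limit, 1.
    return 1.0 if x == 0.0 else x / math.expm1(x)

def simpson(f, a, b, n):
    # Composite Simpson's rule with n (even) subintervals.
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3

# Left side: the termwise integrals Gamma(2)/n^2 = 1/n^2, summed over n.
left = sum(1.0 / n**2 for n in range(1, 200000))

# Right side: the tail of the integrand beyond x = 50 is below 51*e^{-50},
# so truncating the improper integral there is harmless at this precision.
right = simpson(integrand, 0.0, 50.0, 20000)

print(left, right, math.pi**2 / 6)
```

Both printed values should agree with $\pi^2/6$ to several decimal places, which is consistent with (though of course does not prove) the identity.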

Recognitions:
Homework Help


 Quote by steven187 What I'm trying to do is find a statement of the dominated convergence theorem that allows me to interchange the summation sign and the integral sign.
Then look at the statement I provided in my last post.

 Quote by steven187 Are there certain conditions that have to be met to take such a step, or is it just a statement that needs to be quoted in the proof?
If you want to apply any theorem, there are always conditions that have to be met first.

 Quote by steven187 I think an explanation in simple terms of what this theorem means and how it relates to proving this problem would help.
Well, it gives sufficient conditions under which the limit of the integrals of a sequence of functions equals the integral of the limit of those functions, i.e. when you may swap the limit with the integral sign. For this problem your infinite sum is hiding the limit: the $$f_n$$ in the theorem is just the nth partial sum, $f_n(x)=\sum_{k=1}^{n}x^{s-1}e^{-kx}$, and $f$ is the limit of these sums, $$x^{s-1}/(e^x-1)$$. Here we can take $g$ to be $$x^{\sigma-1}/(e^x-1)$$ by the triangle inequality and what follows below. We have uniform convergence on compact sets to the right of the line where the real part of s equals 1, so the theorem applies and the swapping of the infinite sum and the integral is justified.
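The two hypotheses being used — that the partial sums $f_n$ approach $x^{s-1}/(e^x-1)$, and that $|f_n(x)|\le x^{\sigma-1}/(e^x-1)$ — can be spot-checked numerically. The test value $s = 2.5 + i$ and the sample points for $x$ below are arbitrary choices for illustration:

```python
# Check that the partial sums f_n(x) = sum_{k=1}^n x^{s-1} e^{-kx} approach
# x^{s-1}/(e^x - 1), and that |f_n(x)| stays below the dominating function
# g(x) = x^{sigma-1}/(e^x - 1).  The test point s = 2.5 + 1j is arbitrary.
import math

s = 2.5 + 1.0j
sigma = s.real

def partial_sum(x: float, n: int) -> complex:
    # f_n(x): the nth partial sum of the series.
    return sum(x**(s - 1) * math.exp(-k * x) for k in range(1, n + 1))

def limit(x: float) -> complex:
    # f(x) = x^{s-1}/(e^x - 1), the pointwise limit.
    return x**(s - 1) / math.expm1(x)

def dominator(x: float) -> float:
    # g(x) = x^{sigma-1}/(e^x - 1), the dominating function.
    return x**(sigma - 1) / math.expm1(x)

for x in (0.5, 1.0, 3.0):
    f50 = partial_sum(x, 50)
    assert abs(f50 - limit(x)) < 1e-9        # partial sums are close to the limit
    assert abs(f50) <= dominator(x) + 1e-12  # domination: |f_n| <= g
print("checks passed")
```

Here the domination is in fact exact: $|f_n(x)| = x^{\sigma-1}\sum_{k=1}^{n}e^{-kx}$, and the geometric tail only increases the sum toward $1/(e^x-1)$.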

 Quote by steven187 As for $$\sigma$$, I didn't understand it: if we only take the real part of s, how do we know the sequence would be bounded in absolute value?
You are familiar with complex exponents, right? If $$s=\sigma+it$$ where $$\sigma$$ and t are real, then $$x^{s-1}=x^{\sigma-1}x^{it}=x^{\sigma-1}e^{it\log{x}}=x^{\sigma-1}(\cos{(t\log{x})}+i\sin{(t\log{x})})$$, so $$|x^{s-1}|=|x^{\sigma-1}|=x^{\sigma-1}$$. This shouldn't be new to you. (Remember x>0 here, so $$|e^{it\log x}|=1$$.)
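The identity $|x^{s-1}| = x^{\sigma-1}$ is easy to confirm numerically as well; the sample values of x and s below are arbitrary:

```python
# Verify numerically that |x^{s-1}| = x^{sigma-1} for x > 0, sigma = Re(s).
# The sample values of x and s are arbitrary test points.
import math

for x in (0.1, 1.0, 2.5, 10.0):
    for s in (2 + 3j, 1.5 - 0.7j, 3.0 + 0j):
        sigma = s.real
        lhs = abs(x**(s - 1))   # |x^{s-1}|: Python computes the complex power
        rhs = x**(sigma - 1)    # x^{sigma-1}
        assert math.isclose(lhs, rhs, rel_tol=1e-12)
print("ok")
```

The factor $x^{it} = e^{it\log x}$ always has modulus 1 for real x > 0, which is exactly why only the real part $\sigma$ matters for the bound.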