Swapping Integrals and Sums: When is it Justifiable?

member 428835
When using the Riemann integral of an infinite sum, when is it justifiable to interchange the integral and the sum?

$$\int \sum_{i=1}^{\infty} f_i(x)\, dx = \sum_{i=1}^{\infty} \int f_i(x)\, dx$$

Thanks ahead of time for the help!
 
Always.

∫ (a + b) dx = ∫ a dx + ∫ b dx
 
the_wolfman said:
Always.

∫ (a + b) dx = ∫ a dx + ∫ b dx

No, the sums in joshmccraney's post are infinite sums, so you cannot guarantee that exchanging the integral and the sum is valid. Usually the series needs to be uniformly convergent on the interval of integration to justify swapping the sum and the integral.
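As a quick numerical sanity check (a minimal sketch, not from the thread): the exponential series ##\sum_{n=0}^{\infty} x^n/n!## converges uniformly to ##e^x## on the bounded interval ##[0,1]##, and there the swap does hold; both sides come out to ##e - 1##.

```python
import math

# The swap is valid here: sum_{n>=0} x^n/n! converges uniformly to e^x on [0, 1].

# Integral of the sum: ∫_0^1 e^x dx = e - 1
integral_of_sum = math.e - 1

# Sum of the integrals: ∫_0^1 x^n/n! dx = 1/((n+1) * n!) = 1/(n+1)!
sum_of_integrals = sum(1.0 / math.factorial(n + 1) for n in range(30))

print(integral_of_sum, sum_of_integrals)  # both ≈ 1.718281828...
```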
 
Mute said:
Usually the series needs to be uniformly convergent on the interval of integration to justify swapping the sum and the integral.

You also need the interval of integration to be bounded (i.e., the integral to be proper); uniform convergence alone won't save you on (−∞, ∞). Wikipedia gives a counterexample.
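One standard counterexample on an unbounded interval (not necessarily the one Wikipedia uses) is the following. Let ##h_n(x) = \frac{1}{n}\,\mathbf{1}_{[0,n]}(x)## (the indicator function of ##[0,n]## scaled by ##1/n##), and set ##\varphi_1 = h_1##, ##\varphi_n = h_n - h_{n-1}## for ##n > 1##. The partial sums telescope to ##h_N##, whose supremum is ##1/N##, so the series converges uniformly on ##\mathbb{R}## to the zero function and
$$\int_{\mathbb{R}} \sum_{n=1}^{\infty} \varphi_n(x)\, dx = 0.$$
On the other hand, ##\int_{\mathbb{R}} h_N(x)\, dx = 1## for every ##N##, so
$$\sum_{n=1}^{\infty} \int_{\mathbb{R}} \varphi_n(x)\, dx = \lim_{N \to \infty} \int_{\mathbb{R}} h_N(x)\, dx = 1 \neq 0.$$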
 
the_wolfman said:
Always.

∫ (a + b) dx = ∫ a dx + ∫ b dx

Here is a counterexample showing that this may not work for infinite sums:

If we define ##g_n(x)## to be a triangle-shaped function with base ##0 \leq x \leq 1/n## and height ##2n^2##, then ##\int_{\mathbb{R}}g_n(x)\, dx = \frac{1}{2}(\textrm{base})(\textrm{height}) = n##. However, ##\lim_{n \rightarrow \infty} g_n(x) = 0## for all ##x##.

Now put ##f_1(x) = g_1(x)##, and ##f_n(x) = g_n(x) - g_{n-1}(x)## for ##n > 1## (equivalently, set ##g_0 \equiv 0## and ##f_n = g_n - g_{n-1}## for all ##n \geq 1##). The sum telescopes: ##\sum_{n=1}^{N} f_n(x) = g_N(x)##, and so ##\sum_{n=1}^{\infty} f_n(x) = \lim_{N\rightarrow \infty}g_N(x) = 0##. Therefore,
$$\int_{\mathbb{R}} \sum_{n=1}^{\infty}f_n(x) dx = 0$$
On the other hand,
$$\begin{align}
\sum_{n=1}^{\infty} \int_{\mathbb{R}} f_n(x)\, dx &= \sum_{n=1}^{\infty} \int_{\mathbb{R}} (g_n(x) - g_{n-1}(x))\, dx \\
&= \sum_{n=1}^{\infty} \left( \int_{\mathbb{R}} g_n(x)\, dx - \int_{\mathbb{R}} g_{n-1}(x)\, dx \right) \\
&= \sum_{n=1}^{\infty} (n - (n-1)) \\
&= \sum_{n=1}^{\infty} 1 = \infty
\end{align}$$
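Here is a small numerical sketch of this counterexample (not from the thread; it assumes the apex of each triangle sits at the midpoint ##x = 1/(2n)##, as described in a later post):

```python
import numpy as np

def g(n, x):
    """Triangle on [0, 1/n] with height 2*n**2; apex assumed at x = 1/(2n)."""
    x = np.asarray(x, dtype=float)
    rising = 4 * n**3 * x                  # edge from (0, 0) up to (1/(2n), 2n^2)
    falling = 4 * n**3 * (1.0 / n - x)     # edge from (1/(2n), 2n^2) down to (1/n, 0)
    y = np.where(x <= 1.0 / (2 * n), rising, falling)
    return np.where((x >= 0.0) & (x <= 1.0 / n), y, 0.0)

# The partial sum sum_{n=1}^{N} f_n telescopes to g_N.
for N in (1, 5, 25):
    xs = np.linspace(0.0, 1.0 / N, 100_001)           # g_N vanishes outside [0, 1/N]
    ys = g(N, xs)
    area = np.sum(0.5 * (ys[1:] + ys[:-1]) * (xs[1] - xs[0]))  # trapezoid rule
    print(N, area)                                    # ≈ 1, 5, 25: integral of the partial sum is N

# Pointwise, at any fixed x > 0 the partial sums are eventually 0.
x0 = 0.01
print([float(g(N, x0)) for N in (10, 100, 1000)])     # 40.0, then 0.0 once 1/N <= x0
```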
 
Of course the Lebesgue integral is much better behaved. In fact, a major reason we use the Lebesgue integral is to have theorems that allow you to switch an integral with a limit (or a series).

There are two important theorems in Lebesgue theory: the monotone convergence theorem and the dominated convergence theorem.

Monotone convergence states that you can switch the series and the integral if ##f_i(x) \geq 0## for all ##i## and ##x##.
The dominated convergence theorem tells you that you can switch if ##\sum_{i=0}^{+\infty} |f_i(x)| \leq g(x)## for some function ##g## such that ##\int g < +\infty##. (Here we can take ##g## equal to the series of absolute values itself.)
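As a concrete illustration (not from the thread): take ##f_n(x) = x^n## on ##[0, 1/2]##. Then ##\sum_{n=0}^{\infty} |f_n(x)| = \frac{1}{1-x} \leq 2##, which is integrable on ##[0, 1/2]##, so both theorems apply and the swap gives
$$\sum_{n=0}^{\infty} \int_0^{1/2} x^n\, dx = \sum_{n=0}^{\infty} \frac{1}{(n+1)\,2^{n+1}} = \ln 2 = \int_0^{1/2} \frac{dx}{1-x} = \int_0^{1/2} \sum_{n=0}^{\infty} x^n\, dx.$$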
 
Oops, my bad. I agree that the series needs to be convergent.


jbunniii, in your example, is it true that in the limit as ##n## goes to infinity, ##g_n = 0## for all ##x##? Naively I'd expect some sort of singular behaviour at ##x = 0##.

If I integrate the sum of ##f_n## from ##n = 1## to ##N##, then I'd get ##N## for all finite ##N##. If I then take the limit as ##N## goes to infinity, I'd get infinity, not zero.
 
the_wolfman said:
jbunniii, in your example, is it true that in the limit as ##n## goes to infinity, ##g_n = 0## for all ##x##? Naively I'd expect some sort of singular behaviour at ##x = 0##.
Let me describe ##g_n(x)## more carefully. Picture an isosceles triangle, whose base is exactly ##[0, 1/n]##, and whose height is ##2n^2##. The coordinates of the three vertices are ##(0, 0)##, ##(1/n, 0)##, and ##(1/(2n), 2n^2)##. I could write a piecewise formula for this but it would just make it more confusing. The function is zero for all ##x \leq 0## and for all ##x \geq 1/n##. The only nonzero portion is in the interval ##(0,1/n)##.
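For readers who do want it written out, one explicit piecewise expression consistent with those three vertices is
$$g_n(x) = \begin{cases} 4n^3 x, & 0 \leq x \leq \frac{1}{2n}, \\ 4n^3\left(\frac{1}{n} - x\right), & \frac{1}{2n} \leq x \leq \frac{1}{n}, \\ 0, & \text{otherwise}. \end{cases}$$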

I claim that ##\lim_{n \rightarrow \infty} g_n(x) = 0## for all ##x \in \mathbb{R}##. This is certainly true for all ##x \leq 0## and all ##x \geq 1## because all of the ##g_n## are zero there. All that remains is to check ##(0,1)##. Choose any ##x \in (0,1)##. There exists an ##N## such that ##1/n < x## for all ##n \geq N##. Therefore ##g_n(x) = 0## for all ##n \geq N##, so certainly ##\lim_{n \rightarrow \infty}g_n(x) = 0##. We have shown that this limit is true for all real ##x##.
If I integrate the sum of ##f_n## from ##n = 1## to ##N##, then I'd get ##N## for all finite ##N##. If I then take the limit as ##N## goes to infinity, I'd get infinity, not zero.
Yes, I don't claim that ##\lim_{n \rightarrow \infty} \int_{\mathbb{R}}g_n(x) dx = 0##. Indeed, this limit is infinite as you said. But the sequence of functions does converge pointwise to zero.
 
the_wolfman said:
jbunniii, in your example, is it true that in the limit as ##n## goes to infinity, ##g_n = 0## for all ##x##? Naively I'd expect some sort of singular behaviour at ##x = 0##.

If I integrate the sum of ##f_n## from ##n = 1## to ##N##, then I'd get ##N## for all finite ##N##. If I then take the limit as ##N## goes to infinity, I'd get infinity, not zero.

I'm not sure what you are arguing. It's the same as the second half of jbunniii's post.
 
To continue this topic a little bit, I thought of a related question which the more rigorous mathematicians may be able to answer. Often in physics we encounter cases in which swapping the order of the integral and sum is invalid, but the result is an asymptotic series rather than absolute nonsense. Does anyone know under what conditions swapping integrals and sums gives a legitimate asymptotic series?
 
