Monotone convergence - help required

woundedtiger4
Hi all,

http://www.scribd.com/doc/100079521/Document-1

I am trying to learn the monotone convergence theorem, and I am stuck at one specific point. On the first page it says that ∫_{-∞}^{∞} f_n(x) dx = 1 for every n, but the almost-everywhere limit function is identically zero. What does this mean? How can the first be equal to 1 while the other is equal to zero?

Thanks in advance.
 
hi woundedtiger4! :smile:
woundedtiger4 said:
… on the first page it says that ∫_{-∞}^{∞} f_n(x) dx = 1 for every n, but the almost-everywhere limit function is identically zero. What does this mean? How can the first be equal to 1 while the other is equal to zero?

i don't understand why you're asking :confused:

the limit (as n -> ∞) is obviously 0 (everywhere except x = 0)

(and the integral happens to be 1, for all n, though that's less easy to prove)
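To make this concrete, here is one standard choice of such a sequence (an assumption on my part; the linked Scribd document may use a different example): take f_n to be the normal density with mean 0 and standard deviation 1/n,

```latex
f_n(x) = \frac{n}{\sqrt{2\pi}}\, e^{-n^2 x^2 / 2}.
```

For any fixed x ≠ 0 the exponential factor e^{-n² x² / 2} decays to 0 much faster than the factor n grows, so f_n(x) → 0; at x = 0, f_n(0) = n/√(2π) → ∞. At the same time, the substitution u = nx gives

```latex
\int_{-\infty}^{\infty} \frac{n}{\sqrt{2\pi}}\, e^{-n^2 x^2 / 2}\, dx
  = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-u^2 / 2}\, du = 1
```

for every n. So the limit of the integrals is 1 while the integral of the (a.e.) limit is 0; note that for this choice the sequence is not monotone in n (it increases at x = 0 but decreases elsewhere), so the monotone convergence theorem does not apply to force the two to agree.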
 
Draw yourself a picture. It might help you see what's going on. You have a sequence of normal densities with decreasing variance. For each n, the density must integrate to 1 because it's a normal density. But as n increases, the density gets narrower (by virtue of decreasing variance) and taller. So the sequence of densities converges to 0 everywhere except at x=0 where it's getting taller with increasing n. This should all be clear algebraically but sometimes a picture helps to clarify the concept.
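The picture can also be checked numerically. The sketch below assumes f_n is the normal density with mean 0 and standard deviation 1/n (one concrete choice; the linked document may use another), and the helper names `f` and `integral` are just illustrative:

```python
import math

def f(n, x):
    # Normal density with mean 0 and standard deviation 1/n
    # (an assumed concrete choice of f_n)
    sigma = 1.0 / n
    return math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def integral(n, a=-10.0, b=10.0, steps=200_000):
    # Crude midpoint rule; for these densities the tails beyond
    # [-10, 10] contribute a negligible amount
    h = (b - a) / steps
    return sum(f(n, a + (i + 0.5) * h) for i in range(steps)) * h

for n in (1, 5, 25):
    # The integral stays close to 1 for every n, while the pointwise
    # value at any fixed x != 0 shrinks toward 0 and the peak at x = 0 grows
    print(n, round(integral(n), 4), f(n, 0.5), f(n, 0.0))
```

Running this shows the tension directly: each row has integral ≈ 1, yet the column of values at x = 0.5 collapses to 0 as n grows, which is exactly the narrowing-and-growing-taller behaviour described above.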
 
alan2 said:
Draw yourself a picture. It might help you see what's going on. You have a sequence of normal densities with decreasing variance. For each n, the density must integrate to 1 because it's a normal density. But as n increases, the density gets narrower (by virtue of decreasing variance) and taller. So the sequence of densities converges to 0 everywhere except at x=0 where it's getting taller with increasing n. This should all be clear algebraically but sometimes a picture helps to clarify the concept.

thanks a tonne
 