
Almost sure convergence of sum of rv

  1. Aug 14, 2012 #1
    I'm trying to prove that if [itex]\{X_n\}[/itex] is a sequence of independent random variables with [itex]E(X_n)=0[/itex] for all n and [itex]\sum_{n}E(X_n^2) <\infty[/itex], then [itex]\sum_{n}X_n[/itex] converges almost surely.

    What I've got so far is the following: Denote the partial sums by [itex]\{S_n\}[/itex]. To prove almost sure convergence, it suffices (by the Borel–Cantelli lemma) to show that there exists a random variable X such that for all [itex]\epsilon > 0[/itex], [itex]\sum_{n}P(|S_n-X|>\epsilon)<\infty[/itex]. Using Markov's inequality gives [itex]\sum_{n}P(|S_n-X|>\epsilon)\leq\sum_{n}E(|S_n-X|)/\epsilon[/itex], so it would be enough to show that this last sum converges. But I can't find any way to do this.
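
    (A quick sketch of the Borel–Cantelli step behind that reduction, which is standard and not specific to this problem: if [itex]\sum_{n}P(|S_n-X|>\epsilon)<\infty[/itex] for every [itex]\epsilon>0[/itex], the first Borel–Cantelli lemma gives [itex]P(|S_n-X|>\epsilon \text{ i.o.})=0[/itex]; taking [itex]\epsilon=1/k[/itex] for [itex]k=1,2,\dots[/itex] and a countable union of null sets then yields [itex]S_n\to X[/itex] almost surely.)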

    It's easy to show that [itex]\sum_{n}X_n[/itex] converges in the [itex]L^2[/itex] sense, so it also converges in the [itex]L^1[/itex] sense, which means that [itex]\lim_{n\to\infty}E(|S_n-X|)=0[/itex], but this isn't strong enough to imply that [itex]\sum_{n}E(|S_n-X|)<\infty[/itex], which is what I need.
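
    For reference, the orthogonality computation behind that [itex]L^2[/itex] claim: for [itex]m<n[/itex], independence and the zero means kill the cross terms, so [itex]E[(S_n-S_m)^2]=\sum_{k=m+1}^{n}E(X_k^2)\to 0[/itex] as [itex]m,n\to\infty[/itex], because the series [itex]\sum_{k}E(X_k^2)[/itex] converges. Hence [itex]\{S_n\}[/itex] is Cauchy in [itex]L^2[/itex] and has an [itex]L^2[/itex] limit X.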

    Can anyone help me with this?
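
    If it helps to see the statement numerically, here is a minimal simulation sketch (Python with NumPy; the concrete choice [itex]X_n=\pm 1/n[/itex] with probability 1/2 each is just an illustrative assumption satisfying the hypotheses, since [itex]E(X_n)=0[/itex] and [itex]\sum_n E(X_n^2)=\sum_n 1/n^2<\infty[/itex]):

[code]
import numpy as np

rng = np.random.default_rng(0)

n_terms = 10_000   # summands per sample path
n_paths = 5        # independent sample paths

# X_n = +1/n or -1/n with probability 1/2 each, so E(X_n) = 0 and
# E(X_n^2) = 1/n^2, which is summable, matching the hypotheses.
signs = rng.choice([-1.0, 1.0], size=(n_paths, n_terms))
terms = signs / np.arange(1, n_terms + 1)

# Partial sums S_n along each path; each row should settle near a limit.
partial_sums = np.cumsum(terms, axis=1)

for i in range(n_paths):
    print(f"path {i}: S_100 = {partial_sums[i, 99]:+.4f}, "
          f"S_10000 = {partial_sums[i, -1]:+.4f}")
[/code]

    Different paths settle at different limits (the limit is itself random), but within each path [itex]S_{100}[/itex] and [itex]S_{10000}[/itex] should typically agree to within roughly [itex]\sqrt{\sum_{n>100}1/n^2}\approx 0.1[/itex].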
     
  3. Aug 14, 2012 #2

    chiro

    Science Advisor

    Hey logarithmic.

    Just for clarification: is the thing you are converging to a random variable? (I'm guessing it is.)

    If this is the case, then if you show that the moments converge to particular values, the distribution represented by those moments corresponds to the resulting random variable.

    If they are independent, then the MGF of the final random variable is the product of all the MGFs, and if this defines a valid, unique random variable, then you're done. As long as the random variable is a valid random variable, that is what matters.
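
    Concretely, the standard fact for independent random variables is [itex]M_{S_n}(t)=E(e^{tS_n})=\prod_{k=1}^{n}E(e^{tX_k})=\prod_{k=1}^{n}M_{X_k}(t)[/itex], whenever these MGFs exist.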

    You will not in general get a sum of random variables like that to converge to a single constant value (like you would for, say, a consistent estimator of a parameter or a mean).

    The other thing is that the expectation on its own is not an indicator of whether the sum converges. For example, a Cauchy distribution doesn't have a finite mean, but it still assigns non-zero probability to bounded intervals of values.

    At a more abstract level, you could even argue, through analysis, that if the MGF is bounded by some function raised to the nth power (for n random variables), and that bound shows a unique PDF exists which assigns non-zero probability only to finite values, then you have shown that the random variable converges for all possible probabilities.

    This kind of argument shows convergence in distribution, and there are many ways to show it, but effectively they are all about showing that a particular limiting distribution exists.

    It might also be useful to consider that the variance of the final random variable is finite and how this affects the nature of the final distribution.
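
    For what it's worth, under the stated hypotheses independence and the zero means give [itex]\operatorname{Var}(S_n)=\sum_{k=1}^{n}E(X_k^2)\leq\sum_{k}E(X_k^2)<\infty[/itex], so the variances of the partial sums are uniformly bounded.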
     
  4. Aug 14, 2012 #3
    Yes, X is a random variable, not a number. I'm not sure if looking at the MGF will help though, as the only results I know about MGFs and convergence (e.g. Levy's continuity theorem, and the method-of-moments theorem that if [itex]E(X_n^p)\to E(X^p)[/itex] for all p, and X is determined by its moments, then we have convergence in distribution) are about convergence in distribution, not almost sure convergence.
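
    For reference, Levy's continuity theorem is usually stated for characteristic functions: if [itex]\phi_{X_n}(t)=E(e^{itX_n})\to\phi(t)[/itex] pointwise and [itex]\phi[/itex] is continuous at 0, then [itex]\phi[/itex] is the characteristic function of some random variable X and [itex]X_n\to X[/itex] in distribution; so, as noted, it only delivers convergence in distribution.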

    Is there some result that relates MGFs or moments to almost sure convergence?
     
  5. Aug 14, 2012 #4

    chiro

    Science Advisor

    If you show the moments converge, you show the PDF converges: the MGF is unique to a distribution, just like the Laplace or Fourier transform of a valid function is unique to that function.
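
    For concreteness, the relationship being invoked is [itex]M_X(t)=E(e^{tX})=\sum_{p=0}^{\infty}\frac{t^p}{p!}E(X^p)[/itex], so [itex]E(X^p)=M_X^{(p)}(0)[/itex]; this holds when the MGF exists in a neighbourhood of 0.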
     
  6. Aug 14, 2012 #5
    If the PDF converges, that's convergence in distribution, which is weaker than almost sure convergence.
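
    A standard example of the gap: let X be Bernoulli(1/2) and set [itex]X_n=1-X[/itex] for every n. Each [itex]X_n[/itex] has the same distribution as X, so trivially [itex]X_n\to X[/itex] in distribution, but [itex]|X_n-X|=1[/itex] for every n and every outcome, so [itex]X_n[/itex] does not converge to X almost surely (or even in probability).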
     
  7. Aug 14, 2012 #6

    chiro

    Science Advisor

    Ok, sorry: what's the definition of almost sure convergence? It kinda sounds made up (i.e. converging "almost surely" is not what I'd expect a mathematician to say), and I know it's not your definition either ;).
     
  8. Aug 14, 2012 #7
  9. Aug 14, 2012 #8

    chiro

    Science Advisor

    I don't know how that can be weaker: proving the case of equality is even stronger than proving the limiting case.

    Besides, with Levy's theorem that you stated, if you can use that then you're done, since all the moments of a distribution uniquely define the distribution. Proving that all the moments converge to particular values means you have proven that the distribution is the one represented by those moments.

    If you want to prove the above, you use the definition of the MGF and show that if the moments exist and are finite, then the MGF exists and is valid, and if that's valid then you can obtain a unique PDF via a Fourier transform, representing the distribution determined by those moments, which is what Levy's theorem gives.
     