# Almost sure convergence of a sum of random variables

1. Aug 14, 2012

### logarithmic

I'm trying to prove that if $\{X_n\}$ is independent and $E(X_n)=0$ for all n, and $\sum_{n}E(X_n^2) <\infty$, then $\sum_{n}X_n$ converges almost surely.

What I've got so far is the following: denote the partial sums by $S_n$. A sufficient condition for almost sure convergence (by the Borel–Cantelli lemma) is that there exists a random variable $X$ such that for all $\epsilon > 0$, $\sum_{n}P(|S_n-X|>\epsilon)<\infty$. Markov's inequality gives $\sum_{n}P(|S_n-X|>\epsilon)\leq\sum_{n}E(|S_n-X|)/\epsilon$, so it would be enough to show that this last sum converges. But I can't find any way to do this.
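As a sanity check of the Markov-inequality step, here is a minimal Monte Carlo sketch (assumed example: an exponential variable stands in for a generic nonnegative variable) confirming that $P(Y>\epsilon)\leq E(Y)/\epsilon$:

```python
import numpy as np

# Monte Carlo check of Markov's inequality: for a nonnegative random
# variable Y and any eps > 0, P(Y > eps) <= E(Y) / eps.
# (Hypothetical example: Y ~ Exponential(1), so E(Y) = 1.)
rng = np.random.default_rng(0)
y = rng.exponential(scale=1.0, size=100_000)

for eps in (0.5, 1.0, 2.0, 4.0):
    empirical = np.mean(y > eps)   # empirical P(Y > eps)
    bound = y.mean() / eps         # Markov bound E(Y) / eps
    assert empirical <= bound
    print(f"eps={eps}: P(Y>eps) ~= {empirical:.4f} <= bound {bound:.4f}")
```

Of course this only checks one term of the series; the difficulty in the post is that summing these bounds over $n$ need not give a finite total.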

It's easy to show that $\sum_{n}X_n$ converges in the $L^2$ sense, so it also converges in the $L^1$ sense, which means that $\lim_{n\to\infty}E(|S_n-X|)=0$, but this isn't strong enough to imply that $\sum_{n}E(|S_n-X|)<\infty$, which is what I need.
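The setup itself is easy to see numerically. A minimal sketch, using the assumed example $X_n = s_n/n$ with $s_n = \pm 1$ equiprobable (so $E(X_n)=0$ and $\sum_n E(X_n^2)=\sum_n 1/n^2 < \infty$): individual sample paths of the partial sums visibly settle down, which is what almost sure convergence predicts.

```python
import numpy as np

# Simulated example (not a proof): X_n = s_n / n with random signs s_n,
# so E(X_n) = 0 and sum E(X_n^2) = sum 1/n^2 < infinity.
# Each sample path of the partial sums S_n should stabilise as n grows.
rng = np.random.default_rng(1)
N = 100_000
for path in range(3):
    signs = rng.choice([-1.0, 1.0], size=N)
    partial = np.cumsum(signs / np.arange(1, N + 1))  # S_1, ..., S_N
    # Oscillation of the path over its second half: shrinks for large n.
    tail_spread = partial[N // 2:].max() - partial[N // 2:].min()
    print(f"path {path}: S_N = {partial[-1]:+.4f}, "
          f"tail spread = {tail_spread:.6f}")
```

The tail spread is tiny because the variance of $S_N - S_{N/2}$ is $\sum_{n>N/2} 1/n^2 \approx 1/N$; the simulation illustrates the claim but does not substitute for the proof.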

Can anyone help me with this?

2. Aug 14, 2012

### chiro

Hey logarithmic.

Just for clarification: is the thing you are converging to a random variable? (I'm guessing it is.)

If this is the case, then if you show that the moments converge to some particular values, the distribution represented by those moments is the one corresponding to the resulting random variable.

If they are independent, then the MGF of the final random variable is the product of all the individual MGFs, and if this defines a valid, unique random variable, then you're done. As long as the result is a valid random variable, that's what matters.

You will not in general get a sum of random variables like that to converge to some single value (as you would for, say, a consistent estimator of a parameter or a mean).

The other thing is that the expectation is not an indicator of whether the value converges. For example, the Cauchy distribution doesn't have a finite mean, but there are still intervals of values on which it concentrates with non-zero probability.
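The Cauchy remark can be illustrated with a quick simulation (a sketch, not part of the original proof question): running means of Cauchy draws keep jumping around, while the running mean of normal draws settles to 0.

```python
import numpy as np

# Illustration: sample means of Cauchy draws do not settle down,
# because the Cauchy distribution has no finite mean. For contrast,
# standard normal sample means converge (law of large numbers).
rng = np.random.default_rng(2)
sizes = [10, 100, 1_000, 10_000, 100_000]

draws = rng.standard_cauchy(size=sizes[-1])
for k in sizes:
    print(f"Cauchy mean of first {k:>6} draws: {draws[:k].mean():+.3f}")

normal = rng.standard_normal(size=sizes[-1])
print(f"normal mean, {sizes[-1]} draws: {normal.mean():+.4f}")
```

The normal running mean is within a few multiples of $1/\sqrt{N}$ of zero, while the Cauchy running means show no such stabilisation.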

At a more abstract level, you could even show, through an analysis argument, that if the MGF is bounded by some function with an exponent of the nth power (for n random variables), and this implies that a unique PDF exists in which only finite values have non-zero probability, then the random variable converges for all possible probabilities.

This kind of argument shows convergence in distribution, and there are many ways to show it, but effectively they are all about showing that a particular distribution exists.

It might also be useful to consider that the variance of the final random variable is finite and how this affects the nature of the final distribution.

3. Aug 14, 2012

### logarithmic

Yes, X is a random variable, not a number. I'm not sure if looking at the MGF will help though, as the only results I know about MGF and convergence (e.g. Levy's continuity theorem, and the theorem that if $E(X_n^p)\to E(X^p)$ for all p, then we have convergence in distribution) are about convergence in distribution, not almost sure convergence.

Is there some result that relates MGFs or moments to almost sure convergence?

4. Aug 14, 2012

### chiro

If you show the moments converge, you show the PDF converges: the MGF is unique to a distribution, just as the Laplace or Fourier transform of a valid function is unique to that function.
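For one concrete case of the moments-determine-the-distribution idea (with the caveat that this holds only under growth conditions such as Carleman's, which the normal satisfies), here is a sketch checking that the sample moments of standard normal draws match the Taylor coefficients of its MGF $e^{t^2/2}$, namely $E[X^{2k}] = (2k-1)!!$:

```python
import numpy as np

# Assumed example: X ~ N(0, 1). Its MGF is exp(t^2 / 2), whose expansion
# gives even moments E[X^(2k)] = (2k-1)!! = 1, 3, 15, ... (odd moments 0).
rng = np.random.default_rng(3)
x = rng.standard_normal(size=1_000_000)

theoretical = {2: 1.0, 4: 3.0, 6: 15.0}  # double factorials (2k-1)!!
for p, theory in theoretical.items():
    empirical = np.mean(x**p)  # sample p-th moment
    print(f"E[X^{p}] ~= {empirical:.3f} (theory {theory})")
```

Note this only verifies that a known distribution has the moments its MGF predicts; it does not by itself address almost sure convergence, which is the sticking point in the thread.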

5. Aug 14, 2012

### logarithmic

If the PDF converges, that's convergence in distribution, which is weaker than almost sure convergence.

6. Aug 14, 2012

### chiro

Ok, sorry: what's the definition of almost sure convergence? It kinda sounds made up (i.e. converging "almost surely" is not what I'd expect a mathematician to say), and I know it's not your definition either ;).

7. Aug 14, 2012

### logarithmic

8. Aug 14, 2012

### chiro

I don't know how that can be weaker: proving the case of equality is even stronger than proving the limiting case.

Besides, with Levy's theorem that you stated, if you can use that then you're done, since all the moments of a distribution uniquely define it. Proving that all the moments converge to particular values means you have proven that the distribution is the one defined by those moments.

If you want to prove the above, use the definition of the MGF and show that if the moments exist and are finite, then the MGF exists and is valid; and if that's valid, then you can get a unique PDF from a Fourier transform that represents the distribution given by those moments, which is what Levy's theorem provides.