I'm trying to prove that if [itex]\{X_n\}[/itex] is a sequence of independent random variables with [itex]E(X_n)=0[/itex] for all n and [itex]\sum_{n}E(X_n^2)<\infty[/itex], then [itex]\sum_{n}X_n[/itex] converges almost surely.

What I've got so far is the following: denote the partial sums by [itex]S_n[/itex]. To prove almost sure convergence, it suffices (by the first Borel–Cantelli lemma) to find a random variable [itex]X[/itex] such that for all [itex]\epsilon > 0[/itex], [itex]\sum_{n}P(|S_n-X|>\epsilon)<\infty[/itex]. The Markov inequality gives [itex]\sum_{n}P(|S_n-X|>\epsilon)\leq\sum_{n}E(|S_n-X|)/\epsilon[/itex], so it would be enough to show that this last sum converges. But I can't find any way to do this.

It's easy to show that [itex]\sum_{n}X_n[/itex] converges in the [itex]L^2[/itex] sense, hence also in the [itex]L^1[/itex] sense; writing [itex]X[/itex] for the limit, this means [itex]\lim_{n\to\infty}E(|S_n-X|)=0[/itex]. But that isn't strong enough to imply [itex]\sum_{n}E(|S_n-X|)<\infty[/itex], which is what I need.
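For reference, here is the [itex]L^2[/itex] step above spelled out: by independence and [itex]E(X_k)=0[/itex], the cross terms [itex]E(X_jX_k)=E(X_j)E(X_k)[/itex] vanish for [itex]j\neq k[/itex], so for [itex]m>n[/itex]

```latex
E\big[(S_m - S_n)^2\big]
  = E\Big[\Big(\sum_{k=n+1}^{m} X_k\Big)^2\Big]
  = \sum_{k=n+1}^{m} E(X_k^2)
  \;\xrightarrow[\;m,n\to\infty\;]{}\; 0,
```

since [itex]\sum_{k}E(X_k^2)<\infty[/itex]. Hence [itex]\{S_n\}[/itex] is Cauchy in [itex]L^2[/itex] and converges in [itex]L^2[/itex] by completeness.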

Can anyone help me with this?
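As a numerical sanity check on the statement itself (not a proof), here's a small simulation of a hypothetical example satisfying the hypotheses: [itex]X_n = \pm 1/n[/itex] with equal probability, so [itex]E(X_n)=0[/itex] and [itex]\sum_n E(X_n^2)=\sum_n 1/n^2<\infty[/itex]. The partial sums along one sample path should settle down.

```python
import random

def partial_sum(N, seed=0):
    """Partial sum S_N = X_1 + ... + X_N, where X_n = +1/n or -1/n
    with equal probability, so E(X_n) = 0 and
    sum_n E(X_n^2) = sum_n 1/n^2 < infinity.
    (Hypothetical example chosen only to illustrate the hypotheses.)"""
    rng = random.Random(seed)  # same seed => same sample-path prefix
    return sum(rng.choice((-1.0, 1.0)) / n for n in range(1, N + 1))

# With a fixed seed, S_{10^4} and S_{10^5} lie on the same sample path;
# the tail variance sum_{n > 10^4} 1/n^2 is about 1e-4, so the theorem
# predicts their difference is tiny.
for seed in range(3):
    diff = abs(partial_sum(10**5, seed) - partial_sum(10**4, seed))
    print(f"seed={seed}: |S_100000 - S_10000| = {diff:.5f}")
```

Each printed difference should be on the order of 0.01 (one standard deviation of the tail is about [itex]\sqrt{10^{-4}}=0.01[/itex]), which is consistent with the partial sums converging along each path.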

**Physics Forums | Science Articles, Homework Help, Discussion**

# Almost sure convergence of a sum of random variables
