I'm trying to prove that if \{X_n\} is independent and E(X_n)=0 for all n, and \sum_{n}E(X_n^2) <\infty, then \sum_{n}X_n converges almost surely.
What I've got so far is the following: Denote the partial sums by \{S_n\}. To prove almost sure convergence, it suffices (by the first Borel–Cantelli lemma) to find a random variable X such that for all \epsilon > 0, \sum_{n}P(|S_n-X|>\epsilon)<\infty. Markov's inequality gives \sum_{n}P(|S_n-X|>\epsilon)\leq\sum_{n}E(|S_n-X|)/\epsilon, so it would be enough to show that this last sum converges. But I can't find any way to do this.
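(To spell out the step I'm relying on: if \sum_{n}P(|S_n-X|>\epsilon)<\infty for every \epsilon>0, then Borel–Cantelli gives P(|S_n-X|>\epsilon \text{ infinitely often})=0 for each \epsilon, and applying this along a sequence \epsilon_k\downarrow 0 yields S_n\to X almost surely.)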
It's easy to show that \sum_{n}X_n converges in the L^2 sense, so it also converges in the L^1 sense, which means that \lim_{n\to\infty}E(|S_n-X|)=0, but this isn't strong enough to imply that \sum_{n}E(|S_n-X|)<\infty, which is what I need.
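In case it helps, here is the L^2 computation I have in mind: by independence and E(X_n)=0, for m>n we get E((S_m-S_n)^2)=\sum_{k=n+1}^{m}E(X_k^2), which tends to 0 as m,n\to\infty because \sum_{n}E(X_n^2)<\infty. So \{S_n\} is Cauchy in L^2 and converges in L^2 to some X, and by Cauchy–Schwarz E(|S_n-X|)\leq E((S_n-X)^2)^{1/2}\to 0, which is the L^1 convergence above.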
Can anyone help me with this?