Almost sure convergence of a sum of random variables

  • Thread starter logarithmic
In summary, almost sure convergence of a series of random variables means that the sequence of partial sums converges to a limit with probability 1. Under the hypotheses discussed in this thread (independence, zero means, summable variances), the partial sums approach a limiting random variable almost surely. It is an important concept in understanding the behavior of random variables and their sums.
  • #1
logarithmic
I'm trying to prove that if [itex]\{X_n\}[/itex] is independent and [itex]E(X_n)=0[/itex] for all n, and [itex]\sum_{n}E(X_n^2) <\infty[/itex], then [itex]\sum_{n}X_n[/itex] converges almost surely.

What I've got so far is the following: denote the partial sums by [itex]\{S_n\}[/itex]. A sufficient condition for almost sure convergence (via the Borel–Cantelli lemma) is that there exists a random variable X such that for all [itex]\epsilon > 0[/itex], [itex]\sum_{n}P(|S_n-X|>\epsilon)<\infty[/itex]. Markov's inequality gives [itex]\sum_{n}P(|S_n-X|>\epsilon)\leq\sum_{n}E(|S_n-X|)/\epsilon[/itex], so it would be enough to show that this last sum converges. But I can't find any way to do this.

It's easy to show that [itex]\sum_{n}X_n[/itex] converges in the [itex]L^2[/itex] sense, so it also converges in the [itex]L^1[/itex] sense, which means that [itex]\lim_{n\to\infty}E(|S_n-X|)=0[/itex], but this isn't strong enough to imply that [itex]\sum_{n}E(|S_n-X|)<\infty[/itex], which is what I need.
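[For reference, one standard route around the Borel–Cantelli bound, sketched here; it uses Kolmogorov's maximal inequality rather than Markov's inequality. By independence and [itex]E(X_n)=0[/itex], for [itex]m<n[/itex] we have [itex]E(|S_n-S_m|^2)=\sum_{k=m+1}^{n}E(X_k^2)[/itex], and Kolmogorov's maximal inequality sharpens this to [itex]P(\max_{m<k\leq n}|S_k-S_m|>\epsilon)\leq \frac{1}{\epsilon^2}\sum_{k=m+1}^{n}E(X_k^2)[/itex]. Letting [itex]n\to\infty[/itex] and then [itex]m\to\infty[/itex], the right-hand side tends to 0 because the tail of the convergent series [itex]\sum_{n}E(X_n^2)[/itex] vanishes; hence [itex](S_n)[/itex] is almost surely a Cauchy sequence and therefore converges almost surely.]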

Can anyone help me with this?
 
  • #2
Hey logarithmic.

Just for clarification: is the thing you are converging to a random variable? (I'm guessing it is.)

If this is the case, then if you show that the moments converge to particular values, the distribution represented by those moments corresponds to the limiting random variable.

If they are independent, then the MGF of the final random variable is the product of all the individual MGFs, and if this defines a valid unique random variable, then you're done. As long as the limit is a valid random variable, that is what matters.

In general, a sum of random variables like that will not converge to a single number (as it would for, say, a consistent estimator of a parameter or a mean).

The other thing is that the expectation alone is not an indicator of whether the sequence converges. For example, a Cauchy distribution doesn't have a finite mean, yet it still assigns non-zero probability to intervals of values.

At a more abstract level, you could even show, through an analysis argument, that if the MGF is bounded by some function with an exponent of the nth power (for n random variables), so that a unique PDF exists putting non-zero probability only on finite values, then the random variable converges for all possible probabilities.

This kind of argument shows convergence in distribution, and there are many ways to show it, but effectively they are all about showing that a particular limiting distribution exists.

It might also be useful to consider that the variance of the final random variable is finite and how this affects the nature of the final distribution.
 
  • #3
Yes, X is a random variable, not a number. I'm not sure if looking at the MGF will help though, as the only results I know about MGF and convergence (e.g. Levy's continuity theorem, and the theorem that if [itex]E(X_n^p)\to E(X^p)[/itex] for all p, then we have convergence in distribution) are about convergence in distribution, not almost sure convergence.

Is there some result that relates MGFs or moments to almost sure convergence?
 
  • #4
logarithmic said:
Yes, X is a random variable, not a number. I'm not sure if looking at the MGF will help though, as the only results I know about MGF and convergence (e.g. Levy's continuity theorem, and the theorem that if [itex]E(X_n^p)\to E(X^p)[/itex] for all p, then we have convergence in distribution) are about convergence in distribution, not almost sure convergence.

Is there some result that relates MGFs or moments to almost sure convergence?

If you show the moments converge you show the PDF converges: the MGF is unique for a distribution just like a Laplace or Fourier transform of a valid function is unique for that function.
 
  • #5
chiro said:
If you show the moments converge you show the PDF converges: the MGF is unique for a distribution just like a Laplace or Fourier transform of a valid function is unique for that function.
If the PDF converges, that's convergence in distribution, which is weaker than almost sure convergence.
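[For reference, the modes of convergence being compared here form a strict hierarchy, a standard fact: [itex]X_n \to X[/itex] almost surely implies [itex]X_n \to X[/itex] in probability, which in turn implies [itex]X_n \to X[/itex] in distribution, and neither implication reverses in general. So an MGF argument, which at best identifies the limiting distribution, cannot by itself yield almost sure convergence.]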
 
  • #6
logarithmic said:
If the PDF converges, that's convergence in distribution, which is weaker than almost sure convergence.

Ok, sorry: what's the definition of almost sure convergence? It kinda sounds made up (i.e. converging "almost surely" is not what I'd expect a mathematician to say), and I know it's not your definition either ;).
 
  • #8
I don't know how that can be weaker: proving the case of equality is even stronger than proving the limiting case.

Besides, with Levy's theorem that you stated, if you can use it then you're done, since all the moments of a distribution (under suitable conditions) uniquely determine the distribution. Proving that all the moments converge to particular values proves that the limiting distribution is the one represented by those moments.

If you want to prove the above, you use the definition of the MGF and show that if the moments exist and are finite, then the MGF exists and is valid; and if that's valid, then an inverse Fourier transform gives a unique PDF for the distribution represented by those moments, the one Levy's theorem identified.
 

1. What is "almost sure convergence" of a sum of random variables?

"Almost sure convergence" refers to the concept in probability theory where the partial sums of a sequence of random variables converge to a limit with probability 1. This means that, outside an event of probability zero, every realization of the sequence of partial sums converges.
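The theorem discussed in this thread can be illustrated numerically. Below is a minimal sketch, assuming the random harmonic series [itex]\sum_n \pm 1/n[/itex] as the example (it satisfies the hypotheses: independent signs, [itex]E(X_n)=0[/itex], and [itex]\sum_n E(X_n^2)=\sum_n 1/n^2<\infty[/itex]); the function name is illustrative, not from the thread.

```python
import numpy as np

def random_harmonic_partial_sums(n_terms, seed=0):
    """Partial sums S_k of X_k = eps_k / k, where eps_k = +/-1 each with
    probability 1/2, independently.

    Here E(X_k) = 0 and sum_k E(X_k^2) = sum_k 1/k^2 < infinity, so by the
    theorem in this thread the partial sums converge almost surely.
    """
    rng = np.random.default_rng(seed)
    signs = rng.choice([-1.0, 1.0], size=n_terms)   # independent random signs
    terms = signs / np.arange(1, n_terms + 1)       # X_k = eps_k / k
    return np.cumsum(terms)                         # S_1, S_2, ..., S_n

s = random_harmonic_partial_sums(100_000)
# Late partial sums barely move: the tail sum_{k>m} 1/k^2 is about 1/m,
# so fluctuations after index m shrink like 1/sqrt(m).
print(s[-1], abs(s[-1] - s[len(s) // 2]))
```

Note that the bounds one can check deterministically here come from the triangle inequality: whatever the signs, [itex]|S_n|\leq\sum_{k\leq n}1/k[/itex], and the late fluctuation is bounded by the tail of the harmonic series.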

2. How is "almost sure convergence" different from other types of convergence?

"Almost sure convergence" is a stronger form of convergence than the other standard modes, such as convergence in probability or convergence in distribution. It guarantees that the sequence of partial sums converges to its limit with probability 1, rather than merely with high probability (convergence in probability) or in law (convergence in distribution).

3. What is the significance of "almost sure convergence" in statistics and probability?

"Almost sure convergence" is an important concept in statistics and probability because it allows us to make more precise and confident predictions about the behavior of a sequence of random variables. It also has practical applications in fields such as finance, where accurate predictions of the behavior of a series of random events are crucial.

4. How is "almost sure convergence" related to the law of large numbers?

"Almost sure convergence" is the mode of convergence in the strong law of large numbers, which states that as the number of trials in a random experiment increases, the average of the outcomes converges almost surely to the expected value.
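The link to the strong law can be sketched via Kronecker's lemma (a standard argument, outlined here for reference): if [itex]Y_n[/itex] are independent with [itex]E(Y_n)=0[/itex] and [itex]\mathrm{Var}(Y_n)\leq\sigma^2[/itex], then [itex]\sum_{n}E((Y_n/n)^2)\leq\sigma^2\sum_{n}1/n^2<\infty[/itex], so [itex]\sum_{n}Y_n/n[/itex] converges almost surely by the theorem discussed in this thread, and Kronecker's lemma then yields [itex]\frac{1}{n}\sum_{k=1}^{n}Y_k\to 0[/itex] almost surely.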

5. Are there any limitations or assumptions to consider when using "almost sure convergence"?

One caveat is that the classical theorems establishing almost sure convergence, such as the one discussed in this thread, typically assume independence (and sometimes identical distribution) of the random variables. The conclusion may fail for sequences that do not satisfy these hypotheses, so it is important to check the properties of the sequence before applying such results.
