Almost sure convergence of a sum of random variables

  • Context: Graduate 
  • Thread starter: logarithmic
  • Tags: Sum

Discussion Overview

The discussion revolves around the almost sure convergence of the sum of independent random variables with zero mean and finite second moments. Participants explore the conditions under which such convergence can be established, particularly focusing on the relationship between moments, moment generating functions (MGFs), and convergence types.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant seeks to prove that if \{X_n\} are independent random variables with E(X_n)=0 and \sum_n E(X_n^2) < \infty, then \sum_n X_n converges almost surely, but gets stuck showing that the relevant series of tail probabilities converges.
  • Another participant suggests that if the moments converge, the distribution represented by those moments could lead to a unique random variable, implying potential convergence.
  • There is a discussion about the role of MGFs in establishing convergence, with some participants questioning their relevance to almost sure convergence as opposed to convergence in distribution.
  • One participant expresses skepticism about the term "almost sure convergence," questioning its mathematical validity and seeking clarification on its definition.
  • Another participant argues that proving equality of moments is stronger than proving limiting cases, suggesting that the existence of a valid MGF could lead to a unique probability density function (PDF) and thus convergence.

Areas of Agreement / Disagreement

Participants express differing views on the relevance of MGFs and moments to almost sure convergence. There is no consensus on whether the convergence of moments directly implies almost sure convergence, and the definition of almost sure convergence itself is debated.

Contextual Notes

Some participants reference specific theorems, such as Lévy's continuity theorem, but there is uncertainty about how these relate to almost sure convergence versus convergence in distribution. The discussion reflects a range of interpretations and assumptions regarding the mathematical concepts involved.

logarithmic
I'm trying to prove that if \{X_n\} is a sequence of independent random variables with E(X_n)=0 for all n, and \sum_n E(X_n^2) < \infty, then \sum_n X_n converges almost surely.

What I've got so far is the following. Denote the partial sums by \{S_n\}. A sufficient condition for almost sure convergence (by the first Borel–Cantelli lemma) is that there exists a random variable X such that for all \epsilon > 0, \sum_n P(|S_n - X| > \epsilon) < \infty. The Markov inequality then gives \sum_n P(|S_n - X| > \epsilon) \leq \sum_n E(|S_n - X|)/\epsilon, so it would be enough to show that this last sum converges. But I can't find any way to do this.

It's easy to show that \sum_n X_n converges in the L^2 sense, so it also converges in the L^1 sense, which means that \lim_{n\to\infty} E(|S_n - X|) = 0. But this isn't strong enough to imply that \sum_n E(|S_n - X|) < \infty, which is what I need.
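
(For concreteness, the L^2 step is just orthogonality of the increments: for m < n, E\big((S_n - S_m)^2\big) = \sum_{k=m+1}^{n} E(X_k^2), which tends to 0 as m, n \to \infty because \sum_n E(X_n^2) < \infty, so \{S_n\} is Cauchy in L^2.)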

Can anyone help me with this?
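
For reference, the standard way to close this gap replaces the Markov bound with Kolmogorov's maximal inequality: for independent, mean-zero X_k with finite second moments,

P\Big(\max_{m < k \leq n} |S_k - S_m| > \epsilon\Big) \leq \frac{1}{\epsilon^2} \sum_{k=m+1}^{n} E(X_k^2).

Letting n \to \infty and then m \to \infty makes the right-hand side vanish, since it is a tail of the convergent series \sum_n E(X_n^2), so \{S_n\} is almost surely a Cauchy sequence and therefore converges almost surely. This is the usual route to Kolmogorov's one-series theorem.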
 
Hey logarithmic.

Just for clarification: is the thing you are converging to a random variable? (I'm guessing it is.)

If this is the case, then if you show that the moments converge to particular values, the distribution represented by those moments corresponds to the resulting random variable.

If they are independent, then the MGF of the sum is the product of all the individual MGFs, and if this defines a valid, unique random variable, then you're done. As long as the limit is a valid random variable, that is what matters.
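
(The identity being used here: for independent X and Y, M_{X+Y}(t) = E(e^{t(X+Y)}) = E(e^{tX})E(e^{tY}) = M_X(t)M_Y(t), so for a sum of independent terms M_{S_n}(t) = \prod_{k=1}^{n} M_{X_k}(t), whenever these MGFs exist.)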

You will not, in general, get a sum of random variables like that to converge to some single value (the way you would for, say, a consistent estimator of a parameter or a mean).

The other thing is that the expectation alone is not an indicator of whether the value converges. For example, a Cauchy distribution doesn't have a finite mean, but there are definitely intervals of values to which it assigns non-zero probability.

At a more abstract level, you could even show, through an analysis argument, that if the MGF is bounded by some function with an exponent of the nth power (for n random variables), and this shows that a unique PDF exists in which only finite values carry non-zero probability, then you have shown that the random variable converges for all possible probabilities.

This kind of convergence is about convergence of probabilities, and there are many ways to show it, but effectively they all come down to showing that a particular distribution exists.

It might also be useful to consider that the variance of the final random variable is finite and how this affects the nature of the final distribution.
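
(Indeed, by independence, \operatorname{Var}\big(\sum_{n=1}^{N} X_n\big) = \sum_{n=1}^{N} E(X_n^2) \leq \sum_n E(X_n^2) < \infty, so the partial sums have uniformly bounded variance.)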
 
Yes, X is a random variable, not a number. I'm not sure if looking at the MGF will help though, as the only results I know relating MGFs and moments to convergence (e.g. Lévy's continuity theorem, and the theorem that if E(X_n^p) \to E(X^p) for all p, then we have convergence in distribution) are about convergence in distribution, not almost sure convergence.

Is there some result that relates MGFs or moments to almost sure convergence?
 
logarithmic said:
Yes, X is a random variable, not a number. I'm not sure if looking at the MGF will help though, as the only results I know relating MGFs and moments to convergence (e.g. Lévy's continuity theorem, and the theorem that if E(X_n^p) \to E(X^p) for all p, then we have convergence in distribution) are about convergence in distribution, not almost sure convergence.

Is there some result that relates MGFs or moments to almost sure convergence?

If you show the moments converge, you show the PDF converges: the MGF is unique for a distribution, just as the Laplace or Fourier transform of a valid function is unique to that function.
 
chiro said:
If you show the moments converge, you show the PDF converges: the MGF is unique for a distribution, just as the Laplace or Fourier transform of a valid function is unique to that function.
If the PDF converges, that's convergence in distribution, which is weaker than almost sure convergence.
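
(For reference, the two modes being contrasted: S_n \to X almost surely means P\big(\lim_{n\to\infty} S_n = X\big) = 1, i.e. the partial sums converge pointwise on an event of probability one, whereas S_n \to X in distribution only means P(S_n \leq x) \to P(X \leq x) at every continuity point x of the limiting CDF, which says nothing about individual sample paths.)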
 
logarithmic said:
If the PDF converges, that's convergence in distribution, which is weaker than almost sure convergence.

Ok, sorry: what's the definition of almost sure convergence? It kinda sounds made up (i.e. converging "almost surely" is not what I'd expect a mathematician to say), and I know it's not your definition either ;).
 
I don't know how that can be weaker: proving the case of equality is even stronger than proving the limiting case.

Besides, with Lévy's theorem that you stated, if you can use that then you're done, since all the moments of a distribution uniquely define the distribution. Proving that all the moments converge to particular values means you have proven that the distribution is the one represented by the definition and values of those moments.

If you want to prove the above, you use the definition of the MGF and show that if the moments exist and are finite, then the MGF exists and is valid; and if that's valid, then you can get a unique PDF from a Fourier transform representing the distribution given by those moments, which is what Lévy's theorem proved.
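
For anyone who wants to see the claimed behaviour numerically, here is a minimal simulation sketch. It assumes the illustrative choice X_n = s_n/n with random signs s_n = \pm 1, so that E(X_n) = 0 and \sum_n E(X_n^2) = \sum_n 1/n^2 < \infty; the script and its names are made up for this example, not taken from the thread.

import numpy as np

# Illustrative sketch (assumed example): X_n = s_n / n with s_n = +/-1
# equally likely, so E(X_n) = 0 and sum E(X_n^2) = sum 1/n^2 < infinity.
# Each sample path of the partial sums S_n should settle to a (random) limit.
rng = np.random.default_rng(0)
n_terms = 100_000

for path in range(5):
    signs = rng.choice([-1.0, 1.0], size=n_terms)
    partial_sums = np.cumsum(signs / np.arange(1, n_terms + 1))
    # Once n is large the partial sums barely move: compare S_n at
    # n = 10^4 and n = 10^5 along the same path.
    print(f"path {path}: S_10000 = {partial_sums[9999]:+.4f}, "
          f"S_100000 = {partial_sums[-1]:+.4f}")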
 
