Almost sure convergence & convergence in probability

In summary, "Almost sure convergence" always implies "convergence in probability", but the converse is NOT true. This can be seen by considering a sequence of independent random variables Y_n such that P(Y_n=0)=1-1/n and P(Y_n=1)=1/n. Then Y_n converges in probability to 0, but by Borel-Cantelli's lemma, since ∑ 1/n = ∞ (diverges), Y_n does NOT converge almost surely to 0. Thus, there exists a sequence of random variables Y_n such that Y_n->0 in probability, but Y_n does not converge to 0 almost surely. One possible example of this is a game where the nth
  • #1
kingwinner

Homework Statement


"Almost sure convergence" always implies "convergence in probability", but the converse is NOT true.

Thus, there exists a sequence of random variables Yn such that Yn->0 in probability, but Yn does not converge to 0 almost surely.

Homework Equations


N/A

The Attempt at a Solution


I think this is possible if the Y's are independent, but I still can't think of a concrete example. What is an example of this happening?

Any help is appreciated! :)
 
  • #2
Let X_n be a sequence of independent random variables such that
P(X_n=0)=1-1/n and P(X_n=1)=1/n

Then X_n converges in probability to 0, since for every ε > 0 we have P(|X_n| > ε) ≤ 1/n → 0.

By the Borel-Cantelli lemma, since

∑_{n=1}^∞ 1/n = ∞ (diverges),

X_n does NOT converge almost surely to 0.
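As a quick numerical sanity check (my own sketch, not part of the original argument), the independence of the X_n gives an exact product formula for the probability that another 1 shows up in a block of indices, and that probability does not vanish even though P(X_n = 1) = 1/n does:

```python
# Exact check for the example: X_n independent with P(X_n = 1) = 1/n.
# Convergence in probability: P(|X_n - 0| > eps) <= 1/n -> 0.
# Failure of a.s. convergence: the chance of seeing at least one 1
# among X_N, ..., X_M is 1 - prod_{n=N}^{M} (1 - 1/n) = 1 - (N-1)/M,
# which does NOT vanish: with M = 2N it stays near 1/2 for every N.

def prob_one_in_block(N: int, M: int) -> float:
    """P(at least one X_n = 1 for N <= n <= M), by independence."""
    p_none = 1.0
    for n in range(N, M + 1):
        p_none *= 1.0 - 1.0 / n
    return 1.0 - p_none

for N in (10, 100, 1000):
    print(f"P(X_{N} = 1) = {1.0 / N}",
          f"P(some 1 in [{N}, {2 * N}]) = {prob_one_in_block(N, 2 * N):.4f}")
```

So no matter how far out you go, each sample path keeps picking up new 1's with probability bounded away from 0, which is exactly what "does not converge almost surely" looks like concretely.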


Borel-Cantelli lemma:
Let A_1, A_2, A_3, ... be events.
(i) If ∑ P(A_n) < ∞, then P(A_n i.o.) = 0.
(ii) If the A_n are independent and ∑ P(A_n) = ∞, then P(A_n i.o.) = 1.
Here P(A_n i.o.) ("infinitely often") is the probability that infinitely many of the A_n occur.

I don't understand the argument above: why does the Borel-Cantelli lemma imply that X_n does NOT converge almost surely to 0? Are we using part (i) or part (ii) of the lemma?

Can someone please explain in more detail??
Thank you! :)
 
  • #3
If you want an example, think about playing a game where the nth time you play you win 1 dollar with probability 1/n. If you play an infinite number of times, the expected number of wins is infinite. Doesn't that suggest that almost surely you win an infinite number of times? That's not a proof of anything, but is that what you are asking?
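That game is easy to simulate (a sketch of my own; `total_wins` and the round counts are illustrative choices, not from the thread). The expected number of wins after N rounds is the harmonic number H_N ≈ ln N, which diverges, and a simulated run shows the win count growing accordingly:

```python
import random

def total_wins(rounds: int) -> int:
    """Round n wins a dollar with probability 1/n, independently."""
    return sum(1 for n in range(1, rounds + 1) if random.random() < 1.0 / n)

# Expected wins after N rounds is H_N = 1 + 1/2 + ... + 1/N ~ ln N,
# which diverges -- so the win count keeps growing, however long you play.
random.seed(0)
for N in (100, 10_000, 1_000_000):
    print(N, "rounds:", total_wins(N), "wins")
```

Of course a finite simulation cannot prove "infinitely often"; that is what part (ii) of the Borel-Cantelli lemma is for. The simulation just makes the divergent harmonic growth visible.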
 

1. What is the difference between almost sure convergence and convergence in probability?

Almost sure convergence means that the sequence of random variables converges to the limiting value with probability 1: the set of outcomes on which the sequence fails to converge has probability 0. Convergence in probability is weaker: it means that for every ε > 0, the probability that Y_n differs from the limit by more than ε tends to 0 as n → ∞. The sequence may keep deviating from the limit, but such deviations become less and less likely as n grows.

2. How do you determine if a sequence of random variables converges almost surely?

A sequence of random variables converges almost surely if the set of outcomes on which it converges has probability 1. A standard tool is the Borel-Cantelli lemma: if the events A_n = {|Y_n − Y| > ε} satisfy ∑ P(A_n) < ∞, then with probability 1 only finitely many A_n occur, which forces Y_n → Y almost surely. Conversely, if the A_n are independent and ∑ P(A_n) = ∞, then infinitely many A_n occur with probability 1, ruling out almost sure convergence.
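The convergent case of the lemma can be illustrated numerically (my own sketch; the choice P(A_n) = 1/n² is a hypothetical example of a convergent series, not from the thread). The union bound gives P(some A_n occurs for n > N) ≤ ∑_{n>N} P(A_n), and when the full series converges this tail vanishes, which is exactly why part (i) forces only finitely many events:

```python
def tail_bound(N: int, terms: int = 10**6) -> float:
    """Union bound on P(some A_n occurs for n > N) when P(A_n) = 1/n^2
    (a convergent series, so part (i) of the lemma applies)."""
    return sum(1.0 / n**2 for n in range(N + 1, N + 1 + terms))

# The full series sums to pi^2/6 < infinity, so the tail vanishes:
# past a large enough index, with high probability no further A_n
# occurs at all -- hence only finitely many events happen (part (i)).
for N in (1, 10, 100, 1000):
    print(N, tail_bound(N))
```

With the divergent choice P(A_n) = 1/n from this thread, the same tail sum is infinite and the bound says nothing, which is why part (ii) needs the extra independence hypothesis.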

3. What is the relationship between almost sure convergence and convergence in probability?

Almost sure convergence is a stronger form of convergence than convergence in probability. This means that if a sequence of random variables converges almost surely, then it also converges in probability. However, the converse is not necessarily true: a sequence can converge in probability but not almost surely when the individual deviations become rarer and rarer, yet still occur infinitely often along each sample path, as in the example discussed in this thread.

4. Why is almost sure convergence useful in statistical analysis?

Almost sure convergence is useful in statistical analysis because it guarantees (for example, via the strong law of large numbers) that estimates computed from a sample converge to the true value with probability 1 as the sample size grows, making them reliable for inference and prediction.

5. Can a sequence of random variables converge both almost surely and in probability?

Yes. Since almost sure convergence implies convergence in probability, any sequence that converges almost surely automatically converges in probability as well; the sample means in the strong law of large numbers are a standard example. The converse can fail, as the example in this thread shows.
