Intuitive Difference Between Weak And Strong Convergence in Probability

SUMMARY

The discussion centers on the differences between weak convergence in probability and strong (almost sure) convergence. Weak convergence is defined with the limit taken outside the probability, expressed as \lim_{n \rightarrow \infty} \mathbb{P}\left\{|X_{n}-X|<\epsilon\right\}=1, while strong convergence has the limit inside the probability, represented as \mathbb{P}\left\{\lim_{n \rightarrow \infty} X_{n}=X\right\}=1. The key distinction is that weak convergence still allows deviations larger than epsilon to occur infinitely often, whereas strong convergence guarantees that, with probability 1, such deviations eventually stop occurring. An illustrative example involves an archer's practice, where weak convergence can occur without strong convergence.

PREREQUISITES
  • Understanding of probability theory concepts, specifically convergence types.
  • Familiarity with limit notation in probability, such as \mathbb{P} and \lim.
  • Knowledge of probability density functions (pdf) and their implications in convergence.
  • Basic understanding of sample spaces and events in probability.
NEXT STEPS
  • Study the formal definitions of weak and strong convergence in probability theory.
  • Explore examples of weak convergence that do not exhibit strong convergence.
  • Learn about the implications of convergence types in statistical inference and hypothesis testing.
  • Investigate the role of probability density functions in defining convergence behaviors.
USEFUL FOR

Mathematicians, statisticians, and students of probability theory who seek a deeper understanding of convergence concepts and their applications in statistical analysis.

IniquiTrance
I've seen numerous rigorous/conceptual explanations of the difference between convergence in probability (weak convergence) and strong, almost sure convergence.

One explanation my prof gave was that convergence in probability entails:

\lim_{n \rightarrow \infty} \mathbb{P}\left\{|X_{n}-X|<\epsilon\right\}=1

or: \lim_{n \rightarrow \infty} \mathbb{P}\left\{\omega:|X_{n}(\omega)-X(\omega)|>\epsilon\right\}=0

While strong convergence means:

\mathbb{P}\left\{\lim_{n \rightarrow \infty} X_{n}=X\right\}=1

So my prof explains that one difference is that the limit is taken outside the probability for convergence in probability, while it is inside the probability for almost sure convergence.

Can anyone elaborate on this?

Also, the limits in both forms of convergence seem to imply the same thing.

How is it that saying the probability that |X_{n}-X| exceeds some epsilon goes to 0 for large n still allows that difference to jump above epsilon infinitely many times?

And if it does, how is it that saying the probability of the difference staying below epsilon goes to 1 as n goes to infinity implies that it CAN'T jump above epsilon EVER after some n? (And is thus a somehow stronger form of convergence.)
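One standard way to see the gap between the two definitions is the "typewriter" (sliding bump) sequence on the sample space [0, 1) with uniform probability. The sketch below (function names and the sampling setup are my own, not from the thread) checks numerically that the deviation probability shrinks to 0 while every single sample point keeps getting hit by the bump forever:

```python
import numpy as np

# The "typewriter" sequence on Omega = [0, 1) with uniform probability:
# write n = 2^k + j (0 <= j < 2^k) and define
#     X_n(omega) = 1  if omega lies in [j/2^k, (j+1)/2^k),  else 0.
# For 0 < eps < 1, P(|X_n - 0| > eps) = 2^{-k} -> 0, so X_n -> 0 in
# probability.  But every omega falls inside the bump exactly once per
# "generation" k, so X_n(omega) = 1 infinitely often and the sequence
# X_n(omega) converges for NO omega at all.

def X(n, omega):
    k = int(np.floor(np.log2(n)))   # generation: n = 2^k + j
    j = n - 2**k                     # position of the bump within the generation
    return 1.0 if j / 2**k <= omega < (j + 1) / 2**k else 0.0

rng = np.random.default_rng(0)
omegas = rng.uniform(0, 1, size=10_000)

# Weak convergence: the probability of a deviation shrinks with n.
for n in (4, 64, 1024):
    p = np.mean([X(n, w) for w in omegas])   # empirical P(|X_n| > eps)
    print(f"n={n:5d}  P(|X_n| > eps) ~ {p:.4f}")

# No strong convergence: fix one omega and watch the path keep jumping.
w = omegas[0]
hits = [n for n in range(1, 2048) if X(n, w) == 1.0]
print("indices with X_n(omega)=1:", hits[:6], "... (one per generation)")
```

This answers the question directly: the weak definition only constrains each fixed n separately, so the "bad" indices can thin out without ever stopping.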

Thanks!
 
IniquiTrance said:
... So my prof explains one difference is the limit is taken outside the probability for convergence in probability, while it is inside the probability for almost sure convergence.

Can anyone elaborate on this?
Thanks!

My intuitive understanding is that strong convergence is analogous to sampling real numbers from the interval [0,1], say by Dedekind cuts. Under the uniform distribution, every individual real number has probability zero of being 'drawn', yet the probabilities integrate to 1 over [0,1]. Even though the probability of any single real number is zero, a real number is always chosen by a Dedekind cut on the interval.

Weak convergence is analogous to the probability density of an event under a continuous distribution with unbounded support. Such a distribution cannot be uniform. Given that some events have a nonzero probability density, then for every event there can be an event with a smaller nonzero probability density. Note that the closed interval [0,1] is finitely bounded by, and includes, 0 and 1, while the support of the Gaussian pdf is not finitely bounded.

BTW, my intuition based on this example may be too restrictive or even wrong, in which case I'm sure someone will jump in and correct me. I responded because your post had gone unanswered for a while. Essentially, my understanding is that strong convergence is defined in terms of a sample space, while convergence in probability is defined in terms of a pdf.
 
IniquiTrance said:
... So my prof explains one difference is the limit is taken outside the probability for convergence in probability, while it is inside the probability for almost sure convergence.

Can anyone elaborate on this?

Also, the limits in both forms of convergence seem to imply the same thing.

One important difference is that the strong limit need not even exist when the weak one does. A neat example is given on Wikipedia with an archer doing target practice: if X_n = 1 is a hit and X_n = 0 is a miss, then the probability of missing decreases as they practice (weak convergence to X = 1), but there is always a non-zero chance of missing on any given shot, and misses keep recurring forever (no strong convergence).
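The archer example can be simulated once a concrete miss rate is assumed; the rate 1/n per shot below is my own choice, not part of the Wikipedia example. Since \sum 1/n diverges and the shots are independent, the second Borel-Cantelli lemma says misses happen infinitely often with probability 1, even though each individual shot is almost surely a hit for large n:

```python
import numpy as np

# Assumed model: independent shots with P(miss on shot n) = 1/n.
# Then P(|X_n - 1| > eps) = 1/n -> 0, so X_n -> 1 in probability.
# But sum(1/n) diverges, so (second Borel-Cantelli lemma) the archer
# misses infinitely often almost surely: no almost sure convergence.

rng = np.random.default_rng(1)
paths = 4_000                                  # simulated archers

for N in (100, 1000, 4000):
    n = np.arange(N, 2 * N)                    # shots in the window [N, 2N)
    misses = rng.random((paths, len(n))) < 1.0 / n
    # Weak convergence: any individual late shot almost never misses ...
    print(f"N={N:5d}  P(miss on shot {N}) ~ {misses[:, 0].mean():.4f}")
    # ... yet about half of all paths still contain a miss somewhere in
    # the window [N, 2N), no matter how late the window starts:
    print(f"         P(some miss in [{N},{2*N})) ~ {misses.any(axis=1).mean():.3f}")
```

The window probability stays near 1 - (N-1)/(2N-1) ≈ 1/2 for every N, which is exactly the "deviations keep happening" behavior that weak convergence permits.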
 
