Convergence in measure vs. almost sure convergence

  • Context: Graduate
  • Thread starter: cappadonza
  • Tags: convergence, measure
SUMMARY

This discussion clarifies the distinction between convergence in probability and almost sure convergence of sequences of random variables. Almost sure convergence means the sequence converges pointwise everywhere except on a set of measure zero, while convergence in probability concerns only the measure of the set of points where the sequence differs from the limit. Almost sure convergence implies convergence in probability, but the converse is false, as demonstrated through counterexamples. Key references include formal definitions and examples from probability theory and real analysis.

PREREQUISITES
  • Understanding of random variables and their properties
  • Familiarity with concepts of convergence in real analysis
  • Knowledge of measure theory and probability measures
  • Basic grasp of sequences and limits in mathematical analysis
NEXT STEPS
  • Study the formal definition of convergence in probability as outlined in probability theory
  • Explore the implications of almost sure convergence for random variables
  • Review counterexamples that illustrate the differences between the two types of convergence
  • Investigate the relationship between convergence in measure and convergence in probability
USEFUL FOR

Mathematicians, statisticians, and students of probability theory seeking to deepen their understanding of convergence concepts in random variables.

cappadonza
Hi all,
I am struggling to see the difference between convergence in probability and almost sure convergence of a sequence of random variables. From what I can see, almost sure convergence of a sequence of random variables is very similar to pointwise convergence in real analysis.

I am struggling to see why almost sure convergence is different from convergence in probability, and more so why almost sure convergence implies convergence in probability whereas the converse is not true. I have seen some counterexamples but still don't grasp the concept.

Am I confusing myself by thinking that convergence in probability vs. almost sure convergence is analogous to pointwise convergence vs. uniform convergence of a sequence of functions?

Sorry if this question is not clear; I'm kind of lost here.

Any pointers would be much appreciated.
 
cappadonza said:
Hi all, I am struggling to see why almost sure convergence is different from convergence in probability, and more so why almost sure convergence implies convergence in probability whereas the converse is not true. I have seen some counterexamples but still don't grasp the concept.

In real analysis, convergence "almost everywhere" means convergence at every point except on a set of measure zero.

In probability theory, "almost surely" takes the randomness into account: for a long sequence of realizations of some random variable X over a population P, the sample mean of X fails to converge to the population mean of P with probability 0. This does not mean the event is impossible; it just happens so rarely that no non-zero probability can be assigned to its occurrence.
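In symbols, the usual textbook formalization (stated here just for reference) is that X_n converges to X almost surely iff

P\left(\left\{\omega\in\Omega \,\middle|\, \lim_{n\rightarrow\infty} X_{n}(\omega)=X(\omega)\right\}\right)=1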

Here's a more formal treatment of the subject:

http://www.stat.tamu.edu/~suhasini/teaching673/asymptotics.pdf
 
Thanks
I understand almost sure convergence; I still don't understand convergence in probability.
 
cappadonza said:
Thanks
I understand almost sure convergence; I still don't understand convergence in probability.

Sure convergence (pointwise convergence at every point of the sample space), as opposed to almost sure convergence, is not used much in probability theory, although it can be defined:

\{\omega\in\Omega \mid \lim_{n\rightarrow\infty} X_{n}(\omega)=X(\omega)\}=\Omega

Since it is defined pointwise over the whole sample space \Omega, sets of probability zero play no role here.

http://en.wikipedia.org/wiki/Convergence_of_random_variables
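For comparison, the standard definition of convergence in probability (the usual textbook statement, added here for reference) is: X_n converges to X in probability iff for every \epsilon>0,

\lim_{n\rightarrow\infty} P\left(|X_{n}-X|>\epsilon\right)=0

which constrains only the probability of the deviation set at each n, not the behavior of individual sample paths \omega.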
 
I suppose you know the definition of convergence in probability (or, in general, in measure) of a sequence {f_n} of functions:

\{f_n\} converges in measure to f if, for any \epsilon>0 and \delta>0, \mu\left\{|f_n-f|>\delta\right\}<\epsilon for n large enough.

So attention is not focused on any single point: all that matters is the measure of the set of points where f_n differs significantly from f, even if the sequence does not converge anywhere!

For example, consider a sequence of rectangular pulse functions whose supports have measure (probability) 1/n, each pulse placed randomly inside [0,1]. The sequence does not converge at any single point (the randomness implies that any given point lies in the support of infinitely many pulses), but the measure (probability) of the set where f_n differs from the zero function is 1/n, which goes to zero. So the sequence converges in probability to 0.
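As a sanity check, here is a minimal numerical sketch of this pulse example in Python (the uniform random placement, the seed, and the helper names are my own illustrative assumptions, not from the post):

import numpy as np

# Sketch of the moving-pulse example (hypothetical helper names; the
# uniform random placement and the seed are illustrative assumptions).
# f_n is the indicator function of an interval of length 1/n placed
# uniformly at random inside [0, 1].

rng = np.random.default_rng(0)

def pulse_support(n):
    """Endpoints (left, right) of the n-th pulse's support, length 1/n."""
    left = rng.uniform(0.0, 1.0 - 1.0 / n)
    return left, left + 1.0 / n

N = 10_000
x0 = 0.5   # a fixed point to watch
hits = 0   # number of pulses whose support covers x0

for n in range(1, N + 1):
    a, b = pulse_support(n)
    # mu{f_n != 0} = 1/n -> 0: this is exactly convergence in
    # measure (probability) of f_n to the zero function.
    if a <= x0 <= b:
        hits += 1

# P(pulse n covers x0) is roughly 1/n, and sum(1/n) diverges, so x0 is
# covered infinitely often: f_n(x0) keeps jumping between 0 and 1 and
# does not converge pointwise.
print(f"measure of support of f_{N}: {1.0 / N:.6f}")
print(f"pulses covering x0 = {x0}: {hits}  (log N ~ {np.log(N):.1f})")

Running this shows the support measure shrinking like 1/n while the fixed point keeps being hit (roughly log N times), which is exactly the gap between convergence in measure and pointwise convergence.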

Keep in mind: for convergence in probability, all that matters is the measure (probability) of the set of points where the sequence differs from the limit, not the points themselves.
 
