Undergrad Convergence in Probability

SUMMARY

The discussion establishes that for a sequence of real random variables ##\{X_n\}_{n = 1}^\infty## defined on a probability space ##(\Omega, \mathscr{F},\mathbb{P})##, if the expected values converge to a limit ##\mu## and the variances tend to zero, then the sequence converges to ##\mu## in probability. The proof leverages Chebyshev's inequality: the shrinking variance forces ##X_n## to concentrate near its mean, and the mean in turn approaches ##\mu##.

PREREQUISITES
  • Understanding of real random variables
  • Familiarity with probability spaces and measures
  • Knowledge of expected value and variance concepts
  • Working knowledge of Chebyshev's inequality
NEXT STEPS
  • Study the proof of convergence in probability using Chebyshev's inequality
  • Explore the implications of the Weak Law of Large Numbers
  • Investigate the relationship between convergence in probability and almost sure convergence
  • Learn about different modes of convergence in probability theory
USEFUL FOR

Mathematicians, statisticians, and students studying probability theory, particularly those interested in the convergence properties of random variables.

Euge
Prove that if ##\{X_n\}_{n = 1}^\infty## is a sequence of real random variables on a probability space ##(\Omega, \mathscr{F},\mathbb{P})## such that ##\lim_n \mathbb{E}[X_n] = \mu## and ##\lim_n \operatorname{Var}[X_n] = 0##, then ##X_n## converges to ##\mu## in probability.
 
We're going to use
https://en.m.wikipedia.org/wiki/Chebyshev's_inequality
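As a quick sanity check (not part of the proof), here's a numerical illustration of Chebyshev's bound ##\mathbb{P}(|X - \mathbb{E}[X]| \ge k\sigma) \le 1/k^2##. The exponential distribution here is just a hypothetical example; the bound holds for any distribution with finite variance.

```python
import numpy as np

# Empirically compare tail probabilities against the Chebyshev bound 1/k^2.
# Example distribution (hypothetical choice): exponential with mean 1, sd 1.
rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=1_000_000)
mean, sigma = x.mean(), x.std()

results = []
for k in [2.0, 3.0, 5.0]:
    empirical = np.mean(np.abs(x - mean) >= k * sigma)  # estimated tail mass
    bound = 1.0 / k**2                                  # Chebyshev's bound
    results.append((k, empirical, bound))
    print(f"k={k}: empirical={empirical:.4f}  Chebyshev bound={bound:.4f}")
```

For the exponential the true tails decay like ##e^{-k}##, so the empirical values sit well below the ##1/k^2## bound, as expected.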

For any ##m##, there exists ##N## such that for ##n > N## we have both ##|\mathbb{E}[X_n] - \mu| < 1/(2m)## and ##\operatorname{Var}[X_n] < 1/m^3##. One subtlety: Chebyshev's inequality controls deviations from ##\mathbb{E}[X_n]##, not from ##\mu##, so split off the mean first. By the triangle inequality, ##|X_n - \mu| > 1/m## forces ##|X_n - \mathbb{E}[X_n]| > 1/(2m)##, and then Chebyshev gives, for ##n > N##,
$$\mathbb{P}(|X_n - \mu| > 1/m) \le \mathbb{P}\big(|X_n - \mathbb{E}[X_n]| > \tfrac{1}{2m}\big) \le 4m^2 \operatorname{Var}[X_n] < \frac{4}{m}.$$

That's pretty much it, since ##m## is arbitrary.

Spelled out: for any ##\epsilon > 0##, choose ##m## with ##1/m < \epsilon##; then for ##n## large enough, ##\mathbb{P}(|X_n - \mu| > \epsilon) \le \mathbb{P}(|X_n - \mu| > 1/m) < 4/m##. Since ##m## can be taken as large as we like, ##\lim_n \mathbb{P}(|X_n - \mu| > \epsilon) = 0## for every ##\epsilon > 0##, which is exactly convergence to ##\mu## in probability.
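The theorem can also be seen numerically. The sketch below uses a hypothetical sequence ##X_n \sim \mathcal{N}(\mu + 1/n,\ 1/n)##, chosen so that ##\mathbb{E}[X_n] \to \mu## and ##\operatorname{Var}[X_n] = 1/n \to 0##, and estimates ##\mathbb{P}(|X_n - \mu| > \epsilon)## by simulation:

```python
import numpy as np

# Hypothetical sequence: X_n ~ Normal(mu + 1/n, variance 1/n), so that
# E[X_n] -> mu and Var[X_n] -> 0 as n -> infinity.
rng = np.random.default_rng(0)
mu, eps = 3.0, 0.1

probs = []
for n in [10, 100, 1000, 10_000]:
    # Draw many independent copies of X_n and estimate P(|X_n - mu| > eps).
    samples = rng.normal(loc=mu + 1.0 / n, scale=1.0 / np.sqrt(n), size=200_000)
    probs.append(np.mean(np.abs(samples - mu) > eps))
    print(f"n={n}: estimated P(|X_n - mu| > {eps}) = {probs[-1]:.4f}")
```

The estimated probabilities shrink toward zero as ##n## grows, matching the conclusion of the proof.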
 
