In light of the comments by Jason Swanson, I am going to update
my first post under this thread:
A random variable is not a variable; it is a function from the probability space (W) to the set of real numbers (R), and its randomness is completely described by a deterministic probability function with known parameters.
I. Convergence of probability functions (real analysis):
The probability function F: R ---> [0,1] associated with a random variable can be analyzed using the ordinary tools of real analysis. Thus, it can be stated that a sequence of probability functions converges to a limiting probability function, either pointwise (weaker) or uniformly (stronger).
I.A. Pointwise convergence (real analysis):
The sequence {F_n(x)} converges pointwise to F(x) iff for every x and every ε > 0, there is a natural number N_x such that for all n ≥ N_x, |F_n(x) − F(x)| < ε.
I.B. Uniform convergence (real analysis):
The sequence {F_n} converges uniformly to F iff for every ε > 0, there is a natural number N such that for all x and all n ≥ N, |F_n(x) − F(x)| < ε.
I.C. Implication (real analysis):
Uniform convergence implies pointwise convergence; the converse fails, as the worked example below shows.
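A minimal worked example (my illustration, not part of the original post): take F_n to be the prob. function of a point mass at −1/n and F that of a point mass at 0.

```latex
% Pointwise but not uniform convergence of probability functions (illustration).
% F_n = prob. function of a point mass at -1/n;  F = prob. function of a point mass at 0.
F_n(x) = \mathbf{1}\{x \ge -1/n\}, \qquad F(x) = \mathbf{1}\{x \ge 0\}.
% Pointwise: for x >= 0, F_n(x) = 1 = F(x) for every n; for x < 0, F_n(x) = 0 = F(x)
% as soon as n > 1/|x|.  So F_n(x) ---> F(x) at every x in R.
% Not uniform: for each n, the point x = -1/(2n) gives F_n(x) = 1 while F(x) = 0, so
\sup_{x \in \mathbb{R}} |F_n(x) - F(x)| = 1 \quad \text{for every } n.
```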
II. Convergence of random variables (probability analysis):
The convergence of random variables is a related but different concept.
II.A. Convergence in distribution (probability analysis):
The weakest concept of convergence of a r.v. is convergence in distribution. A sequence of r.v.'s {X_n} described by respective prob. functions {F_n} is said to converge in distribution to a r.v. X described by a prob. function F iff {F_n(x)} ---> F(x) (pointwise, in the real analytic sense) for all x at which F is continuous.
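The restriction to continuity points is essential, and a mirror image of the example in I.C shows why (again my illustration): let X_n be the point mass at +1/n and X the point mass at 0.

```latex
% Why discontinuity points of F must be excluded (illustration).
% X_n = point mass at 1/n;  X = point mass at 0.
F_n(x) = \mathbf{1}\{x \ge 1/n\}, \qquad F(x) = \mathbf{1}\{x \ge 0\}.
% For x < 0: F_n(x) = 0 = F(x).  For x > 0: F_n(x) = 1 = F(x) once n > 1/x.
% At the jump x = 0: F_n(0) = 0 for every n, while F(0) = 1, so F_n(0) -/-> F(0).
% Since 0 is a discontinuity point of F, it is ignored by the definition, and
% X_n ---> X in distribution; exactly as intuition demands, since the point
% masses at 1/n collapse onto 0.
```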
II.B. Convergence in probability (probability analysis):
Convergence in probability is stronger. {X_n} converges in probability to X iff P(|X_n − X| > ε) ---> 0 for every ε > 0. This is the concept of convergence used in the weak law of large numbers.
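To make this concrete, here is a small simulation sketch (the function name, sample sizes, and ε are my own choices, not from the post): for the sample mean of n fair-coin flips, the estimated P(|mean − 0.5| > ε) shrinks toward 0 as n grows, which is the weak law of large numbers in action.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def prob_large_deviation(n, eps=0.1, trials=10_000):
    """Estimate P(|mean of n fair-coin flips - 0.5| > eps) by Monte Carlo."""
    flips = rng.integers(0, 2, size=(trials, n))  # 'trials' samples of size n
    sample_means = flips.mean(axis=1)             # one sample mean per trial
    return np.mean(np.abs(sample_means - 0.5) > eps)

# The estimated probability of a large deviation shrinks as n grows,
# which is convergence in probability of the sample mean to 0.5 (WLLN).
for n in (10, 100, 1_000, 10_000):
    print(f"n = {n:>6}:  P(|mean - 0.5| > 0.1) ~ {prob_large_deviation(n):.4f}")
```

Typical output shows the estimate dropping from roughly 0.34 at n = 10 to essentially 0 at n = 10,000.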
II.C. Almost sure convergence (probability analysis):
Still stronger is almost sure convergence. {X_n} converges almost surely to X iff P(X_n ---> X) = 1, i.e., iff the set of outcomes w in W for which X_n(w) ---> X(w) has probability 1.
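An example that separates this from sure convergence below (my illustration): take W = [0,1] with the uniform probability measure and X_n(w) = w^n.

```latex
% Almost sure but not sure convergence (illustration).
% W = [0,1] with the uniform probability measure;  X_n(w) = w^n,  X(w) = 0.
% For every w in [0,1):  X_n(w) = w^n ---> 0 = X(w).
% At w = 1:              X_n(1) = 1 for every n, so convergence fails there.
P(X_n \to X) = P([0,1)) = 1.
% Hence X_n ---> X almost surely, but not surely: the single outcome w = 1 is
% an exceptional set of probability zero.
```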
II.D. Sure convergence or "convergence everywhere" (probability analysis):
The strongest concept of probabilistic convergence is sure convergence (convergence everywhere). {X_n} converges surely (or converges everywhere) to X iff {X_n(w)} ---> X(w) (pointwise, in the real analytic sense) for all w in W.
II.E. Implications (probability analysis):
1. Sure convergence implies almost sure convergence.
2. Almost sure convergence implies convergence in probability.
3. Convergence in probability implies convergence in distribution.
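None of the three implications reverses; standard counterexamples (my additions, not part of the original post) are sketched below.

```latex
% 1. Almost sure but not sure: the example X_n(w) = w^n in II.C above, which
%    fails to converge only at the probability-zero outcome w = 1.
% 2. In probability but not almost surely (the "typewriter" sequence): on
%    W = [0,1] with the uniform measure, let X_n be the indicator of the n-th
%    dyadic interval, sweeping repeatedly across [0,1].  Then for every ε in (0,1),
P(|X_n - 0| > \varepsilon) = \text{length of the } n\text{-th interval} \longrightarrow 0,
%    yet every outcome w lies in infinitely many of the intervals, so
%    X_n(w) -/-> 0 for every single w, and P(X_n ---> 0) = 0, not 1.
% 3. In distribution but not in probability: let X be standard normal and
%    X_n = -X for every n.  Each X_n has the same prob. function as X, so
%    X_n ---> X in distribution trivially, but
P(|X_n - X| > \varepsilon) = P(2|X| > \varepsilon) \not\longrightarrow 0.
```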
II.F. Concepts yet to be invented:
1. Uniform sure convergence: {X_n} converges "uniformly surely" to X iff {X_n} ---> X uniformly in the real analytic sense.
2. Uniform convergence in distribution: A sequence of r.v.'s {X_n} described by respective prob. functions {F_n} is said to "uniformly converge in distribution" to a r.v. X described by a prob. function F iff {F_n} ---> F uniformly in the real analytic sense.
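Two quick observations to ground these definitions (my sketch). First, uniform sure convergence is easy to exhibit. Second, for the uniform-in-distribution variant there is a classical result worth noting: when the limiting prob. function F is continuous, pointwise convergence of {F_n} to F is automatically uniform on R (Pólya's theorem), so the uniform variant adds content only when F has jumps, as in the point-mass examples above.

```latex
% 1. Uniform sure convergence (illustration): on any W, set X_n(w) = X(w) + 1/n.
\sup_{w \in W} |X_n(w) - X(w)| = \tfrac{1}{n} \longrightarrow 0,
% so X_n ---> X uniformly on W, i.e. "uniformly surely" (and hence also surely,
% almost surely, in probability, and in distribution).
% 2. Uniform convergence in distribution: by Polya's theorem, if F_n(x) ---> F(x)
%    pointwise and F is continuous on R, then
\sup_{x \in \mathbb{R}} |F_n(x) - F(x)| \longrightarrow 0,
% so the distinction between II.A and II.F.2 matters only for discontinuous F.
```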