Random Variables: Convergence in Probability?

In summary, we say that a sequence of random variables X1, X2, ... converges to a random variable X in probability if, for every ε > 0, the probability that |Xn − X| is greater than or equal to ε approaches 0 as n approaches infinity. The notation P(|Xn−X|≥ε) is shorthand for P({ω ∈ Ω: |Xn(ω)−X(ω)|≥ε}), and the definition does not depend on whether we use ≥ε or >ε.
  • #1
kingwinner
Definition: Let X1,X2,... be a sequence of random variables defined on a sample space Ω. We say that Xn converges to a random variable X in probability if for each ε>0, P(|Xn-X|≥ε)->0 as n->∞.
====================================

Now I don't really understand the meaning of |Xn-X| used in the definition. Is |.| here the usual absolute value when we talk about real numbers? But Xn and X are functions, not real numbers.
Also, when we talk about the probability of something, that something has to be subsets of the sample space Ω, but |Xn-X|≥ε does not look like the description of a subset of Ω to me.
A random variable X is a function mapping the sample space Ω to the set of real numbers, i.e. X: Ω->R.
The random variable X is the function itself, and X(ω) are the VALUES of the function, which are real numbers. Only when we talk about X(ω) does it make sense to take the absolute value of real numbers.

So I assume the notation used in the definition above is really a shorthand for
P({ω ∈ Ω: |Xn(ω)-X(ω)|≥ε})?

In other words, the notations P(|Xn-X|≥ε) and P({ω ∈ Ω: |Xn(ω)-X(ω)|≥ε}) are interchangeable? Am I right?


I hope someone can clarify this! Thanks a lot!:smile:
 
  • #2
Hi kingwinner! :smile:

It seems you answered your own question:

So I assume the notation used in the definition above is really a shorthand for
P({ω ∈ Ω: |Xn(ω)-X(ω)|≥ε})?

In other words, the notations P(|Xn-X|≥ε) and P({ω ∈ Ω: |Xn(ω)-X(ω)|≥ε}) are interchangeable? Am I right?

You are 100% correct about this.
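To make the set shorthand concrete, here is a toy sketch of my own (not from the thread, and the names are made up): on a finite sample space the event {ω ∈ Ω : |Xn(ω) − X(ω)| ≥ ε} really is a subset of Ω, and its probability is just the total probability of the points in that subset.

```python
# Toy finite sample space: a fair six-sided die.
omega = [1, 2, 3, 4, 5, 6]
P = {w: 1/6 for w in omega}   # uniform probability measure on omega

def X(w):
    return w                  # a random variable X: Omega -> R

def Xn(w):
    return w + 0.1            # X_n, here X shifted by 1/n with n = 10

eps = 0.05

# The shorthand P(|X_n - X| >= eps) means exactly this subset of Omega:
event = {w for w in omega if abs(Xn(w) - X(w)) >= eps}
prob = sum(P[w] for w in event)

print(event, prob)  # every w satisfies |0.1| >= 0.05, so the event is all of Omega
```

Here |Xn(ω) − X(ω)| is the ordinary absolute value of a real number, evaluated pointwise at each ω, which is exactly the point made above.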
 
  • #3
I see, thanks!
So I believe the |.| used in |X_n -X| above is just meant to be the usual absolute value for real numbers, and not some fancy metric, right?
 
  • #4
kingwinner said:
I see, thanks!
So I believe the |.| used in |X_n -X| above is just meant to be the usual absolute value for real numbers, and not some fancy metric, right?

It's just the absolute value, don't worry :smile:
 
  • #5
By the way, does it matter whether we have ≥ε OR >ε in the definition? Why or why not?
 
  • #6
kingwinner said:
By the way, does it matter whether we have ≥ε OR >ε in the definition? Why or why not?

No, it doesn't. If [itex]P(|X_n-X|>\varepsilon)[/itex] converges to zero, then so does [itex]P(|X_n-X|\geq \varepsilon)[/itex].

The reason is

[tex]P(|X_n-X|>\varepsilon)\leq P(|X_n-X|\geq \varepsilon)\leq P(|X_n-X|>\varepsilon/2)[/tex]

Since ε>0 is arbitrary: if P(|X_n-X|>ε) → 0 for every ε, then in particular P(|X_n-X|>ε/2) → 0, and the sandwich forces P(|X_n-X|≥ε) → 0 as well. The converse follows directly from the first inequality. So the two versions of the definition are equivalent.
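As a quick numerical sanity check of the sandwich inequality (my own sketch, not from the thread): the three events are nested, so the probabilities must come out in the stated order. I deliberately let |Xn − X| take the value ε with positive probability, so that > and ≥ genuinely differ.

```python
import random

random.seed(1)

# Simulate values of |X_n - X| from {0, 1, 2}, with eps = 1, so that
# the difference equals eps with positive probability.
eps = 1.0
diffs = [random.choice([0.0, 1.0, 2.0]) for _ in range(10000)]

p_gt      = sum(d > eps for d in diffs) / len(diffs)        # P(|Xn-X| >  eps)
p_ge      = sum(d >= eps for d in diffs) / len(diffs)       # P(|Xn-X| >= eps)
p_gt_half = sum(d > eps / 2 for d in diffs) / len(diffs)    # P(|Xn-X| >  eps/2)

# Nested events: {d > eps} ⊆ {d >= eps} ⊆ {d > eps/2}
assert p_gt <= p_ge <= p_gt_half
print(p_gt, p_ge, p_gt_half)
```

Note that p_gt is strictly smaller than p_ge here, which is exactly why the inequality on the left is needed: the events {> ε} and {≥ ε} can differ when |Xn − X| hits ε exactly.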
 
  • #7
Got it! Thanks!
 

1. What is a random variable?

A random variable is a function that assigns a numerical value to each possible outcome of a random experiment. It represents the uncertainty in the outcome of an experiment or observation.

2. What does it mean for a sequence of random variables to converge in probability?

A sequence of random variables X1, X2, ... converges in probability to X if, for every ε > 0, the probability that Xn differs from X by at least ε tends to 0 as n increases.

3. How is convergence in probability different from other types of convergence?

Convergence in probability differs from other modes of convergence: it is weaker than almost sure convergence, which requires Xn(ω) → X(ω) for almost every outcome ω, and stronger than convergence in distribution, which only requires the distribution functions of Xn to approach that of X.

4. What is the significance of convergence in probability in statistical analysis?

Convergence in probability is an important concept in statistical analysis because it allows us to make inferences about a population based on a sample. For example, an estimator is called consistent if it converges in probability to the parameter it estimates as the sample size grows.

5. How is convergence in probability tested or evaluated?

Convergence in probability is usually established with limit theorems such as the law of large numbers and the central limit theorem. For example, the weak law of large numbers states that the sample mean of independent, identically distributed random variables with finite mean converges in probability to that mean.
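The law of large numbers statement above can be watched happening numerically. This is a minimal simulation sketch of my own (the function name prob_deviation is made up): Xn is the mean of n fair coin flips, which converges in probability to 1/2, so the estimated P(|Xn − 1/2| ≥ ε) should shrink toward 0 as n grows.

```python
import random

random.seed(0)

def prob_deviation(n, eps, trials=2000):
    """Monte Carlo estimate of P(|X_n - 1/2| >= eps), where X_n is the
    mean of n fair coin flips.  By the weak law of large numbers,
    X_n -> 1/2 in probability, so this probability should shrink as n grows."""
    hits = 0
    for _ in range(trials):
        mean = sum(random.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) >= eps:
            hits += 1
    return hits / trials

for n in (10, 100, 1000):
    print(n, prob_deviation(n, eps=0.1))
```

For small n the sample mean is often far from 1/2, so the estimate is large; by n = 1000 a deviation of 0.1 is several standard deviations out and the estimate is essentially zero, which is the ≥ε version of the definition in action.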
