Convergence of Random Variables in Probability

In summary, Chebyshev's inequality states that the probability of a random variable lying at least k standard deviations away from its mean is bounded by 1/k². Here it can be used to prove that Yn/n converges in probability to p: for any fixed tolerance, the probability that Yn/n differs from p by at least that amount is bounded by a quantity that approaches 0 as n approaches infinity. The same approach shows that 1 - Yn/n converges in probability to 1 - p, and that (Yn/n)(1 - Yn/n) converges in probability to p(1-p).
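As a sketch of that argument for part (a), added here as an aside (it uses the standard binomial facts E[Yn] = np and Var(Yn) = np(1-p)): for any [tex]\varepsilon > 0[/tex],

[tex]P\left(\left|\frac{Y_n}{n} - p\right| \ge \varepsilon\right) = P\left(|Y_n - np| \ge n\varepsilon\right) \le \frac{\operatorname{Var}(Y_n)}{(n\varepsilon)^2} = \frac{np(1-p)}{n^2\varepsilon^2} = \frac{p(1-p)}{n\varepsilon^2} \to 0 \quad \text{as } n \to \infty.[/tex]

Parts (b) and (c) then reduce to the same bound, since |(1 - Yn/n) - (1 - p)| = |Yn/n - p|, and |(Yn/n)(1 - Yn/n) - p(1-p)| is bounded by |Yn/n - p| itself (because Yn/n and p both lie in [0,1]).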
  • #1
cse63146

Homework Statement



Let the random variable Yn have the distribution b(n,p).

a)Prove that Yn/n converges in probability p.

b) Prove that 1 - Yn/n converges in probability to 1 - p.

c) Prove that (Yn/n)(1 - Yn/n) converges in probability to p(1-p).

Homework Equations





The Attempt at a Solution



Note: lim denotes the limit as n approaches infinity.

a) lim Yn/n = lim Yn * lim 1/n.

But lim 1/n = 0, which would give lim Yn/n = 0, yet it's supposed to converge to p.

Where did I make the mistake?
 
  • #2
You can't take the limit of the 1/n part independently of the Yn part.

That is like saying that 1 goes to 0 as n goes to infinity, since

[tex]1=\frac{n}{n} = n \cdot \frac{1}{n}[/tex]

and 1/n goes to 0!
 
  • #3
a)Prove that Yn/n converges in probability p.

I assume you mean Yn/n converges in probability to p.

What is your defn of "converges in probability"? Are you supposed to show that for each [tex]\varepsilon>0[/tex],

[tex]P(|Y_n/n - p|\ge \varepsilon)\to 0[/tex] as [tex]n\to\infty[/tex]?

Have you learned Chebyshev's inequality?
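As an illustration of that definition (not part of the original thread), here is a minimal simulation sketch, assuming NumPy is available; the function name and the values of p, eps, and the sample sizes are illustrative only:

[code]
import numpy as np

def tail_prob(n, p=0.3, eps=0.05, trials=20000, seed=0):
    """Estimate P(|Y_n/n - p| >= eps) for Y_n ~ b(n, p) by simulation."""
    rng = np.random.default_rng(seed)
    y = rng.binomial(n, p, size=trials)        # 'trials' independent draws of Y_n
    return np.mean(np.abs(y / n - p) >= eps)   # fraction of draws outside the eps-band

# The estimated tail probability should shrink toward 0 as n grows,
# which is what convergence in probability of Y_n/n to p predicts.
for n in (10, 100, 1000, 10000):
    print(n, tail_prob(n))
[/code]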
 
  • #4
So I tried doing it with Yn, and I tried "splitting" the limits up:

[tex]\lim \frac{n!}{n!} \cdot \lim \left((n-x)!\right)^{-1} \cdot \lim p^x \cdot \lim (1-p)^n \cdot \lim (1-p)^{-x} \cdot \lim n^{-1}[/tex]

lim n!/n! = 1 so:

[tex]\lim \left((n-x)!\right)^{-1} \cdot \lim p^x \cdot \lim (1-p)^n \cdot \lim (1-p)^{-x} \cdot \lim n^{-1}[/tex]

Still stuck though.

Didn't see your message.

Chebyshev's inequality: [tex]P(|X - \mu |\geq k \sigma) \leq 1/k^2[/tex]

would mu be np (because this is a binomial distribution)?
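(For reference, and not part of the original exchange: for [tex]Y_n \sim b(n,p)[/tex] the standard moments are [tex]\mu = E[Y_n] = np[/tex] and [tex]\sigma^2 = \operatorname{Var}(Y_n) = np(1-p)[/tex], so that [tex]E[Y_n/n] = p[/tex] and [tex]\operatorname{Var}(Y_n/n) = p(1-p)/n[/tex].)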
 
  • #5
cse63146 said:
So I tried doing it with Yn, and I tried "splitting" the limits up:

[tex]\lim \frac{n!}{n!} \cdot \lim \left((n-x)!\right)^{-1} \cdot \lim p^x \cdot \lim (1-p)^n \cdot \lim (1-p)^{-x} \cdot \lim n^{-1}[/tex]

lim n!/n! = 1 so:

[tex]\lim \left((n-x)!\right)^{-1} \cdot \lim p^x \cdot \lim (1-p)^n \cdot \lim (1-p)^{-x} \cdot \lim n^{-1}[/tex]

Still stuck though.

This is not even close to the correct method. See my post above (which got posted while you were writing).
 
  • #6
[tex]P(|\frac{Y_n}{n} - np |\geq p \sqrt{np(1-p)}) \leq \frac{1}{p^2}[/tex]
 

Related to Convergence of Random Variables in Probability

1. What is the concept of convergence of probability?

The convergence of probability is a fundamental concept in probability theory and statistics. It refers to the idea that as the number of trials or observations increases, the empirical probability of an event approaches the theoretical probability of that event.

2. Why is convergence of probability important?

Convergence of probability is important because it allows us to make more accurate predictions and inferences based on data. It also provides a way to test the validity of statistical models and assumptions.

3. What are the different types of convergence of probability?

There are several types of convergence of probability, including almost sure convergence, convergence in probability, and convergence in distribution. These types differ in strength: almost sure convergence implies convergence in probability, which in turn implies convergence in distribution.
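For concreteness (standard definitions, added as an aside), for random variables [tex]X_n[/tex] and [tex]X[/tex]:

[tex]X_n \xrightarrow{\text{a.s.}} X \iff P\left(\lim_{n\to\infty} X_n = X\right) = 1,[/tex]

[tex]X_n \xrightarrow{P} X \iff P(|X_n - X| \ge \varepsilon) \to 0 \text{ for every } \varepsilon > 0,[/tex]

[tex]X_n \xrightarrow{d} X \iff F_{X_n}(x) \to F_X(x) \text{ at every continuity point } x \text{ of } F_X.[/tex]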

4. How is convergence of probability related to the law of large numbers?

The law of large numbers is a key principle in probability theory that states that as the number of trials or observations increases, the sample mean of a random variable will converge to its true mean. This is closely related to the concept of convergence of probability, as it shows how the empirical probability of an event will converge to its theoretical probability with more data.
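Stated formally (a standard formulation, added as an aside): if [tex]X_1, X_2, \ldots[/tex] are independent and identically distributed with mean [tex]\mu[/tex], then the weak law of large numbers says

[tex]\bar{X}_n = \frac{1}{n}\sum_{i=1}^n X_i \xrightarrow{P} \mu \quad \text{as } n \to \infty.[/tex]

In the homework above, Yn/n is exactly such a sample mean of n independent Bernoulli(p) indicators, so part (a) is the weak law of large numbers for Bernoulli trials.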

5. What are some applications of convergence of probability?

Convergence of probability has various applications in fields such as statistics, finance, and machine learning. It is used to analyze and make predictions based on large data sets, assess the performance of statistical models, and evaluate the risk of financial investments.
