Convergence of random variables.

SUMMARY

The discussion centers on the convergence properties of a sequence of independent random variables {X_n} with distribution Exp(1). It is established that Y_n = X_n / log(n) converges to 0 in probability as n approaches infinity, but does not converge almost surely to 0. The density function for each X_n is given by f(x_n) = e^{-x_n} for x_n ≥ 0. The key argument involves demonstrating that the probability P{lim_{n→∞} Y_n = 0} is not equal to 1, indicating the absence of almost sure convergence.

PREREQUISITES
  • Understanding of independent random variables
  • Familiarity with the exponential distribution, specifically Exp(1)
  • Knowledge of convergence concepts in probability theory
  • Proficiency in limit notation and probability measures
NEXT STEPS
  • Study the properties of convergence in probability versus almost sure convergence
  • Learn about the law of large numbers and its implications for random variables
  • Explore the concept of probability measures and their complements
  • Investigate the behavior of sequences of random variables and their distributions
USEFUL FOR

Mathematicians, statisticians, and students studying probability theory, particularly those interested in the convergence properties of random variables and their applications in statistical analysis.

trash

Homework Statement


Given a sequence of independent random variables ##\{X_n\}##, each with distribution Exp(1), show that ##Y_n = \frac{X_n}{\log(n)}## for ##n \geq 2## converges to 0 in probability but does not converge almost surely to 0.


Homework Equations


Density for each ##X_n## given by ##f(x_n) = e^{-x_n}## if ##x_n \geq 0## and 0 otherwise.
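As a quick sanity check on this density (not part of the original post; the sample size and the test point ##x = 1.5## are arbitrary choices), the Exp(1) tail ##P(X > x) = e^{-x}## can be verified empirically:

```python
import math
import random

# Sketch: compare the empirical tail of Exp(1) samples against the
# exact tail P(X > x) = e^{-x} obtained by integrating the density.
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(200_000)]

x = 1.5
empirical_tail = sum(s > x for s in samples) / len(samples)
exact_tail = math.exp(-x)  # integral of e^{-t} dt from x to infinity
print(empirical_tail, exact_tail)
```

With 200,000 samples the empirical tail agrees with ##e^{-1.5} \approx 0.223## to about two decimal places.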


The Attempt at a Solution


Since ##\frac{e^{-x_n}}{\log(n)}## tends to 0 as ##n \rightarrow +\infty##, given ##\epsilon > 0## there's an ##N>0## such that if ##n>N## we have ##\frac{e^{-x_n}}{\log(n)} < \epsilon##. This implies that ##\lim_{n \to +\infty} P\{ |Y_n| < \epsilon \} = 1##.

Now, what about almost sure convergence?
I have to prove that ##P\{\lim_{n \to +\infty} Y_n = 0\} \neq 1##, but it seems to me that since ##\lim_{n \to +\infty} Y_n = 0##, it will follow that ##P\{\lim_{n \to +\infty} Y_n = 0\} = 1##.
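One way to make the in-probability claim concrete (this computation is not in the original post, but follows directly from the stated density): for ##\epsilon > 0##, ##P(|Y_n| > \epsilon) = P(X_n > \epsilon \log n) = e^{-\epsilon \log n} = n^{-\epsilon}##, which tends to 0. A minimal numeric check:

```python
import math

# Not from the original post: for eps > 0,
#   P(|Y_n| > eps) = P(X_n > eps*log n) = exp(-eps*log n) = n**(-eps),
# which tends to 0 as n grows -- this is convergence in probability.
eps = 0.5
for n in (10, 100, 10_000, 10**8):
    tail = math.exp(-eps * math.log(n))
    assert math.isclose(tail, n ** (-eps))  # the two forms agree
    print(n, tail)
```

The printed tails shrink toward 0, e.g. ##n = 10^8## gives ##10^{-4}## for ##\epsilon = 0.5##.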
 
trash said:

Your notation is unfortunate, and confuses the issues. The random variables are ##X_n##, but their possible values are just ##x##; that is, we speak of ##F_n(x) = P\{ X_n \leq x \}##, etc., where the ##x## has no ##n##-subscript. The point is that convergence in probability of ##Y_n \equiv X_n / \ln(n)## says something about how the functions ##F_n(y) = P\{ Y_n \leq y \}## behave as ##n \to \infty##; here, ##y## does not vary with ##n##. What you have managed to do is more or less correctly show convergence in probability.

Your argument does NOT show the lack of almost-sure convergence. What you need to do is show that the event ##E = \{ \lim_{n \to \infty} Y_n = 0\}## satisfies ##P(E) < 1##. Alternatively, you can try to show that ##P(E^c) > 0##, where ##E^c## is the complement of ##E##. What you did was improperly drop the arguments inside the limit when you wrote ##\lim_{n \to \infty} Y_n = 0##; this is meaningless as written, because there are several possible definitions of "##\lim##" (convergence in probability, mean-square convergence, ##L^1## convergence, a.s. convergence, etc.) and you have given no indication of which one you intend.
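The standard route to ##P(E) < 1## (this is one approach, not stated in the thread): for ##\epsilon = 1##, ##P(Y_n > 1) = 1/n##, and ##\sum_n 1/n## diverges, so by the second Borel-Cantelli lemma (which applies because the ##X_n## are independent) the event ##\{Y_n > 1\}## occurs for infinitely many ##n## almost surely, so ##Y_n \not\to 0## a.s. A simulated path illustrates both facts; the seed and the cutoff ##N## are arbitrary:

```python
import math
import random

# Sketch: P(Y_n > 1) = 1/n, and sum(1/n) diverges, so by the second
# Borel-Cantelli lemma (using independence of the X_n) {Y_n > 1}
# happens infinitely often almost surely -- no a.s. convergence to 0.
random.seed(42)
N = 100_000
exceedances = sum(
    random.expovariate(1.0) / math.log(n) > 1.0 for n in range(2, N)
)
harmonic_tail = sum(1.0 / n for n in range(2, N))  # grows like log(N)
print(exceedances, harmonic_tail)
```

Even deep into the sequence the path keeps crossing 1: the expected number of exceedances up to ##N## is the harmonic-type sum, about ##\log N##, which is unbounded in ##N##.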
 
