Convergence of random variables.

  • #1
trash
Homework Statement


Given a sequence of independent random variables [itex]\{X_n\}[/itex], each with distribution Exp(1). Show that [itex]Y_n = \displaystyle\frac{X_n}{\log(n)}[/itex], [itex]n \geq 2[/itex], converges to 0 in probability but does not converge to 0 almost surely.


Homework Equations


The density of each [itex]X_n[/itex] is given by [itex]f(x_n) = e^{-x_n}[/itex] if [itex]x_n \geq 0[/itex], and 0 otherwise.
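Equivalently, and in the form used for tail probabilities: [itex]P\{X_n > t\} = e^{-t}[/itex] for [itex]t \geq 0[/itex].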


The Attempt at a Solution


Since [itex]\displaystyle\frac{e^{-x_n}}{\log(n)}[/itex] tends to 0 as [itex]n \rightarrow +\infty[/itex], given [itex]\epsilon > 0[/itex] there is an [itex]N>0[/itex] such that [itex]\displaystyle\frac{e^{-x_n}}{\log(n)} < \epsilon[/itex] whenever [itex]n>N[/itex]. This implies that [itex]\displaystyle\lim_{n \to +\infty} P\{ |Y_n| < \epsilon \} = 1[/itex].

Now, what about almost sure convergence?
I have to prove that [itex]P\{\displaystyle\lim_{n \to +\infty} Y_n = 0\} \neq 1[/itex], but it seems to me that, since [itex]\displaystyle\lim_{n \to +\infty} Y_n = 0[/itex], it will follow that [itex]P\{\displaystyle\lim_{n \to +\infty} Y_n = 0\} = 1[/itex].
 

Answers and Replies

  • #2
Ray Vickson
Science Advisor
Homework Helper


Your notation is unfortunate and confuses the issues. The random variables are ##X_n##, but their possible values are just ##x##; that is, we speak of ##F_n(x) = P\{ X_n \leq x \},## etc., where the ##x## has no ##n##-subscript. The point is that convergence in probability of ##Y_n \equiv X_n / \ln(n)## says something about how the functions ##G_n(y) = P\{ Y_n \leq y \}## behave as ##n \to \infty##; here ##y## does not vary with ##n.## What you have managed to do is more or less correctly show convergence in probability.
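Explicitly, here is a cleaned-up version of that step (a sketch, using only the Exp(1) tail): for fixed ##\epsilon > 0##,
[tex] P\{ |Y_n| \geq \epsilon \} = P\{ X_n \geq \epsilon \ln(n) \} = e^{-\epsilon \ln(n)} = n^{-\epsilon} \to 0 \quad \text{as } n \to \infty, [/tex]
which is precisely the statement that ##Y_n \to 0## in probability.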

Your argument does NOT show the lack of almost-sure convergence. What you need to do is show that the event
[tex] E = \{ \lim_{n \to \infty} Y_n = 0\} [/tex]
satisfies ##P(E) < 1.## Alternatively, you can try to show that ##P(E^c) > 0##, where ##E^c## is the complement of ##E.## What you did was improperly drop the qualifiers from the limit when you wrote ##\lim_{n \to \infty} Y_n = 0##; this is meaningless as written, because there are several possible meanings of "##\lim##" (convergence in probability, mean-square convergence, ##L^1## convergence, a.s. convergence, etc.), and you have given no indication of which one you intend.
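One standard route (a sketch; the use of the second Borel–Cantelli lemma here is my suggestion, though the independence hypothesis points to it): the same tail computation gives ##P\{ Y_n > \epsilon \} = n^{-\epsilon}##, so for any fixed ##0 < \epsilon < 1##
[tex] \sum_{n \geq 2} P\{ Y_n > \epsilon \} = \sum_{n \geq 2} n^{-\epsilon} = \infty. [/tex]
Since the ##X_n## are independent, the events ##\{ Y_n > \epsilon \}## are independent, and the second Borel–Cantelli lemma gives ##P\{ Y_n > \epsilon \text{ infinitely often} \} = 1##. On that probability-one event ##Y_n \not\to 0##, so in fact ##P(E) = 0##.

For intuition only, a quick numerical sketch (not a proof; the seed, sample size N, and decade blocks are arbitrary choices):
[code]
import numpy as np

# Y_n = X_n / log(n) for independent X_n ~ Exp(1), n = 2, ..., N.
# Theory: P(Y_n > eps) = n**(-eps), so for eps < 1 the sum over n
# diverges, and by the second Borel-Cantelli lemma the exceedances
# {Y_n > eps} keep occurring, even though each P(Y_n > eps) -> 0.
rng = np.random.default_rng(0)
N = 10**6
n = np.arange(2, N + 1)                       # n = 2, ..., N
Y = rng.exponential(1.0, size=n.size) / np.log(n)

eps = 0.5
for k in range(1, 6):
    lo, hi = 10**k, 10**(k + 1)
    block = Y[(n >= lo) & (n < hi)]           # the decade lo <= n < hi
    print(f"n in [{lo}, {hi}): max Y_n = {block.max():.3f}, "
          f"count(Y_n > {eps}) = {(block > eps).sum()}")
[/code]
Each decade keeps producing values of ##Y_n## above ##\epsilon = 0.5##, consistent with the divergent sum above.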
 
