
Convergence of random variables.

  1. Jul 6, 2013 #1
    1. The problem statement, all variables and given/known data
    Given a sequence of independent random variables [itex]{X_n}[/itex], each with distribution Exp(1). Show that [itex]Y_n = \displaystyle\frac{X_n}{\log(n)}[/itex] with [itex]n \geq 2[/itex] converges to 0 in probability but does not converge to 0 almost surely.


    2. Relevant equations
    Density for each [itex]X_n[/itex] given by [itex]f(x_n) = e^{-x_n}[/itex] if [itex]x_n \geq 0[/itex] and 0 otherwise.


    3. The attempt at a solution
    Since [itex]\displaystyle\frac{e^{-x_n}}{\log(n)}[/itex] tends to 0 as [itex]n \rightarrow +\infty[/itex], given [itex]\epsilon > 0[/itex], there is an [itex]N>0[/itex] such that if [itex]n>N[/itex] we have [itex]\displaystyle\frac{e^{-x_n}}{\log(n)} < \epsilon[/itex]. This implies that [itex]\displaystyle\lim_{n \to{+}\infty}{} P\{ |Y_n| < \epsilon \} = 1[/itex].
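    (One way to make the in-probability claim precise, as a sketch using only the Exp(1) tail: for any fixed [itex]\epsilon > 0[/itex],

    [tex] P\{ |Y_n| > \epsilon \} = P\{ X_n > \epsilon \log(n) \} = e^{-\epsilon \log(n)} = n^{-\epsilon} \rightarrow 0, [/tex]

    which is exactly convergence of [itex]Y_n[/itex] to 0 in probability.)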

    Now, what about almost sure convergence?
    I have to prove that [itex]P \{\displaystyle\lim_{n \to{+}\infty}{} {Y_n} = 0 \} \neq 1[/itex], but it seems to me that since [itex]\displaystyle\lim_{n \to{+}\infty}{} {Y_n} = 0[/itex], it will follow that [itex]P \{\displaystyle\lim_{n \to{+}\infty}{} {Y_n} = 0 \} = 1[/itex].
     
  3. Jul 6, 2013 #2

    Ray Vickson

    Science Advisor
    Homework Helper

    Your notation is unfortunate, and confuses the issues. The random variables are ##X_n##, but their possible values are just ##x##; that is, we speak of ##F_n(x) = P\{ X_n \leq x \},## etc., where the ##x## has no ##n##-subscript. The point is that convergence in probability of ##Y_n \equiv X_n / \ln(n)## says something about how the functions ##F_n(y) = P\{ Y_n \leq y \}## behave as ##n \to \infty##; here, ##y## does not vary with ##n.## What you have managed to do is more-or-less correctly show convergence in probability.

    Your argument does NOT show the lack of almost-sure convergence. What you need to do is show that the event
    [tex] E = \{ \lim_{n \to \infty} Y_n = 0\} [/tex] satisfies ##P(E) < 1.## Alternatively, you can try to show that ##P(E^c) > 0##, where ##E^c## is the complement of ##E.## What you did was improperly remove the arguments inside the limit when you wrote ##\lim_{n \to \infty} Y_n = 0##; this is meaningless as written, because there are several possible definitions of "##\lim##" (convergence in probability, mean-square convergence, ##L^1## convergence, a.s. convergence, etc.) and you have given no indication of which one you intend.
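    A minimal numerical sketch of why almost sure convergence fails (the threshold ##\epsilon = 0.5## and the sample size are arbitrary choices for illustration): since ##P\{Y_n > \epsilon\} = n^{-\epsilon}## and the ##X_n## are independent, these probabilities sum to infinity for ##\epsilon \le 1##, so by the second Borel-Cantelli lemma ##Y_n > \epsilon## happens infinitely often almost surely. A simulated path keeps producing exceedances no matter how far out you look:

    ```python
    import numpy as np

    # Simulate one path of Y_n = X_n / log(n), with X_n i.i.d. Exp(1).
    rng = np.random.default_rng(0)
    N = 100_000
    n = np.arange(2, N + 1)
    x = rng.exponential(scale=1.0, size=n.size)  # X_n ~ Exp(1)
    y = x / np.log(n)                            # Y_n = X_n / log(n)

    # P(Y_n > eps) = n**(-eps); for eps <= 1 these probabilities sum
    # to infinity, and by the second Borel-Cantelli lemma (which needs
    # the independence of the X_n) the event {Y_n > eps} occurs
    # infinitely often with probability 1.
    eps = 0.5
    exceed = np.flatnonzero(y > eps)
    print("number of exceedances of eps:", exceed.size)
    print("largest n with Y_n > eps:", n[exceed[-1]])
    ```

    On a typical run the exceedances never die out: their expected count up to ##N## is roughly ##\sum_{n \le N} n^{-1/2} \approx 2\sqrt{N}##, and the largest exceedance index sits near the end of the simulated range.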
     