Convergence of random variables.

trash

Homework Statement


Given a sequence of independent random variables ##\{X_n\}##, each with distribution ##\text{Exp}(1)##, show that ##Y_n = \dfrac{X_n}{\log(n)}## for ##n \geq 2## converges to ##0## in probability but does not converge to ##0## almost surely.


Homework Equations


Density for each ##X_n## given by ##f(x_n) = e^{-x_n}## if ##x_n \geq 0## and ##0## otherwise.


The Attempt at a Solution


Since ##\dfrac{e^{-x_n}}{\log(n)}## tends to ##0## as ##n \rightarrow +\infty##, given ##\epsilon > 0## there is an ##N>0## such that if ##n>N## we have ##\dfrac{e^{-x_n}}{\log(n)} < \epsilon##. This implies that ##\displaystyle\lim_{n \to +\infty} P\{ |Y_n| < \epsilon \} = 1##.

Now, what about almost-sure convergence?
I have to prove that ##P\{\displaystyle\lim_{n \to +\infty} Y_n = 0\} \neq 1##, but it seems to me that since ##\displaystyle\lim_{n \to +\infty} Y_n = 0##, it will follow that ##P\{\displaystyle\lim_{n \to +\infty} Y_n = 0\} = 1##.
 
Your notation is unfortunate, and confuses the issues. The random variables are ##X_n##, but their possible values are just ##x##; that is, we speak of ##F_n(x) = P\{ X_n \leq x \},## etc., where ##x## carries no ##n##-subscript. The point is that convergence in probability of ##Y_n \equiv X_n / \ln(n)## says something about how the functions ##F_n(y) = P\{ Y_n \leq y \}## behave as ##n \to \infty##; here ##y## does not vary with ##n##. What you have managed to do is more or less correctly show convergence in probability.
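For the record, the in-probability part can be made precise in one line from the exponential tail (this is just the standard computation from the given density): for fixed ##\epsilon > 0##,
$$P\{|Y_n| > \epsilon\} = P\{X_n > \epsilon \log n\} = e^{-\epsilon \log n} = n^{-\epsilon} \to 0 \quad \text{as } n \to \infty.$$
Note that ##\epsilon## is held fixed and the randomness is integrated out; no ##x_n## appears anywhere.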

Your argument does NOT show the lack of almost-sure convergence. What you need to do is show that the event
##E = \{ \lim_{n \to \infty} Y_n = 0\}## satisfies ##P(E) < 1##. Alternatively, you can try to show that ##P(E^c) > 0##, where ##E^c## is the complement of ##E##. What you did was improperly remove the arguments inside the limit when you wrote ##\lim_{n \to \infty} Y_n = 0##; this is meaningless as written, because there are several possible meanings of "##\lim##" (convergence in probability, mean-square convergence, ##L^1## convergence, a.s. convergence, etc.) and you have given no indication of which one you intend.
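To build intuition for why almost-sure convergence fails, here is a short simulation sketch (my own illustration, not part of the exercise; the function name `exceedance_prob` is mine and it assumes numpy is available). Since ##P\{Y_n > \epsilon\} = n^{-\epsilon}## and the ##X_n## are independent, the second Borel–Cantelli lemma gives that ##\{Y_n > \epsilon\}## occurs infinitely often a.s. whenever ##\epsilon \leq 1## (because ##\sum_n n^{-\epsilon} = \infty## there), so ##Y_n## cannot tend to ##0## a.s. The sketch just checks the tail identity numerically:

```python
import numpy as np

def exceedance_prob(n, eps, samples=100_000, seed=0):
    """Monte Carlo estimate of P{ X_n / log(n) > eps } for X_n ~ Exp(1)."""
    rng = np.random.default_rng(seed)
    x = rng.exponential(scale=1.0, size=samples)  # i.i.d. Exp(1) draws
    return float(np.mean(x / np.log(n) > eps))

# Theory: P{Y_n > eps} = exp(-eps * log n) = n**(-eps).
n, eps = 100, 0.5
print(exceedance_prob(n, eps))  # should be close to 100**(-0.5) = 0.1
print(n ** (-eps))
```

Because ##\sum_n n^{-\epsilon}## diverges for ##\epsilon \leq 1##, these exceedance probabilities, though individually tending to ##0## (which is exactly convergence in probability), keep producing exceedances forever along the sequence; that is the gap between the two modes of convergence.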
 