I was reading some proofs about the convergence of random variables, and here are a few points I couldn't figure out...

1) Let X_n be a sequence of random variables, and let X_{n_k} be a subsequence of it. If X_n converges in probability to X, then X_{n_k} also converges in probability to X. WHY?
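Here is the reduction to real sequences that I think settles (1), though I'd appreciate confirmation:

```latex
\text{Fix } \varepsilon > 0 \text{ and let } a_n = P\big(|X_n - X| > \varepsilon\big).
\text{ By hypothesis } a_n \to 0, \text{ and } (a_{n_k})_k \text{ is a subsequence of the real sequence } (a_n).
\text{Every subsequence of a convergent real sequence converges to the same limit, so}
\quad a_{n_k} = P\big(|X_{n_k} - X| > \varepsilon\big) \to 0.
```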

2) I was looking at a theorem: if E(Y) < ∞, then Y < ∞ almost surely. Now I am puzzled by the notation. What does it MEAN to say that Y = ∞ or Y < ∞?

For example, if Y is a Poisson random variable, then the possible values are 0, 1, 2, ... (there is no upper bound). Is it correct to say that Y = ∞ in this case?
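My current understanding of (2), which I'd like checked: Y = ∞ refers to the value of Y at individual outcomes (Y is allowed to be extended-real-valued), and the Poisson case is not an example of it:

```latex
\{Y = \infty\} = \{\omega \in \Omega : Y(\omega) = \infty\}, \qquad
\text{"}Y < \infty \text{ a.s."} \iff P(Y = \infty) = 0.
\text{If } P(Y = \infty) > 0, \text{ then } E(Y) \ge \infty \cdot P(Y = \infty) = \infty,
\text{ contradicting } E(Y) < \infty.
\text{A Poisson } Y \text{ takes only the finite values } 0, 1, 2, \dots,
\text{ so } Y < \infty \text{ surely, even though its range is unbounded.}
```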

3) If X_n^4 converges to 0 almost surely, then is it true to say that X_n also converges to 0 almost surely? Why or why not?
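For (3), I think the following pointwise argument works, using continuity of the fourth root:

```latex
\text{Let } A = \{\omega : X_n^4(\omega) \to 0\}, \text{ so } P(A) = 1 \text{ by assumption.}
\text{For each } \omega \in A, \quad |X_n(\omega)| = \big(X_n^4(\omega)\big)^{1/4} \to 0
\text{ by continuity of } t \mapsto t^{1/4} \text{ on } [0, \infty).
\text{Hence } X_n \to 0 \text{ on the same event } A, \text{ i.e. almost surely.}
```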

4) The moment generating function (mgf) determines the distribution uniquely, so we can use the mgf to find the distributions of random variables. If the mgf already does the job, what is the point of introducing the "characteristic function"?
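One reason I've seen mentioned for (4): the characteristic function E[e^{itX}] exists for every distribution, since |e^{itX}| = 1, while the mgf E[e^{tX}] can be infinite for all t ≠ 0, e.g. for the Cauchy distribution. A quick numerical sketch of this (standard library only; the sample size, seed, and t = 1 are my own choices, and the known characteristic function of the standard Cauchy is e^{-|t|}):

```python
import math
import random

random.seed(0)
n = 200_000

# Standard Cauchy samples via the inverse-CDF transform tan(pi*(U - 1/2)).
samples = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(n)]

# The mgf E[e^{tX}] is infinite for every t != 0: the tails are so heavy
# that some samples are enormous, and exp of them would overflow.
print(max(abs(s) for s in samples))  # typically huge (thousands or more)

# The characteristic function phi(t) = E[e^{itX}] always exists because
# |e^{itX}| = 1.  For the standard Cauchy, phi(t) = exp(-|t|).
t = 1.0
phi_hat = sum(math.cos(t * s) for s in samples) / n  # real part; imag part ~ 0
print(phi_hat, math.exp(-abs(t)))  # these two should agree closely
```

The empirical average of cos(tX) converges nicely because cos is bounded, which is exactly the robustness the characteristic function buys over the mgf.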

Can someone please explain?

Any help is much appreciated! :)

**Physics Forums - The Fusion of Science and Community**


# Convergence of random variables
