# Convergence in distribution

1. Sep 24, 2008

### shan

Given the definition:
For real-valued random variables $$X_n$$, $$n\geq1$$, and $$X$$, we say $$X_n\stackrel{D}{\rightarrow}X$$ if for every bounded continuous function g: $$R \rightarrow R$$, $$E[g(X_n)]\rightarrow E[g(X)]$$

I want to prove the continuous mapping theorem:
If $$X_n\stackrel{D}{\rightarrow}X$$, then $$h(X_n)\stackrel{D}{\rightarrow}h(X)$$ for any continuous function h: $$R \rightarrow R$$,
without using Skorokhod's representation theorem.

The theorem makes sense to me intuitively but I'm lost as to how to prove it mathematically.
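As a concrete illustration of the definition (my own example, not from the thread): take $$X_n$$ uniform on $$\{1/n, 2/n, \dots, n/n\}$$, so $$X_n\stackrel{D}{\rightarrow}U$$ with $$U \sim$$ Uniform(0,1). For the bounded continuous test function $$g = \cos$$, $$E[g(X_n)]$$ is a Riemann sum that converges to $$E[g(U)] = \int_0^1 \cos x\,dx = \sin 1$$:

```python
import math

def E_g(n, g):
    """E[g(X_n)] for X_n uniform on {1/n, 2/n, ..., n/n} (a Riemann sum)."""
    return sum(g(k / n) for k in range(1, n + 1)) / n

# g = cos is bounded and continuous on R; the limit law is Uniform(0, 1).
for n in (10, 100, 1000):
    print(n, E_g(n, math.cos))
print("E[cos(U)] = sin(1) =", math.sin(1))
```

The printed values approach sin(1), exactly as the definition demands for this particular g.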


Last edited: Sep 24, 2008
2. Sep 28, 2008

### shan

If anyone was interested:

Say $$h(X_n) = Z_n$$ and $$h(X) = Z$$.

To show $$Z_n\stackrel{D}{\rightarrow}Z$$, we need $$E[g(Z_n)] \rightarrow E[g(Z)]$$ for every bounded continuous g (from the definition).

So fix such a g and let $$f = g \circ h$$. Then f is continuous (a composition of continuous functions) and bounded (since $$|g \circ h| \leq \sup|g| < \infty$$; note h itself need not be bounded).

Because $$X_n\stackrel{D}{\rightarrow}X$$, applying the definition to f gives $$E[f(X_n)] \rightarrow E[f(X)]$$, i.e. $$E[g(h(X_n))] \rightarrow E[g(h(X))]$$. Since g was an arbitrary bounded continuous function, $$h(X_n)\stackrel{D}{\rightarrow}h(X)$$.
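The mapping theorem can also be checked numerically (again an illustrative sketch with my own choices of $$X_n$$, h, and g): with $$X_n$$ uniform on $$\{1/n, \dots, n/n\}$$, $$h(x) = x^2$$, and $$g = \cos$$, the quantity $$E[g(h(X_n))]$$ should approach $$E[g(h(U))] = \int_0^1 \cos(x^2)\,dx$$ for $$U \sim$$ Uniform(0,1), approximated below by a very fine Riemann sum:

```python
import math

def E_gh(n, g, h):
    """E[g(h(X_n))] for X_n uniform on {1/n, 2/n, ..., n/n}."""
    return sum(g(h(k / n)) for k in range(1, n + 1)) / n

def h(x):
    return x * x  # continuous (need not be bounded)

g = math.cos      # bounded and continuous, as the definition requires

# Proxy for E[g(h(U))] = integral of cos(x^2) over [0, 1]:
target = E_gh(10**6, g, h)
for n in (10, 100, 1000):
    print(n, E_gh(n, g, h), "error:", E_gh(n, g, h) - target)
```

The error shrinks as n grows, consistent with $$h(X_n)\stackrel{D}{\rightarrow}h(X)$$ tested against this particular g.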
