Gregg
Homework Statement
X, Y, (X_n)_{n>0} \text{ and } (Y_n)_{n>0} are random variables defined on a common probability space.
Show that if
X_n \xrightarrow{\text{P}} X and Y_n \xrightarrow{\text{P}} Y then X_n + Y_n \xrightarrow{\text{P}} X + Y
Homework Equations
X_n \xrightarrow{\text{P}} X means \text{Pr}(|X_n-X|>\epsilon) \to 0 \text{ as } n \to \infty, \text{ for every } \epsilon > 0
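The definition can be sanity-checked numerically. Below is a minimal Monte Carlo sketch for a hypothetical sequence X_n = X + Z/\sqrt{n} with Z standard normal (chosen purely for illustration; it is not part of the problem). Since X cancels in X_n - X, only the noise term matters, and the estimated tail probability \text{Pr}(|X_n-X|>\epsilon) should shrink as n grows.

```python
import random

def tail_prob(n, eps=0.5, trials=20000, seed=0):
    """Estimate Pr(|X_n - X| > eps) for the hypothetical sequence
    X_n = X + Z / sqrt(n), Z ~ N(0, 1).  X cancels in the difference,
    so we simulate only the noise term Z / sqrt(n)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials)
               if abs(rng.gauss(0, 1) / n**0.5) > eps)
    return hits / trials

# Tail probability at fixed eps, for increasing n:
probs = [tail_prob(n) for n in (1, 10, 100)]
print(probs)  # each entry should be smaller than the last
```

For n = 100 the noise has standard deviation 0.1, so exceeding \epsilon = 0.5 is a five-sigma event and the estimate is essentially zero.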
The Attempt at a Solution
First, let the sets A_n(\epsilon) = \{|X_n - X|<\epsilon\} and B_n(\epsilon) = \{|Y_n - Y|<\epsilon\}
The sum of the two moduli will always be less than 2\epsilon if both of the moduli are less than \epsilon but the converse is not generally true.
C_n(\epsilon)=\{|X_n-X|+|Y_n-Y|<2\epsilon\}\supseteq{A_n(\epsilon)\cap B_n(\epsilon)}
Using the triangle inequality:
|X_n + Y_n - X - Y | \le |X_n-X|+|Y_n-Y|
D_n(\epsilon) =\{|X_n+Y_n-X-Y|<2\epsilon\} \supseteq C_n(\epsilon)
The one faulty step in the intended chain is \text{Pr}(A_n\cap B_n) \ge \text{Pr}(A_n): intersecting with B_n can only shrink the event, so that inequality runs the wrong way. Instead, bound the intersection from below with the union bound on the complements:
\text{Pr}(D_n) \ge \text{Pr}(C_n) \ge \text{Pr}(A_n\cap B_n) \ge 1 - \text{Pr}(A_n^c) - \text{Pr}(B_n^c) \to 1 \text{ as } n\to\infty
since \text{Pr}(A_n^c)\to 0 and \text{Pr}(B_n^c)\to 0 by hypothesis. Equivalently,
\text{Pr}(D_n^c) = \text{Pr}(|X_n+Y_n-(X+Y)|\ge 2\epsilon) \to 0 \text{ as } n\to \infty
and since \epsilon > 0 was arbitrary this is exactly X_n + Y_n \xrightarrow{\text{P}} X + Y.
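The conclusion can also be checked empirically. The sketch below uses hypothetical sequences X_n = X + U/\sqrt{n} and Y_n = Y + V/\sqrt{n} with independent standard normals U, V (again illustrative, not from the problem); X and Y cancel in the difference, and the estimated tail probability of the sum should go to zero.

```python
import random

def sum_tail_prob(n, eps=0.5, trials=20000, seed=1):
    """Estimate Pr(|(X_n + Y_n) - (X + Y)| > eps) for the hypothetical
    sequences X_n = X + U/sqrt(n), Y_n = Y + V/sqrt(n), U, V ~ N(0, 1).
    X and Y cancel, so only the two noise terms are simulated."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        diff = (rng.gauss(0, 1) + rng.gauss(0, 1)) / n**0.5
        if abs(diff) > eps:
            hits += 1
    return hits / trials

# The tail probability of the summed error shrinks with n:
tail = [sum_tail_prob(n) for n in (1, 10, 100)]
print(tail)
```

This mirrors the union-bound argument: each individual tail probability vanishes, so the tail of the sum does too.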