MHB Two normal independent random variables

AI Thread Summary
The discussion focuses on demonstrating that the sum and difference of two independent normal random variables, X and Y, are independent without using Jacobian transformation. The joint moment generating functions (MGFs) of X+Y and X-Y are analyzed, revealing issues with earlier calculations, particularly regarding the variance of X-Y. A suggestion is made to approach the problem using covariance, noting that for normal variables, zero covariance implies independence. The corrected joint MGF for X+Y is confirmed to be e^(2μt + σ²t²). The conversation emphasizes the importance of accurate MGF calculations and the relationship between covariance and independence in normal distributions.
WMDhamnekar
Let X and Y be independent normal random variables each having parameters $\mu$ and $\sigma^2$. I want to show that X+Y is independent of X-Y without using the Jacobian transformation.

Hint given by the author: Find their joint moment generating functions.

Answer: Now the joint MGF of $X+Y = e^{2\mu t}+\sigma^2 t^2$ and of $X-Y = 1$. So the joint MGF of $X+Y+X-Y$ is $e^{2\mu t}+ \sigma^2 t^2$. This indicates they are independent. Is there any other method in advanced calculus?
Dhamnekar Winod said:
Let X and Y be independent normal random variables each having parameters $\mu$ and $\sigma^2$. I want to show that X+Y is independent of X-Y without using the Jacobian transformation.

Hint given by the author: Find their joint moment generating functions.

Answer: Now the joint MGF of $X+Y = e^{2\mu t}+\sigma^2 t^2$ and of $X-Y = 1$. So the joint MGF of $X+Y+X-Y$ is $e^{2\mu t}+ \sigma^2 t^2$. This indicates they are independent. Is there any other method in advanced calculus?

There are some problems with your use of MGFs. The $\sigma$ should be inside the exponential function for $X+Y$, though you seem to have done this for $X-Y$. But that has another issue: $X-Y$ is zero-mean but still normal, yet a zero-mean random variable has zero variance iff it is identically zero (almost surely). Your MGF implies zero variance, yet I can use a coin-tossing argument to show there is at least a 25% chance that $X-Y \gt 0$ and at least a 25% chance that $X-Y \lt 0$, which is a contradiction: the event $\{X \gt \mu,\ Y \lt \mu\}$ has probability $1/4$ by independence and forces $X-Y \gt 0$, and symmetrically for $X-Y \lt 0$. (It's actually 50:50, but lower-bounding each at 25% is a much easier argument and yields the same contradiction.)
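[Editor's note: the 25% lower bound above can be illustrated numerically. This is a sketch, not code from the thread; NumPy and the particular $\mu$, $\sigma$ values are assumptions.]

```python
import numpy as np

# Sketch: for independent X, Y ~ N(mu, sigma^2), the event
# {X > mu and Y < mu} has probability 1/4 and forces X - Y > 0,
# giving the 25% lower bound (the true probability is 1/2).
rng = np.random.default_rng(0)
mu, sigma, n = 1.5, 2.0, 1_000_000  # assumed illustrative values
X = rng.normal(mu, sigma, size=n)
Y = rng.normal(mu, sigma, size=n)

print(np.mean((X > mu) & (Y < mu)))  # lower-bound event, close to 0.25
print(np.mean(X - Y > 0))            # actual probability, close to 0.5
```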

Why not consider tackling this head on by looking at covariance?

$A: = X+Y$
$B: = X-Y$

where $A$ and $B$ are normal random variables.

In general, independence implies zero covariance but not the other way around; however, zero covariance does imply independence for jointly normal random variables, and $A$ and $B$ are jointly normal here because they are linear combinations of the independent normals $X$ and $Y$.

$\text{Cov}(A,B) $
$= \text{Cov}(X+Y, X-Y) = E\big[(X+Y)(X-Y)\big] - E\big[(X+Y)\big]E\big[(X-Y)\big] = E\big[X^2 -Y^2\big] - E\big[X+Y\big]E\big[X-Y\big]$
$ = E\big[X^2\big] -E\big[Y^2\big] - E\big[X+Y\big]\cdot 0 = 0$

(The fact that $X$ and $Y$ have the same first moment and variance implies their second moments are equal, and that
$E\big[X - Y\big] = E\big[X\big] - E\big[Y\big] = 0$.)
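[Editor's note: a quick Monte Carlo sanity check of the covariance computation above. This is an illustrative sketch, not part of the thread; the parameter values are assumptions.]

```python
import numpy as np

# Sketch: the sample covariance of A = X + Y and B = X - Y should be
# near zero for independent X, Y ~ N(mu, sigma^2) with equal parameters.
rng = np.random.default_rng(0)
mu, sigma, n = 1.5, 2.0, 1_000_000  # assumed illustrative values
X = rng.normal(mu, sigma, size=n)
Y = rng.normal(mu, sigma, size=n)
A, B = X + Y, X - Y

sample_cov = np.cov(A, B)[0, 1]
print(sample_cov)  # close to 0, up to sampling noise on the order of 1e-2
```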
steep said:
There are some problems with your use of MGFs. The $\sigma$ should be inside the exponential function for $X+Y$, though you seem to have done this for $X-Y$. But that has another issue: $X-Y$ is zero-mean but still normal, yet a zero-mean random variable has zero variance iff it is identically zero (almost surely). Your MGF implies zero variance, yet I can use a coin-tossing argument to show there is at least a 25% chance that $X-Y \gt 0$ and at least a 25% chance that $X-Y \lt 0$, which is a contradiction: the event $\{X \gt \mu,\ Y \lt \mu\}$ has probability $1/4$ by independence and forces $X-Y \gt 0$, and symmetrically for $X-Y \lt 0$. (It's actually 50:50, but lower-bounding each at 25% is a much easier argument and yields the same contradiction.)

Why not consider tackling this head on by looking at covariance?

$A: = X+Y$
$B: = X-Y$

where $A$ and $B$ are normal random variables.

In general, independence implies zero covariance but not the other way around; however, zero covariance does imply independence for jointly normal random variables, and $A$ and $B$ are jointly normal here because they are linear combinations of the independent normals $X$ and $Y$.

$\text{Cov}(A,B) $
$= \text{Cov}(X+Y, X-Y) = E\big[(X+Y)(X-Y)\big] - E\big[(X+Y)\big]E\big[(X-Y)\big] = E\big[X^2 -Y^2\big] - E\big[X+Y\big]E\big[X-Y\big]$
$ = E\big[X^2\big] -E\big[Y^2\big] - E\big[X+Y\big]\cdot 0 = 0$

(The fact that $X$ and $Y$ have the same first moment and variance implies their second moments are equal, and that
$E\big[X - Y\big] = E\big[X\big] - E\big[Y\big] = 0$.)
You are correct. It was a typo. The MGF of $X+Y$ is $e^{2\mu t + \sigma^2 t^2}$, i.e. the $\sigma^2 t^2$ term belongs inside the exponential.
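[Editor's note: to close the loop on the author's hint, the joint MGF argument can be checked symbolically. This is a sketch using SymPy, not code from the thread. For independent $X, Y \sim N(\mu, \sigma^2)$, $E[e^{t_1(X+Y)+t_2(X-Y)}] = M_X(t_1+t_2)\,M_Y(t_1-t_2)$, and the exponent splits into a pure $t_1$ part and a pure $t_2$ part, i.e. the joint MGF factors into the marginal MGFs of $X+Y$ and $X-Y$, which proves independence.]

```python
import sympy as sp

mu, sigma, t1, t2 = sp.symbols('mu sigma t1 t2', real=True)

def log_mgf(s):
    """Log-MGF (cumulant generating function) of N(mu, sigma^2) at s."""
    return mu * s + sigma**2 * s**2 / 2

# E[exp(t1(X+Y) + t2(X-Y))] = M_X(t1 + t2) * M_Y(t1 - t2) by independence,
# so the log of the joint MGF is the sum of the two log-MGFs below.
joint = sp.expand(log_mgf(t1 + t2) + log_mgf(t1 - t2))

# Log-MGFs of the marginals: X+Y ~ N(2mu, 2sigma^2), X-Y ~ N(0, 2sigma^2)
marginal_sum = 2*mu*t1 + sigma**2 * t1**2   # log-MGF of X+Y
marginal_diff = sigma**2 * t2**2            # log-MGF of X-Y

# The difference vanishes identically, so the joint MGF factors.
print(sp.simplify(joint - (marginal_sum + marginal_diff)))  # 0
```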
 