Two independent normal random variables

WMDhamnekar
Let X and Y be independent normal random variables, each having parameters $\mu$ and $\sigma^2$. I want to show that X+Y is independent of X-Y without using a Jacobian transformation.

Hint given by the author: Find their joint moment generating functions.

Answer: The MGF of $X+Y$ is $e^{2\mu t}+\sigma^2 t^2$ and that of $X-Y$ is $1$, so the joint MGF of $(X+Y,\ X-Y)$ is $e^{2\mu t}+ \sigma^2 t^2$. This indicates they are independent. Is there any other method in advanced calculus?
 
Dhamnekar Winod said:
… The MGF of $X+Y$ is $e^{2\mu t}+\sigma^2 t^2$ and that of $X-Y$ is $1$. … Is there any other method in advanced calculus?

There are some problems with your use of MGFs. The sigma should be inside the exponential function for $X+Y$, though you seem to have done this for $X-Y$. But that has another issue: $X-Y$ is zero mean but still normal, yet a zero-mean random variable has zero variance iff it is identically zero (almost surely). Your MGF implies zero variance, yet I can use coin tossing to show there is at least a 25% chance of $X-Y \gt 0$ and at least a 25% chance of $X-Y \lt 0$, which is a contradiction. (It's actually 50:50, but lower-bounding this at 25% each is a much easier argument and gets the same desired contradiction.)
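In more detail, since $\mu$ is the median of each variable and $X$ and $Y$ are independent,

$$P(X-Y \gt 0) \ \ge\ P(X \gt \mu,\ Y \lt \mu) \ =\ P(X \gt \mu)\,P(Y \lt \mu) \ =\ \tfrac{1}{2}\cdot\tfrac{1}{2} \ =\ \tfrac{1}{4},$$

and by symmetry $P(X-Y \lt 0) \ge \tfrac{1}{4}$, whereas a zero-mean random variable with zero variance satisfies $P(X-Y = 0) = 1$.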

Why not consider tackling this head-on by looking at covariance?

$A := X+Y$
$B := X-Y$

where $A$ and $B$ are normal random variables.

In general, independence implies zero covariance but not the other way around. However, for jointly normal random variables, zero covariance does imply independence, and $A$ and $B$ here are jointly normal because they are linear combinations of the independent normals $X$ and $Y$.
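To see why "jointly" matters, here is a standard counterexample (using variables $Z$ and $W$ unrelated to this thread's $X$ and $Y$): take $Z \sim N(0,1)$ and an independent random sign $W$ ($\pm 1$ with probability $\tfrac{1}{2}$ each). Then $WZ$ is also $N(0,1)$ and $\text{Cov}(Z, WZ) = E[W]\,E[Z^2] = 0$, yet $Z$ and $WZ$ are dependent since $|WZ| = |Z|$; the pair $(Z, WZ)$ is not jointly normal (e.g. $Z + WZ = 0$ with probability $\tfrac{1}{2}$).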

$\text{Cov}(A,B) $
$= \text{Cov}(X+Y, X-Y) = E\big[(X+Y)(X-Y)\big] - E\big[(X+Y)\big]E\big[(X-Y)\big] = E\big[X^2 -Y^2\big] - E\big[X+Y\big]E\big[X-Y\big]$
$ = E\big[X^2\big] -E\big[Y^2\big] - E\big[X+Y\big]\cdot 0 = 0$

(Since $X$ and $Y$ have the same mean and variance, their second moments agree, i.e. $E\big[X^2\big] = E\big[Y^2\big]$, and
$E\big[X - Y\big] = E\big[X\big] - E\big[Y\big] = 0$.)
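As a quick numerical sanity check (just a sketch, with arbitrarily chosen $\mu = 1$, $\sigma = 2$, and NumPy assumed; a near-zero sample correlation is consistent with, though of course not a proof of, independence):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 1.0, 2.0, 1_000_000  # arbitrary illustrative parameters

# Independent normal samples with the same mean and variance.
x = rng.normal(mu, sigma, n)
y = rng.normal(mu, sigma, n)

a = x + y  # A := X + Y
b = x - y  # B := X - Y

# Sample covariance and correlation of A and B; both should be near zero.
print("cov(A, B)  =", np.cov(a, b)[0, 1])
print("corr(A, B) =", np.corrcoef(a, b)[0, 1])
```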
 
steep said:
There are some problems with your use of MGFs. … Why not consider tackling this head-on by looking at covariance? …
You are correct. It was a typo. The MGF of $X+Y$ is $e^{2\mu t + \sigma^2 t^2}$, the MGF of $X-Y$ is $e^{\sigma^2 s^2}$, and the joint MGF of $(X+Y,\ X-Y)$ is their product $e^{2\mu t + \sigma^2 t^2}\,e^{\sigma^2 s^2}$.
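For completeness, the factorization follows directly from the independence of $X$ and $Y$:

$$M_{X+Y,\,X-Y}(t,s) = E\big[e^{t(X+Y)+s(X-Y)}\big] = E\big[e^{(t+s)X}\big]\,E\big[e^{(t-s)Y}\big] = e^{\mu(t+s)+\frac{\sigma^2(t+s)^2}{2}}\; e^{\mu(t-s)+\frac{\sigma^2(t-s)^2}{2}} = e^{2\mu t + \sigma^2 t^2}\; e^{\sigma^2 s^2} = M_{X+Y}(t)\,M_{X-Y}(s),$$

and a joint MGF that factors into the product of the marginal MGFs (on a neighborhood of the origin) is exactly the MGF characterization of independence.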
 