Two normal independent random variables

SUMMARY

The discussion centers on proving that the sum and difference of two independent normal random variables, X and Y, are independent of each other. The moment generating function (MGF) approach is discussed, with the correct MGF of X+Y being $e^{2\mu t + \sigma^2 t^2}$. The covariance approach is also highlighted, confirming that zero covariance implies independence for jointly normal variables. The analysis reveals that the initial MGF calculations contained errors, particularly in the placement of the variance term.

PREREQUISITES
  • Understanding of normal random variables and their properties
  • Knowledge of moment generating functions (MGFs)
  • Familiarity with covariance and its implications for independence
  • Basic calculus for evaluating expectations and variances
NEXT STEPS
  • Study the properties of moment generating functions in depth
  • Learn about covariance and its role in determining independence among random variables
  • Explore advanced calculus techniques for analyzing random variables
  • Investigate the implications of zero covariance in different types of distributions
USEFUL FOR

Statisticians, data scientists, and mathematicians interested in probability theory, particularly those working with normal distributions and their properties.

WMDhamnekar
Let X and Y be independent normal random variables, each with parameters $\mu$ and $\sigma^2$. I want to show that X+Y is independent of X-Y without using a Jacobian transformation.

Hint given by author:- Find their joint moment generating functions.

Answer: Now the MGF of $X+Y$ is $e^{2\mu t}+\sigma^2 t^2$ and of $X-Y$ is $1$. So the joint MGF of $X+Y$ and $X-Y$ is $e^{2\mu t}+ \sigma^2 t^2$. This indicates they are independent. Is there any other method in advanced calculus?
 
Dhamnekar Winod said:
Let X and Y be independent normal random variables, each with parameters $\mu$ and $\sigma^2$. I want to show that X+Y is independent of X-Y without using a Jacobian transformation.

Hint given by author:- Find their joint moment generating functions.

Answer: Now the MGF of $X+Y$ is $e^{2\mu t}+\sigma^2 t^2$ and of $X-Y$ is $1$. So the joint MGF of $X+Y$ and $X-Y$ is $e^{2\mu t}+ \sigma^2 t^2$. This indicates they are independent. Is there any other method in advanced calculus?

There are some problems with your use of MGFs. The $\sigma$ term should be inside the exponential function for $X+Y$, though you seem to have done this for $X-Y$. But that has another issue: $X-Y$ has zero mean but is still normal, and a zero-mean random variable has zero variance iff it is identically zero (almost surely). Your MGF implies zero variance, yet I can use a coin-tossing argument to show there is at least a 25% chance that $X-Y \gt 0$ and at least a 25% chance that $X-Y \lt 0$, which is a contradiction. (It's actually 50:50, but lower-bounding each at 25% is a much easier argument and gives the same contradiction.)
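For reference, the correct MGF of $X-Y$ follows directly from independence and the standard normal MGF $M_X(t) = e^{\mu t + \sigma^2 t^2/2}$:

$M_{X-Y}(t) = E\big[e^{t(X-Y)}\big] = M_X(t)\,M_Y(-t) = e^{\mu t + \sigma^2 t^2/2}\cdot e^{-\mu t + \sigma^2 t^2/2} = e^{\sigma^2 t^2}$

That is, $X-Y \sim N(0, 2\sigma^2)$: zero mean, but positive variance, consistent with the coin-tossing argument above.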

Why not consider tackling this head on by looking at covariance?

$A: = X+Y$
$B: = X-Y$

where $A$ and $B$ are normal random variables.

In general, independence implies zero covariance but not the other way around. However, zero covariance does imply independence for jointly normal random variables, and $A$ and $B$ here are jointly normal, being linear combinations of the independent normals $X$ and $Y$.

$\text{Cov}(A,B) $
$= \text{Cov}(X+Y, X-Y) = E\big[(X+Y)(X-Y)\big] - E\big[(X+Y)\big]E\big[(X-Y)\big] = E\big[X^2 -Y^2\big] - E\big[X+Y\big]E\big[X-Y\big]$
$ = E\big[X^2\big] -E\big[Y^2\big] - E\big[X+Y\big]\cdot 0 = 0$

(Since $X$ and $Y$ have the same mean and variance, they have the same second moment, so $E\big[X^2\big] - E\big[Y^2\big] = 0$; and $E\big[X - Y\big] = E\big[X\big] - E\big[Y\big] = 0$.)
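As a quick sanity check, the covariance computation above can be verified by simulation (an illustrative sketch; the parameter values below are arbitrary):

```python
import numpy as np

# Illustrative check that Cov(X+Y, X-Y) = 0 for i.i.d. normals.
# mu and sigma are arbitrary example values, not from the thread.
rng = np.random.default_rng(seed=0)
mu, sigma, n = 1.5, 2.0, 200_000

x = rng.normal(mu, sigma, size=n)
y = rng.normal(mu, sigma, size=n)

a = x + y  # A := X + Y
b = x - y  # B := X - Y

cov_ab = np.cov(a, b)[0, 1]  # sample covariance; should be near 0
print(f"sample Cov(A, B) = {cov_ab:.4f}")
print(f"sample Var(A)    = {a.var():.4f}  (theory: {2 * sigma**2})")
```

The sample covariance hovers near zero while both $A$ and $B$ keep variance $2\sigma^2$, matching the derivation.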
 
steep said:
There are some problems with your use of MGFs. [...] Why not consider tackling this head on by looking at covariance? [...]
You are correct; it was a typo. The MGF of $X+Y$ is $e^{2\mu t + \sigma^2 t^2}$, and the MGF of $X-Y$ is $e^{\sigma^2 t^2}$.
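For completeness, the joint MGF argument the hint points at can be carried through. With $A = X+Y$ and $B = X-Y$:

$M_{A,B}(s,t) = E\big[e^{sA + tB}\big] = E\big[e^{(s+t)X}\big]\,E\big[e^{(s-t)Y}\big]$
$= e^{\mu(s+t) + \sigma^2(s+t)^2/2}\cdot e^{\mu(s-t) + \sigma^2(s-t)^2/2} = e^{2\mu s + \sigma^2 s^2}\cdot e^{\sigma^2 t^2} = M_A(s)\,M_B(t)$

Since the joint MGF factors into the product of the marginal MGFs for all $s, t$, $A$ and $B$ are independent.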
 
