Two normal independent random variables

In summary: Using covariance is a valid way to show the independence, but the author appears to be hinting at joint moment generating functions as the alternative method.
  • #1
WMDhamnekar
Let X and Y be independent normal random variables, each having parameters $\mu$ and $\sigma^2$. I want to show that X+Y is independent of X-Y without using the Jacobian transformation.

Hint given by author: Find their joint moment generating functions.

Answer: Now the MGF of $X+Y$ is $e^{2\mu t}+\sigma^2 t^2$ and that of $X-Y$ is $1$. So the joint MGF of $X+Y$ and $X-Y$ is $e^{2\mu t}+ \sigma^2 t^2$. This indicates they are independent. Is there any other method in advanced calculus?
 
Last edited:
  • #2
Dhamnekar Winod said:
Let X and Y be independent normal random variables, each having parameters $\mu$ and $\sigma^2$. I want to show that X+Y is independent of X-Y without using the Jacobian transformation.

Hint given by author: Find their joint moment generating functions.

Answer: Now the MGF of $X+Y$ is $e^{2\mu t}+\sigma^2 t^2$ and that of $X-Y$ is $1$. So the joint MGF of $X+Y$ and $X-Y$ is $e^{2\mu t}+ \sigma^2 t^2$. This indicates they are independent. Is there any other method in advanced calculus?

There are some problems with your use of MGFs. The $\sigma^2 t^2$ term should be inside the exponential function for $X+Y$, though you seem to have done this for $X-Y$. But that has another issue: $X-Y$ is zero-mean but still normal, and a zero-mean random variable has zero variance iff it is identically zero (almost surely). Your MGF implies zero variance, yet a coin-tossing argument shows there is at least a 25% chance that $X-Y \gt 0$ and at least a 25% chance that $X-Y \lt 0$ (for instance, $P(X-Y \gt 0) \geq P(X \gt \mu)\,P(Y \leq \mu) = \tfrac{1}{4}$ by independence), which is a contradiction. (It's actually 50:50, but lower-bounding each side at 25% is a much easier argument and gives the same desired contradiction.)

Why not consider tackling this head on by looking at covariance?

$A: = X+Y$
$B: = X-Y$

where $A$ and $B$ are normal random variables.

In general, independence implies zero covariance but not the other way around. However, zero covariance does imply independence for jointly normal random variables, and $A$ and $B$ are jointly normal here because they are linear combinations of the independent normals $X$ and $Y$.

$\text{Cov}(A,B) = \text{Cov}(X+Y,\, X-Y)$
$= E\big[(X+Y)(X-Y)\big] - E\big[X+Y\big]E\big[X-Y\big]$
$= E\big[X^2 -Y^2\big] - E\big[X+Y\big]E\big[X-Y\big]$
$= E\big[X^2\big] -E\big[Y^2\big] - E\big[X+Y\big]\cdot 0 = 0$

(The facts that $X$ and $Y$ have the same first moment and the same variance imply that their second moments are equal, so $E\big[X^2\big] - E\big[Y^2\big] = 0$, and that
$E\big[X - Y\big] = E\big[X\big] - E\big[Y\big] = 0$.)
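
A quick numerical sanity check of this (a minimal NumPy sketch, not part of the argument above; the values $\mu = 1$, $\sigma = 2$ and the sample size are arbitrary illustrative choices):

```python
import numpy as np

# Monte Carlo check of the covariance computation above:
# sample two independent normals with the same mean and variance,
# form A = X + Y and B = X - Y, and estimate Cov(A, B).
rng = np.random.default_rng(0)
mu, sigma, n = 1.0, 2.0, 1_000_000  # arbitrary illustrative parameters

X = rng.normal(mu, sigma, n)
Y = rng.normal(mu, sigma, n)
A = X + Y
B = X - Y

# The off-diagonal entry of the sample covariance matrix should be
# close to 0, up to Monte Carlo noise of order sigma^2 / sqrt(n).
print(np.cov(A, B)[0, 1])
```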
 
Last edited:
  • #3
steep said:
There are some problems with your use of MGFs. The $\sigma^2 t^2$ term should be inside the exponential function for $X+Y$, though you seem to have done this for $X-Y$. But that has another issue: $X-Y$ is zero-mean but still normal, and a zero-mean random variable has zero variance iff it is identically zero (almost surely). Your MGF implies zero variance, yet a coin-tossing argument shows there is at least a 25% chance that $X-Y \gt 0$ and at least a 25% chance that $X-Y \lt 0$ (for instance, $P(X-Y \gt 0) \geq P(X \gt \mu)\,P(Y \leq \mu) = \tfrac{1}{4}$ by independence), which is a contradiction. (It's actually 50:50, but lower-bounding each side at 25% is a much easier argument and gives the same desired contradiction.)

Why not consider tackling this head on by looking at covariance?

$A: = X+Y$
$B: = X-Y$

where $A$ and $B$ are normal random variables.

In general, independence implies zero covariance but not the other way around. However, zero covariance does imply independence for jointly normal random variables, and $A$ and $B$ are jointly normal here because they are linear combinations of the independent normals $X$ and $Y$.

$\text{Cov}(A,B) = \text{Cov}(X+Y,\, X-Y)$
$= E\big[(X+Y)(X-Y)\big] - E\big[X+Y\big]E\big[X-Y\big]$
$= E\big[X^2 -Y^2\big] - E\big[X+Y\big]E\big[X-Y\big]$
$= E\big[X^2\big] -E\big[Y^2\big] - E\big[X+Y\big]\cdot 0 = 0$

(The facts that $X$ and $Y$ have the same first moment and the same variance imply that their second moments are equal, so $E\big[X^2\big] - E\big[Y^2\big] = 0$, and that
$E\big[X - Y\big] = E\big[X\big] - E\big[Y\big] = 0$.)
You are correct. It is a typo. The MGF of $X+Y$ is $e^{2\mu t+ \sigma^2 t^2}$, and the joint MGF of $X+Y$ and $X-Y$ is $e^{2\mu t + \sigma^2 t^2}$.
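
For reference, writing the joint MGF in two arguments $t_1$ and $t_2$ makes the author's hint explicit (a sketch using the standard normal MGF $E\big[e^{sX}\big]=e^{\mu s+\sigma^2 s^2/2}$ and the independence of $X$ and $Y$):

$M_{X+Y,\,X-Y}(t_1,t_2)=E\big[e^{t_1(X+Y)+t_2(X-Y)}\big]=E\big[e^{(t_1+t_2)X}\big]\,E\big[e^{(t_1-t_2)Y}\big]$
$=e^{\mu(t_1+t_2)+\frac{\sigma^2(t_1+t_2)^2}{2}}\cdot e^{\mu(t_1-t_2)+\frac{\sigma^2(t_1-t_2)^2}{2}}=e^{2\mu t_1+\sigma^2 t_1^2}\cdot e^{\sigma^2 t_2^2}=M_{X+Y}(t_1)\,M_{X-Y}(t_2),$

so the joint MGF factors into the product of the marginal MGFs, which is exactly the independence criterion the hint points to.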
 

1. What are two normal independent random variables?

Two normal independent random variables are two random variables that each follow a normal distribution and are independent of each other: the value of one variable carries no information about the value of the other.

2. How are two normal independent random variables related?

Two normal independent random variables are not statistically related to each other in any way: knowing the value of one does not change the distribution of the other.

3. What is the difference between normal and independent random variables?

"Normal" describes the distribution of a single random variable, while "independent" describes the relationship between two or more random variables: their joint distribution factors into the product of their marginal distributions. A normal distribution is a specific type of probability distribution that is often used to model real-world data.

4. Can two normal independent random variables have a correlation?

No, two normal independent random variables cannot have a nonzero correlation. Correlation measures the (linear) relationship between two variables, and independence implies zero covariance and therefore zero correlation.
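
As a rough illustration (a hypothetical NumPy snippet, not from the thread; the parameters are arbitrary), the sample correlation of two independently generated normal samples comes out near zero, up to sampling noise:

```python
import numpy as np

# Two independently generated normal samples: their sample correlation
# should be close to 0 (not exactly 0, because of sampling noise).
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 100_000)
y = rng.normal(0.0, 1.0, 100_000)

print(np.corrcoef(x, y)[0, 1])  # roughly on the order of 1/sqrt(100000)
```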

5. How are two normal independent random variables used in statistics?

In statistics, two normal independent random variables are used to model and analyze data. They can be used to make predictions, test hypotheses, and estimate probabilities in various scenarios. They are also commonly used in regression analysis and ANOVA (analysis of variance).
