# Bivariate normal

## Homework Statement

1. Show that $$X$$ is independent of $$Y - \alpha X$$, where $$|\alpha| < 1$$.

2. Find the joint distribution of $$F = X$$ and $$G = X + Y$$, where $$X, Y, F, G$$ are random variables.

## Homework Equations

The vector $$W = (X, Y)^{t}$$ is a 2x1 multivariate Gaussian random vector with zero mean and covariance matrix equal to $$\Sigma$$, where $$\Sigma_{1,1}=1,\Sigma_{1,2} = \alpha, \Sigma_{2,1} = \alpha, \Sigma_{2,2} = 1.$$

where $$|\alpha| < 1$$

## The Attempt at a Solution

1.$$cov(X,Y- \alpha X) = E(XY)- \alpha E(X^{2}) - E(X)E(Y) + \alpha E(X)E(X) = E(XY) - \alpha E(X^{2}) = 0?$$
This would prove what I want, but I cannot get the last two terms to zero. I figured that if I could show $$E(X^{2})=0$$, I could use the Cauchy–Schwarz inequality to prove the other term is zero, but I can't get there.

2. Not sure how to start. Any reference that you think might be helpful would be greatly appreciated. Thanks.


haruspex
Homework Helper
Gold Member
2020 Award
1.$$cov(X,Y- \alpha X) = E(XY)- \alpha E(X^{2}) - E(X)E(Y) + \alpha E(X)E(X) = E(XY) - \alpha E(X^{2}) = 0?$$
You cancelled the wrong pair of terms in that last step, but it still won't lead to the desired answer.
You must have been told something about Y, but it cannot be simply that it is independent of X. E.g. you could set Y deterministic (always taking the same value) and Y-αX would then clearly not be independent of X.
If your 'relevant equations' are indeed relevant then X is a vector, so you need a slightly fancier form of the definition of covariance, involving transposes.

No terms were cancelled, X and Y have mean of zero.

And Y is not independent of X as indicated by the covariance matrix.

X is not a vector; sorry for the confusion. I renamed the vector to W.

That is all the information given.

Also, it seems you are talking about conditional expectation when you speak of Y being deterministic? I'm not sure how the dependence is clear to you.

haruspex
Your edit certainly cleared things up.
So I agree now with your expression for cov(X,Y−αX). What do you get if you write out the covariance matrix for W and plug in the given values?

$$cov(X,Y) = E(XY) - E(X)E(Y) = E(XY) - 0 = E(XY)$$
and
$$var(X) = E(X^{2}) - E(X)E(X) = E(X^{2}) - 0 = E(X^{2})$$

where

$$cov(X,Y) = \Sigma_{1,2} = \alpha$$
and
$$var(X) = \Sigma_{1,1} = 1$$

and this makes
$$cov(X,Y-\alpha X) = \alpha - \alpha \cdot 1 = 0,$$
which implies independence, since $$X$$ and $$Y-\alpha X$$ are jointly Gaussian and uncorrelated jointly Gaussian variables are independent.
Does this look right?

Thanks.
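As a side note, the zero-covariance computation above is easy to sanity-check by simulation. A minimal sketch (assuming NumPy; the value $$\alpha = 0.6$$ is an arbitrary illustrative choice, not from the problem):

```python
# Simulation check of part 1: with W = (X, Y)^t zero-mean bivariate
# normal, var(X) = 1 and cov(X, Y) = alpha, the covariance
# cov(X, Y - alpha*X) = alpha - alpha*1 should vanish.
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.6  # arbitrary illustrative value with |alpha| < 1
Sigma = np.array([[1.0, alpha],
                  [alpha, 1.0]])

# Draw many samples of W = (X, Y)^t
W = rng.multivariate_normal(mean=[0.0, 0.0], cov=Sigma, size=200_000)
X, Y = W[:, 0], W[:, 1]

# Empirical covariance between X and Y - alpha*X; should be near 0
c = np.cov(X, Y - alpha * X)[0, 1]
print(c)
```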

haruspex
Yes.

Awesome, thanks for the help. Now, where might I find a good reference for my second problem?

Is this thought process right?

1. Integrate out $$Y$$ to get the marginal of $$X$$, which gives the distribution of $$F$$.
2. Do a convolution to get the distribution of $$G = X+Y$$.
3. How do I recover the joint distribution from the marginals?

haruspex
Are these the same X and Y as in the first part or arbitrary?

They are the same X and Y as in the first part.

1 and 2 are totally separate. There is another question about conditional expectation that goes with 2, but I think I can get that part with a little help on 2.

Thanks for the help.
