Solve Bivariate Normal: X,Y,G,F Homework

  • Thread starter: autobot.d
  • Tags: Normal

Homework Help Overview

The discussion revolves around a problem involving bivariate normal random variables X and Y, specifically addressing their independence and joint distribution with respect to derived variables F and G. The participants explore the implications of a covariance matrix and the conditions under which independence can be established.

Discussion Character

  • Conceptual clarification, Mathematical reasoning, Problem interpretation

Approaches and Questions Raised

  • Participants examine the covariance between $X$ and $Y - \alpha X$, questioning the cancellation of terms and the implications of the covariance matrix. There is also exploration of how to derive the joint distribution of $F$ and $G$ from the original variables.

Discussion Status

The discussion is active, with participants providing insights and corrections regarding the covariance calculations. Some guidance has been offered on the independence condition, while questions remain about the relationship between the two parts of the problem and the methods for finding the joint distribution.

Contextual Notes

There is a noted lack of explicit information regarding the nature of Y and its relationship to X, which is central to the independence question. Additionally, the participants are navigating the distinction between the two parts of the problem, which may lead to confusion regarding their interdependence.

autobot.d

Homework Statement


1. Show that $X$ is independent of $Y - \alpha X$,

where $|\alpha| < 1$.

2. Find the joint distribution of $F = X$ and $G = X + Y$.

$X$, $Y$, $F$, $G$ are random variables.


Homework Equations


The vector $W = (X, Y)^{t}$ is a $2 \times 1$ multivariate Gaussian random vector with zero mean and covariance matrix

$$\Sigma = \begin{pmatrix} 1 & \alpha \\ \alpha & 1 \end{pmatrix},$$

where $|\alpha| < 1$.

The Attempt at a Solution


1. $\operatorname{cov}(X, Y - \alpha X) = E(XY) - \alpha E(X^{2}) - E(X)E(Y) + \alpha E(X)E(X) = E(XY) - \alpha E(X^{2}) = 0$?
This would prove what I want, but I cannot get the last two terms to zero. I figured that if I got $E(X^{2}) = 0$ I could use the Cauchy–Schwarz inequality to show the other term is zero, but I can't get there.

2. Not sure how to start. Any reference that you think might be helpful would be greatly appreciated. Thanks.
 
autobot.d said:
1. $\operatorname{cov}(X, Y - \alpha X) = E(XY) - \alpha E(X^{2}) - E(X)E(Y) + \alpha E(X)E(X) = E(XY) - \alpha E(X^{2}) = 0$?
You canceled the wrong pair of terms in that last step, but it still won't lead to the desired answer.
You must have been told something about $Y$, but it cannot simply be that it is independent of $X$. E.g. you could set $Y$ deterministic (always taking the same value), and $Y - \alpha X$ would then clearly not be independent of $X$.
If your 'relevant equations' are indeed relevant, then $X$ is a vector, so you need a slightly fancier form of the definition of covariance, involving transposes.
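For reference, the vector form of the definition alluded to here is the standard identity

$$\operatorname{Cov}(W) = E\!\left[(W - E[W])(W - E[W])^{t}\right] = E[W W^{t}] - E[W]\,E[W]^{t}.$$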
 
No terms were cancelled; $X$ and $Y$ have mean zero.

And $Y$ is not independent of $X$, as indicated by the covariance matrix.

$X$ is not a vector; sorry for the confusion. I renamed the vector to $W$.

That is all the information given.

Also, it seems you are talking about conditional expectation when you speak of $Y$ being deterministic? I'm not sure how the dependence is clear to you.
 
Your edit certainly cleared things up :approve:.
So I agree now with your expression for $\operatorname{cov}(X, Y - \alpha X)$. What do you get if you write out the covariance matrix for $W$ and plug in the given values?
 
How about this:

$\operatorname{cov}(X, Y) = E(XY) - E(X)E(Y) = E(XY) - 0 = E(XY)$
and
$\operatorname{var}(X) = E(X^{2}) - E(X)E(X) = E(X^{2}) - 0 = E(X^{2}),$

where

$\operatorname{cov}(X, Y) = \Sigma_{1,2} = \alpha$
and
$\operatorname{var}(X) = \Sigma_{1,1} = 1,$

and this makes
$\operatorname{cov}(X, Y - \alpha X) = E(XY) - \alpha E(X^{2}) = \alpha - \alpha \cdot 1 = 0,$
which implies independence, since $X$ and $Y - \alpha X$ are jointly Gaussian.
Does this look right?

Thanks.
 
Yes.
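As a numerical sanity check of the calculation above, here is a minimal simulation sketch (NumPy, with an arbitrary example value of $\alpha$; not part of the original thread):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.6  # arbitrary example value with |alpha| < 1

# Covariance matrix of W = (X, Y)^t from the problem statement
Sigma = np.array([[1.0, alpha],
                  [alpha, 1.0]])

# Draw samples of the zero-mean bivariate Gaussian vector W
W = rng.multivariate_normal(mean=[0.0, 0.0], cov=Sigma, size=1_000_000)
X, Y = W[:, 0], W[:, 1]

# The sample covariance of X and Y - alpha*X should be near zero
Z = Y - alpha * X
print(np.cov(X, Z)[0, 1])  # ~0 up to Monte Carlo error
```

Re-running with a different seed or a different $\alpha$ should keep the estimate near zero, with the error shrinking as the sample size grows.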
 
Awesome, thanks for the help. Now, where might I find a good reference for my second problem?

Is this thought process right?

1. Integrate out $Y$ to get the marginal of $X$, which gives the distribution of $F$.
2. Do a convolution to get the distribution of $G = X + Y$.
3. How do I recover the joint distribution from the marginals? (See the sketch below.)
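A note on step 3: marginal distributions alone do not determine a joint distribution. Since $(F, G)^{t} = AW$ is a linear transformation of the Gaussian vector $W$, however, $(F, G)$ is itself bivariate normal with mean zero and covariance $A \Sigma A^{t}$; this is a standard fact about multivariate normals, not something worked out in the thread. A minimal sketch of the computation (NumPy, with an arbitrary example value of $\alpha$):

```python
import numpy as np

alpha = 0.6  # arbitrary example value with |alpha| < 1

# Covariance of W = (X, Y)^t from the problem statement
Sigma = np.array([[1.0, alpha],
                  [alpha, 1.0]])

# F = X and G = X + Y, i.e. (F, G)^t = A W
A = np.array([[1.0, 0.0],
              [1.0, 1.0]])

# A linear map of a Gaussian vector is Gaussian, so (F, G) is
# bivariate normal with mean A @ 0 = 0 and covariance A Sigma A^t
Sigma_FG = A @ Sigma @ A.T
print(Sigma_FG)
# Symbolically: [[1, 1 + alpha], [1 + alpha, 2 + 2*alpha]]
```

The printed matrix matches the entrywise calculation: $\operatorname{var}(F) = 1$, $\operatorname{cov}(F, G) = 1 + \alpha$, and $\operatorname{var}(G) = 2 + 2\alpha$.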
 
Are these the same $X$ and $Y$ as in the first part, or arbitrary?
 
They are the same $X$ and $Y$ values.

Parts 1 and 2 are totally separate. There is another question about conditional expectation that goes with part 2, but I think I can get that with a little help on part 2.


Thanks for the help.
 
autobot.d said:
They are the same $X$ and $Y$ values.

Parts 1 and 2 are totally separate.
Now I'm confused! If they are the same $X$ and $Y$ as in part 1, how can parts 1 and 2 be totally separate?
 
