Solve Bivariate Normal: X,Y,G,F Homework

  • Thread starter: autobot.d
  • Tags: Normal
SUMMARY

The discussion revolves around a homework problem on showing that X is independent of Y - \alpha X, where |\alpha| < 1, and on finding the joint distribution of F = X and G = X + Y. The covariance matrix \Sigma shows that X and Y themselves are not independent, because of the non-zero covariance \Sigma_{1,2} = \alpha. The participants work out that cov(X, Y - \alpha X) = 0, which establishes independence since the variables are jointly Gaussian. They also discuss how to derive the joint distribution of F and G, with the original poster proposing marginalization and convolution techniques.

PREREQUISITES
  • Understanding of multivariate Gaussian distributions
  • Knowledge of covariance and variance calculations
  • Familiarity with convolution of probability distributions
  • Basic concepts of independence in probability theory
NEXT STEPS
  • Study the properties of multivariate Gaussian distributions
  • Learn about covariance matrices and their implications for independence
  • Explore convolution techniques for combining probability distributions
  • Investigate conditional expectation and its applications in probability
USEFUL FOR

Students and researchers in statistics, probability theory, or data science who are working on problems involving multivariate distributions and independence of random variables.

autobot.d

Homework Statement


1. Show that X is independent of Y - \alpha X

where |\alpha| < 1

2. Find the joint distribution of F = X and G = X + Y

X, Y, F, G are random variables.


Homework Equations


The vector W = (X, Y)^{t} is a 2x1 multivariate Gaussian random vector with zero mean and covariance matrix equal to \Sigma, where \Sigma_{1,1}=1,\Sigma_{1,2} = \alpha, \Sigma_{2,1} = \alpha, \Sigma_{2,2} = 1.

where |\alpha| < 1
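
Written out as a matrix, the given information is

\Sigma = \begin{pmatrix} 1 & \alpha \\ \alpha & 1 \end{pmatrix}, \qquad E(X) = E(Y) = 0,

so var(X) = var(Y) = 1 and cov(X,Y) = \alpha.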

The Attempt at a Solution


1. cov(X, Y - \alpha X) = E(XY) - \alpha E(X^{2}) - E(X)E(Y) + \alpha E(X)E(X) = E(XY) - \alpha E(X^{2}) = 0?
This would prove what I want, but I cannot get the last two terms to zero. I figured that if I could get E(X^{2}) = 0 I could use the Cauchy-Schwarz inequality to show the other term is zero, but I can't get there.

2. Not sure how to start. Any reference that you think might be helpful would be greatly appreciated. Thanks.
 
autobot.d said:
1. cov(X, Y - \alpha X) = E(XY) - \alpha E(X^{2}) - E(X)E(Y) + \alpha E(X)E(X) = E(XY) - \alpha E(X^{2}) = 0?
You canceled the wrong pair of terms in that last step, but it still won't lead to the desired answer.
You must have been told something about Y, but it cannot be simply that it is independent of X. E.g. you could set Y deterministic (always taking the same value) and Y-αX would then clearly not be independent of X.
If your 'relevant equations' are indeed relevant then X is a vector, so you need a slightly fancier form of the definition of covariance, involving transposes.
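
To spell out the deterministic example in that reply: if Y always took the same value c, then

Y - \alpha X = c - \alpha X,

which is a deterministic function of X (with cov(X, c - \alpha X) = -\alpha \, var(X) \neq 0 for \alpha \neq 0), so it could not be independent of X.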
 
No terms were cancelled; X and Y have mean zero.

And Y is not independent of X as indicated by the covariance matrix.

X is not a vector, sorry for the confusion; I renamed the vector to W.

That is all the information given.

Also, it seems you are talking about using conditional expectation when you speak of Y being deterministic? I'm not sure how the dependence is clear to you.
 
Your edit certainly cleared things up.
So I agree now with your expression for cov(X,Y−αX). What do you get if you write out the covariance matrix for W and plug in the given values?
 
How about this:

cov(X,Y) = E(XY) - E(X)E(Y) = E(XY) - 0 = E(XY)
and
var(X) = E(X^{2}) - E(X)E(X) = E(X^{2}) - 0 = E(X^{2})

where

cov(X,Y) = \Sigma_{1,2} = \alpha
and
var(X) = \Sigma_{1,1} = 1

and this makes
cov(X,Y-\alpha X) = 0
which implies independence.
Does this look right?

Thanks.
 
Yes.
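
A quick numerical sanity check of that conclusion (a minimal sketch; the seed, sample size, and the choice \alpha = 0.6 are illustrative and not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.6  # any value with |alpha| < 1

# Covariance matrix of W = (X, Y): unit variances, cov(X, Y) = alpha,
# exactly as given in the problem statement.
Sigma = np.array([[1.0, alpha],
                  [alpha, 1.0]])
samples = rng.multivariate_normal(mean=[0.0, 0.0], cov=Sigma, size=200_000)
X, Y = samples[:, 0], samples[:, 1]

# The sample covariance of X and Y - alpha*X should be close to 0,
# matching cov(X, Y) - alpha*var(X) = alpha - alpha * 1 = 0.
print(np.cov(X, Y - alpha * X)[0, 1])
```

Since (X, Y - \alpha X) is a linear transformation of the Gaussian vector W, it is itself jointly Gaussian, and for jointly Gaussian variables zero covariance really does imply independence; that last step is what makes the covariance computation sufficient.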
 
Awesome, thanks for the help. Now, where might I find a good reference for my second problem?

Is this thought process right?

1. Integrate out Y to get the marginal of X, which gives the distribution of F.
2. Do a convolution to get the distribution of G = X + Y.
3. How do I recover the joint distribution from the marginals?
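
For step 3 in particular, it may help to note a standard fact that the thread does not spell out: (F, G) = (X, X + Y) is a linear transformation of the Gaussian vector W, and a linear transformation of a jointly Gaussian vector is again jointly Gaussian. Writing

\begin{pmatrix} F \\ G \end{pmatrix} = A W, \qquad A = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix},

gives a bivariate normal with zero mean and covariance

A \Sigma A^{t} = \begin{pmatrix} 1 & 1 + \alpha \\ 1 + \alpha & 2 + 2\alpha \end{pmatrix},

so the joint distribution of (F, G) is determined directly, without assembling it from the two marginals.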
 
Are these the same X and Y as in the first part or arbitrary?
 
They are the same X and Y values.

1 and 2 are totally separate. There is another question about conditional expectation that goes with 2, but I think I can get that part with a little help on 2.


Thanks for the help.
 
autobot.d said:
They are the same X and Y values.

1 and 2 are totally separate.
Now I'm confused! If they are the same X and Y as in part 1, how can parts 1 and 2 be totally separate?
 
