
Problem with calculating the cov matrix of X,Y

  1. Jun 10, 2015 #1

    ChrisVer

    Gold Member

    If I have two random variables [itex]X, Y[/itex] that are given from the following formula:
    [itex] X= \mu_x \big(1 + G_1(0, \sigma_1) + G_2(0, \sigma_2) \big) [/itex]
    [itex] Y= \mu_y \big(1 + G_3(0, \sigma_1) + G_2(0, \sigma_2) \big)[/itex]

    where the [itex]G_i(\mu, \sigma)[/itex] are independent Gaussian random variables, here with mean [itex]\mu=0[/itex] and some standard deviation.

    How can I find the covariance matrix of those two?

    I guess the variance will be given by:
    [itex]Var(X) = \mu_x^2 (\sigma_1^2+ \sigma_2^2)[/itex] and similarly for Y. But I don't know how to work out the covariance.
    Could I define another variable [itex]Z=X+Y[/itex] and find the covariance from [itex]Var(Z)= Var(X)+Var(Y) +2 Cov(X,Y)[/itex]?
    while [itex]Z[/itex] will be given by [itex]Z= (\mu_x+\mu_y) (1+ G_1 + G_2) [/itex]?

    Then [itex]Var(Z)= (\mu_x+ \mu_y)^2 (\sigma_1^2+ \sigma_2^2)[/itex]

    And [itex]Cov(X,Y) = \dfrac{(\mu_x+\mu_y)^2(\sigma_1^2+ \sigma_2^2)- \mu_x^2 (\sigma_1^2+ \sigma_2^2) - \mu_y^2(\sigma_1^2+ \sigma_2^2) }{2}=\mu_x \mu_y(\sigma_1^2+ \sigma_2^2) [/itex]

    Is my logic correct? I'm not sure about the Z and whether it's given by that formula.
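One way to sanity-check a candidate formula like this is a quick Monte Carlo estimate. The sketch below (all parameter values are arbitrary choices, just for the check) draws ##G_1, G_3## independently with std ##\sigma_1## and a single shared ##G_2## with std ##\sigma_2##, then compares the sample covariance against the two candidate expressions:

```python
# Monte Carlo sketch: estimate Cov(X, Y) for
#   X = mu_x * (1 + g1 + g2),  Y = mu_y * (1 + g3 + g2)
# with g1, g3 ~ N(0, sigma1) independent and g2 ~ N(0, sigma2) shared.
# All parameter values are arbitrary choices for the check.
import random

random.seed(0)
mu_x, mu_y = 2.0, 3.0
sigma1, sigma2 = 0.1, 0.2
N = 400_000

xs, ys = [], []
for _ in range(N):
    g1 = random.gauss(0.0, sigma1)
    g3 = random.gauss(0.0, sigma1)
    g2 = random.gauss(0.0, sigma2)   # common to both X and Y
    xs.append(mu_x * (1.0 + g1 + g2))
    ys.append(mu_y * (1.0 + g3 + g2))

mean_x = sum(xs) / N
mean_y = sum(ys) / N
# population-style sample covariance
cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / N

print("sample Cov(X,Y):               ", cov_xy)
print("mu_x*mu_y*sigma2^2:            ", mu_x * mu_y * sigma2**2)
print("mu_x*mu_y*(sigma1^2+sigma2^2): ", mu_x * mu_y * (sigma1**2 + sigma2**2))
```

Comparing the sample estimate against the two printed expressions shows directly whether the ##Z## argument holds.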
     
    Last edited: Jun 10, 2015
  3. Jun 10, 2015 #2

    Orodruin

    Staff Emeritus
    Science Advisor
    Homework Helper
    Gold Member

    Why not simply apply the definition of covariance? Or the reduced form ##E(XY)-E(X)E(Y)##?
     
  4. Jun 11, 2015 #3

    ChrisVer


    I don't know E[XY]...?
     
  5. Jun 11, 2015 #4

    Orodruin


    But you know the distributions of ##X## and ##Y## and so you can compute it.
     
  6. Jun 11, 2015 #5

    ChrisVer


    I don't know the joint distribution function...
    so that [itex]E[xy]= \int dxdy~ h(x,y) xy[/itex]
    And I can't write [itex]h(x,y)=f(x)g(y)[/itex], since I don't know whether X, Y are independent... If they were independent they would have no covariance either: then [itex]E[XY]= E[X]E[Y][/itex] and your reduced-form formula would make the covariance vanish.
     
  7. Jun 11, 2015 #6

    Orodruin


    You do. The only reason I can see to number the ##G##s is to underline that they are independent, so that ##X## and ##Y## have a common part ##G_2## and one individual part each, ##G_1##/##G_3##.
     
  8. Jun 11, 2015 #7

    ChrisVer


    Yes, that's the reason for labeling them... It just happens that ##G_1, G_3## have the same arguments ##\mu=0, \sigma_1##, but they are not common to X and Y. However, ##G_2## is a common source of uncertainty in both X and Y... Still I don't understand how to get the joint probability from this... Is taking the intersection of X, Y leading to only the ##G_2##? That doesn't seem right either.
     
  9. Jun 11, 2015 #8

    Orodruin


    I suggest you start from the three-dimensional distribution of the ##G_i##. From there you can integrate over regions of constant ##X## and ##Y## to obtain the joint pdf for ##X## and ##Y##. In reality, it does not even have to be that hard. Just consider ##X## and ##Y## as functions on the three-dimensional outcome space of the ##G_i## and use what you know about those, e.g., ##E(G_1G_2) = 0## etc.
     
  10. Jun 11, 2015 #9

    Stephen Tashi

    Science Advisor

    To keep the discussion clear, you should use correct notation. If "[itex] X [/itex]" is a random variable then [itex] X [/itex] has a distribution, but it is not "equal" to its distribution. I think what you mean is that:

    [itex] X = \mu_x( X_1 + X_2) [/itex]
    [itex] Y = \mu_y( X_3 + X_2 )[/itex]

    where [itex] X_i [/itex] is a random variable with Gaussian distribution [itex] G(0,\sigma_i) [/itex].
     
  11. Jun 11, 2015 #10

    ChrisVer


    No, X is a random variable, drawn from a distribution with:
    mean ##\mu_x## (that's the role of the 1),
    a measurement uncertainty following a Gaussian, ##G_1## (or ##G_3## for Y),
    and a further common measurement uncertainty, ##G_2##.

    That means I could take 5 measurements of X: ##\{x_1,x_2,x_3,x_4,x_5 \}=\mu_x+ \{ + 0.02, + 0.001, 0, - 0.01, - 0.06\}##.
     
  12. Jun 11, 2015 #11

    ChrisVer


    Ahhh OK, I see what you meant to say... Yes, that's fine; I understand what I wrote, I just may not have written it correctly (confusing random variables with distributions)...
     
  13. Jun 11, 2015 #12

    Stephen Tashi


    Apply the formula for [itex]\sigma(aX + bY, cW + dV) [/itex] given in the Wikipedia article on Covariance: http://en.wikipedia.org/wiki/Covariance

    In your case, [itex] a = b = \mu_x,\ X=X_1,\ Y= X_2,\ c=d=\mu_y,\ W = X_3,\ V = X_2 [/itex]

    Edit: You'll have to put the constant 1 in somewhere. You could use ##X = 1 + X_1##.
     
  14. Jun 11, 2015 #13

    ChrisVer


    I also thought about this:
    writing the covariance as [itex] \sigma_{XY} = \sigma_X \sigma_Y \rho[/itex] with [itex]\rho[/itex] the correlation coefficient.

    And since [itex]X= \mu_x (1+ X_1 + X_2)[/itex] and [itex]Y = \mu_y (1 + X_3 + X_2) [/itex], I think these two are linearly correlated (due to [itex]X_2[/itex]), so [itex]\rho>0[/itex]. Would you consider this a logical statement? I mean, if [itex]X_2[/itex] happens to take a larger value, both [itex]X[/itex] and [itex]Y[/itex] get a larger contribution.
    For the value of [itex]\rho[/itex], I guess it should (by the same logic) be given by some combination of [itex]\mu_x, \mu_y[/itex], since they set how differently [itex]X, Y[/itex] change with a change in [itex]X_2[/itex]. I mean, if [itex]\mu_x > \mu_y[/itex] then [itex]X[/itex] gets a larger contribution from the same [itex]X_2[/itex] than [itex]Y[/itex] does, and vice versa for [itex]\mu_x<\mu_y[/itex]... So I guess it should be [itex]X(X_2)=\frac{\mu_x}{\mu_y} Y(X_2)[/itex]?
     
    Last edited: Jun 11, 2015
  15. Jun 11, 2015 #14

    Orodruin


    I still think you are overcomplicating things. Do you know how to compute ##E(X_iX_j)## when ##X_i## has a Gaussian distribution with mean zero?
     
  16. Jun 11, 2015 #15

    ChrisVer


    In general I'd know, but as I said I have difficulty finding the joint pdf...

    Are you saying that [itex]\mu_i (1+G_2)[/itex] in my notation gives the joint pdf (after integrating out ##G_1## and ##G_3##)?
     
  17. Jun 11, 2015 #16

    Orodruin


    You do not need to find the joint pdf. You can work directly with the Gaussians!
     
  18. Jun 11, 2015 #17

    ChrisVer


    Then in that case I don't know how to find ##E[X_i X_j]##....
    The formula I know defines the expectation value through an integral with the pdf...:sorry:
     
  19. Jun 11, 2015 #18

    Orodruin


    That one is simple: it is the expectation value of the product of two independent Gaussians with zero mean. Since they are independent, the pdf factorises ...

    Edit: ... unless, of course, i = j ...
     
  20. Jun 11, 2015 #19

    ChrisVer


    So you suggest something like:
    [itex]E[X_i X_j] = \begin{cases} E[X_i]E[X_j] = \mu_i \mu_j & i \ne j \\ E[X_i^2]= \sigma_i^2 + \mu_i^2 & i=j \end{cases}[/itex]

    where the [itex]X_i[/itex] are Gaussian-distributed variables.
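Both cases above can be checked numerically; here is a minimal Monte Carlo sketch (the parameter values are arbitrary, chosen only for the check):

```python
# Quick Monte Carlo check of the two cases:
#   E[X_i X_j] = mu_i * mu_j          for independent X_i, X_j (i != j)
#   E[X_i^2]   = sigma_i^2 + mu_i^2   for i = j
import random

random.seed(1)
mu_i, sigma_i = 1.5, 0.3
mu_j, sigma_j = -0.5, 0.7
N = 200_000

xi = [random.gauss(mu_i, sigma_i) for _ in range(N)]
xj = [random.gauss(mu_j, sigma_j) for _ in range(N)]

e_cross = sum(a * b for a, b in zip(xi, xj)) / N   # i != j case
e_square = sum(a * a for a in xi) / N              # i == j case

print("E[Xi Xj] ~", e_cross, " expected:", mu_i * mu_j)
print("E[Xi^2]  ~", e_square, " expected:", sigma_i**2 + mu_i**2)
```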
     
  21. Jun 11, 2015 #20

    Orodruin


    Indeed.

    Edit: Of course, you now have ##E[(1+X_1+X_2)(1+X_3+X_2)]## so you will have to do some algebra, but not a big deal.
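Carrying out that algebra as a sketch (using ##E[X_i]=0## and ##E[X_iX_j]=0## for ##i\neq j##, with ##X_2 \sim G(0,\sigma_2)## the shared term, as in the original post):

```latex
\begin{align*}
E[XY] &= \mu_x\mu_y\, E\big[(1+X_1+X_2)(1+X_3+X_2)\big] \\
      &= \mu_x\mu_y\big(1 + E[X_2^2]\big)
       = \mu_x\mu_y\,(1+\sigma_2^2)
       \quad\text{(all first moments and cross terms vanish)},\\
E[X]\,E[Y] &= \mu_x\mu_y, \\
\operatorname{Cov}(X,Y) &= E[XY]-E[X]\,E[Y] = \mu_x\mu_y\,\sigma_2^2 .
\end{align*}
```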
     