
Help on Covariance

  Jun 15, 2011 #1
    I need to find an approximation of the covariance of a function of a random variable.

    [itex]\Theta_1 = \log[p_1/(1-p_1)][/itex] where [itex]p_1[/itex] is binomial
    [itex]\Theta_2 = \log[p_2/(1-p_2)][/itex] where [itex]p_2[/itex] is binomial

    I need to find the covariance of [itex]\Theta_1[/itex] and [itex]\Theta_2[/itex].

    Please- any help will be greatly appreciated
     
  Jun 15, 2011 #2

    Without more information, the covariance of two random variables can only be bounded, via the Cauchy-Schwarz inequality:

    [tex]|Cov(X,Y)|\leq \sqrt{Var(X)Var(Y)}[/tex]

    Now what is the variance of X and Y if they have a binomial distribution?
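    As a quick numerical sketch of that bound (the binomial parameters n1 = 20, p1 = 0.3 and n2 = 50, p2 = 0.6 are made up for illustration):

[code]
import numpy as np

rng = np.random.default_rng(0)

# Made-up binomial parameters, just for illustration
n1, p1 = 20, 0.3
n2, p2 = 50, 0.6

# For a binomial random variable, Var = n p (1 - p)
var_x = n1 * p1 * (1 - p1)
var_y = n2 * p2 * (1 - p2)
bound = np.sqrt(var_x * var_y)

# The sample covariance of the draws (here independent) must respect the bound
x = rng.binomial(n1, p1, size=100000)
y = rng.binomial(n2, p2, size=100000)
cov_xy = np.cov(x, y)[0, 1]

print(f"|Cov(X,Y)| = {abs(cov_xy):.3f} <= sqrt(Var(X)Var(Y)) = {bound:.3f}")
[/code]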
     
    Last edited: Jun 16, 2011
  Jun 16, 2011 #3

    mathman

    If the p's are independent, the covariance is 0, since the Θ's are independent.
     
  Jun 17, 2011 #4

    What I am really looking for is an expression for the covariance when I have a function of a random variable instead of just the random variable itself. The function is the log function in my original question. I know there is such an expression for the variance of a function of a random variable, and the result involves the first derivative of that function, but I am confused about how that works for covariance.

    Thanks
     
  Jun 17, 2011 #5

    Stephen Tashi


    How are those formulas going to make sense if the p's are binomial random variables? A binomial random variable takes values in the set {0, 1, 2, ..., N}. Suppose p1 = 3; then p1/(1-p1) = -3/2. Won't you be trying to take the log of a negative number?

    My guess at what you're trying to say is this:

    Let X and Y be random variables. Let f and g be functions of a single variable and let the random variables F and G be defined as F= f(X) and G = g(Y). What is a method for expressing COV(F,G) in terms of COV(X,Y) ?

    An even more general question is:

    Let X and Y be random variables. Let f and g be functions of two variables and let the random variables F and G be defined as F= f(X,Y) and G = g(X,Y). What is a method for expressing COV(F,G) in terms of COV(X,Y) ?

    I don't claim to know the answer to that question. One thought is to use power series expansions.

    Let [itex] M = E( F^k G^j) [/itex] be a moment of the joint distribution of [itex] (F,G) [/itex]. Expand the function [itex] F^k G^j [/itex] as a power series in X and Y. Then the expectation [itex] E( F^k G^j) [/itex] becomes a sum of expectations of the form [itex] E( C X^r Y^s) [/itex], where [itex] C [/itex] is a constant that involves evaluations of partial derivatives of [itex] f [/itex] and [itex] g [/itex].

    The particular case of COV(F,G) involves the particular moments E(F), E(G) and E(FG).
    We can work out what the power series method says in that case. (Or if we are lucky, some other keen forum member will do it for us!)
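    Here is a sketch of that expansion for the particular moment [itex] E(FG) [/itex], done in sympy. Truncating at second order is my choice, and the symbols mu_x, mu_y stand for the means:

[code]
import sympy as sp

x, y, mx, my = sp.symbols('x y mu_x mu_y')
f = sp.Function('f')
g = sp.Function('g')

H = f(x) * g(y)
dx, dy = x - mx, y - my
at_mean = {x: mx, y: my}

# Second-order Taylor expansion of H about (mu_x, mu_y);
# each derivative is evaluated at the expansion point via subs
expansion = (H.subs(at_mean)
             + sp.diff(H, x).subs(at_mean) * dx
             + sp.diff(H, y).subs(at_mean) * dy
             + sp.diff(H, x, y).subs(at_mean) * dx * dy
             + sp.Rational(1, 2) * sp.diff(H, x, 2).subs(at_mean) * dx**2
             + sp.Rational(1, 2) * sp.diff(H, y, 2).subs(at_mean) * dy**2)

# Taking expectations term by term: E[dx] = E[dy] = 0, E[dx*dy] = Cov(x,y),
# E[dx**2] = Var(x), E[dy**2] = Var(y), which gives constants times moments
# of x and y, as described above.
print(expansion)
[/code]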
     
  Jun 17, 2011 #6
    Yes.
    If X and Y are random variables and H(X) and G(Y) are functions of those random variables, then what is an expression for COV(H(X), G(Y)) in terms of X and Y?

    The analogous expression for the variance is [itex] Var[H(X)] \approx [H'(\mu_X)]^2 Var(X) [/itex].
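    For instance, here is a quick Monte Carlo check of that variance approximation (just a sketch; the logit function and the made-up values n = 1000, p = 0.3 are for illustration):

[code]
import numpy as np

rng = np.random.default_rng(1)
n, p = 1000, 0.3  # made-up binomial parameters

# p_hat = X/n is the sample proportion; Var(p_hat) = p(1-p)/n
x = rng.binomial(n, p, size=200000)
p_hat = x / n

# H is the logit, H(p) = log(p/(1-p)), with H'(p) = 1/(p(1-p))
theta = np.log(p_hat / (1 - p_hat))

var_mc = theta.var()
var_delta = (1.0 / (p * (1 - p)))**2 * p * (1 - p) / n  # [H'(p)]^2 Var(p_hat)

print(f"Monte Carlo Var: {var_mc:.6f}, delta-method Var: {var_delta:.6f}")
[/code]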

    Appreciate any help
    Thank you :-)
     
  Jun 18, 2011 #7
    Sorry, one more clarification: the p's are between 0 and 1, so we never take the log of a negative number.
     
  Jun 19, 2011 #8

    Stephen Tashi

    What you've said about the [itex] \Theta_i [/itex] still isn't consistent. Are the [itex] p_i [/itex] random variables, or are they parameters? If they are between 0 and 1, they can't be binomial random variables.

    -------------------

    Let's try to do the abstract problem: for random variables x and y and random variables defined by F = f(x) and G = g(y), give an approximation for Cov(F,G) in terms of statistics involving x and y.

    ( What's the world coming to when a person has to try to work out his own suggestions? We need someone skilled with a computer algebra program.)

    See if this looks correct:


    [tex] H(x,y) = F(x)G(y) [/tex]

    [tex] H(x,y) \approx H + \frac{\partial H}{\partial x}(x-\mu_x) + \frac{\partial H}{\partial y} (y - \mu_y) + \frac{\partial^2 H}{\partial x \partial y} (x-\mu_x)(y-\mu_y) + \frac{1}{2} \frac{\partial^2 H}{\partial x^2}(x-\mu_x)^2 + \frac{1}{2} \frac{\partial^2 H}{\partial y^2} (y-\mu_y)^2 [/tex]

    Where [itex] H [/itex] and its derivatives are evaluated at the point [itex] (\mu_x,\mu_y) [/itex].

    Take expectations with respect to the joint density of x and y.

    [tex] E(H) = H + 0 + 0 + Cov(x,y) \frac{\partial^2 H}{\partial x \partial y}+ \frac{\sigma^2_x}{2} \frac{\partial^2 H}{\partial x^2} + \frac{\sigma^2_y }{2}\frac{\partial^2H}{\partial y^2}[/tex]


    Also use the approximations:

    [tex] F(x) \approx F + (F')(x - \mu_x) + \frac{ (F'')(x - \mu_x)^2} {2} [/tex]
    [tex] G(y) \approx G + (G')(y - \mu_y) + \frac{ (G'')(y - \mu_y)^2} {2} [/tex]


    where F and its derivatives are evaluated at [itex] \mu_x [/itex] and G and its derivatives are evaluated at [itex] \mu_y [/itex].

    So

    [tex] E(F(x)) \approx F + 0 + \frac{\sigma^2_x}{2}(F'') [/tex]
    [tex] E(G(y)) \approx G + 0 + \frac{\sigma^2_y}{2}(G'') [/tex]

    [tex] Cov(F,G) = E(FG) - E(F)E(G) [/tex]
    [tex] \approx H + Cov(x,y) \frac{\partial^2 H}{\partial x \partial y}+ \frac{\sigma^2_x}{2} \frac{\partial^2 H}{\partial x^2} + \frac{\sigma^2_y }{2}\frac{\partial^2H}{\partial y^2} - (F + \frac{\sigma^2_x}{2}(F''))(G + \frac{\sigma^2_y}{2}(G'')) [/tex]

    If that's correct, the next step would be to write terms like [itex] \frac{\partial^2 H}{\partial x \partial y} [/itex] in terms of derivatives of [itex] F [/itex] and [itex] G [/itex].
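    Following through on that step (assuming the expansion above is right): at [itex] (\mu_x, \mu_y) [/itex] we have [itex] H = FG [/itex], [itex] \frac{\partial^2 H}{\partial x \partial y} = F'G' [/itex], [itex] \frac{\partial^2 H}{\partial x^2} = F''G [/itex] and [itex] \frac{\partial^2 H}{\partial y^2} = FG'' [/itex]. Substituting these and expanding the product, the [itex] FG [/itex] and single-[itex]\sigma^2[/itex] terms cancel, and unless I've made an error what is left is

    [tex] Cov(F,G) \approx F'(\mu_x) G'(\mu_y) Cov(x,y) - \frac{\sigma^2_x \sigma^2_y}{4} F''(\mu_x) G''(\mu_y) [/tex]

    Dropping the last (higher-order) term gives the first-order approximation [itex] Cov(F,G) \approx F'(\mu_x) G'(\mu_y) Cov(x,y) [/itex], the analog of the variance formula quoted earlier. For the logits in the original post, if the [itex] p_i [/itex] are interpreted as random proportions with means [itex] \mu_i [/itex], then [itex] F'(p) = \frac{1}{p(1-p)} [/itex], so this would give [itex] Cov(\Theta_1, \Theta_2) \approx \frac{Cov(p_1,p_2)}{\mu_1(1-\mu_1)\mu_2(1-\mu_2)} [/itex].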
     
  Jun 20, 2011 #9
    First, note we are dealing with odds, not probabilities. The log(odds) ranges from negative to positive infinity. When p=0.5, the odds=1 and the log odds=0.

    [tex] \frac{p_1/(1-p_1)}{p_2/(1-p_2)}= \frac{p_1/q_1}{p_2/q_2}=\frac{p_1 q_2}{p_2 q_1}[/tex]

    The last term is the odds ratio expressed as the cross products. In log form it's [tex](\ln(p_1)+\ln(q_2))-(\ln(p_2)+\ln(q_1))[/tex]

    When the odds ratio (OR) is unity (ln OR = 0), the two odds functions (logits) are independent and the covariance is therefore zero.
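    A tiny numerical illustration of the cross-product form (the proportions p1 = 0.4, p2 = 0.25 are made up):

[code]
import math

# Made-up example proportions
p1, p2 = 0.4, 0.25
q1, q2 = 1 - p1, 1 - p2

odds_ratio = (p1 * q2) / (p2 * q1)  # cross-product form
log_or = (math.log(p1) + math.log(q2)) - (math.log(p2) + math.log(q1))

print(f"OR = {odds_ratio:.4f}, ln OR = {log_or:.4f}")
assert math.isclose(math.log(odds_ratio), log_or)
[/code]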
     
    Last edited: Jun 20, 2011
  Jun 21, 2011 #10
    This link will give more detail (see section 5 re covariance). Your question regarding covariance requires some understanding of the use of odds ratios as measures of association to answer fully. You can download the full PDF from the linked page.

    http://arxiv.org/abs/1105.0852
     