
Help me in conditional expectation

  1. Jan 21, 2008 #1

    Hi all,
    I read an article a couple of days ago, and there are some equations in it that I could not understand.

    Let's assume that y = u + v (with u and v independent),

    where u is normally distributed with mean = 0 and variance = s -> u ~ N (0, s)

    and v is normally distributed with mean = 0 and variance = t -> v ~ N (0, t)

    The author then writes:
    E (v_{i}|y_{i})= (t *y_{i})/(s+t)

    I tried to find how he derived this conditional expectation.

    E(v_{i}|y_{i}) = Integral of x * p(v_{i} = x | y_{i}) dx, where p is the conditional density.

    Then I calculated p(v_{i}|y_{i}) using Bayes' rule.

    However, I could not get the same answer as the author states in the article, E(v_{i}|y_{i}) = (t * y_{i})/(s+t).
    Could someone please help me with this, or show me how to arrive at the author's result?

    Thank you..:)

    ps : 1. v_{i} means v subscript i.
    2. I tried to write the math symbols using LaTeX, but the result did not look good, so I wrote the question in this style. (Sorry about that.)
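A symbolic sketch of the Bayes'-rule route (this assumes u and v are independent, which the replies below also rely on, and uses SymPy just to do the algebra):

```python
import sympy as sp

v, y, s, t = sp.symbols('v y s t', positive=True)

# Bayes' rule: p(v | y) is proportional to p(y | v) * p(v).
# Given v, y = u + v is N(v, s) since u ~ N(0, s); the prior is v ~ N(0, t).
log_posterior = -(y - v)**2 / (2*s) - v**2 / (2*t)

# The posterior is Gaussian, so its mean equals its mode: set the
# derivative of the log posterior to zero and solve for v.
mean = sp.solve(sp.Eq(sp.diff(log_posterior, v), 0), v)[0]
print(sp.simplify(mean))   # t*y/(s + t)
```

Completing the square in the log posterior gives the same answer; solving for the mode is just the shortest path to it.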
  3. Jan 24, 2008 #2



    The most direct answer I can come up with is the following.

    E[y|y] = E[v|y] + E[u|y]

    Clearly E[y|y] = y so y = E[v|y] + E[u|y].

    I hypothesize that each of E[v|y] and E[u|y] is a linear function of y: E[v|y] = ay and E[u|y] = (1-a)y for some a in [0,1].

    The question now becomes, why should anyone expect a = t/(s+t)?

    If I were estimating a from a sample using least squares, t/(s+t) is exactly the formula I would get. To see this, regress v on y alone (u is not separately observable once we condition on y) and let e be the least squares slope. Then e = Cov[v, y]/Var[y]. I know Var[y] = s + t, so I still need Cov[v, y].

    To derive it, write y as v + u and use the covariance formula for sums of random variables (covariance behaves much like an inner product): Cov[v, y] = Cov[v, v + u] = Cov[v, v] + Cov[v, u] = Var[v] + 0 = t, using the independence of u and v.

    Therefore e = t/(s+t). Since least squares is unbiased, E[e] = a. With s and t known constants (as in the problem), this gives a = t/(s+t), and so E[v|y] = ay = t/(s+t) * y.

    The final question is, why does this make sense? I have two answers:

    1. The least squares estimator is the most efficient estimator in the class of linear unbiased estimators (the Gauss–Markov theorem). Any other estimator in that class would waste some of the information that least squares uses.

    2. t/(s+t) is the part of total variance in y attributable to v. Given any y, the expected value of v is the part of y that v is (on average) responsible for, which is t/(s+t) times y.
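A quick Monte Carlo check of the least-squares slope described above (my own sketch; the values s = 2 and t = 3 are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
s, t, n = 2.0, 3.0, 200_000          # Var[u], Var[v], sample size

u = rng.normal(0.0, np.sqrt(s), n)   # u ~ N(0, s)
v = rng.normal(0.0, np.sqrt(t), n)   # v ~ N(0, t)
y = u + v

# Least squares slope of v on y: Cov[v, y] / Var[y]
e = np.cov(v, y)[0, 1] / np.var(y, ddof=1)
print(e, t / (s + t))                # both close to 0.6
```

With these variances the fitted slope lands very close to t/(s+t) = 0.6, matching the author's formula.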
  4. Jan 25, 2008 #3



    I think the easiest method is to use the fact that the covariance of y and sv - tu is zero, which implies that they are independent (by a general result for jointly normal rvs). Write v = (sv - tu)/(s+t) + ty/(s+t); conditioning on y, the first term contributes zero (it is independent of y and has mean zero), so E[v|y] = ty/(s+t).
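A numerical sketch of this decomposition (my own check, with arbitrary s = 2, t = 3; the identity itself is exact):

```python
import numpy as np

rng = np.random.default_rng(1)
s, t, n = 2.0, 3.0, 500_000
u = rng.normal(0.0, np.sqrt(s), n)   # u ~ N(0, s)
v = rng.normal(0.0, np.sqrt(t), n)   # v ~ N(0, t)
y = u + v

w = s*v - t*u                        # the combination uncorrelated with y
print(np.cov(y, w)[0, 1])            # near 0: Cov[y, sv - tu] = st - ts = 0

# The algebraic identity v = (sv - tu)/(s+t) + ty/(s+t) holds exactly:
print(np.allclose(v, w/(s + t) + t*y/(s + t)))   # True
```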