Expected value of the product of three dependent random variables

Hi,

I want to derive an expression for the expected value of the product of three (potentially) dependent random variables. In a separate thread, winterfors provided the manipulation at the bottom to arrive at such an expression for two RVs.

Does anybody have any guidance on how I can take this a step further and establish an analogous expression for the product of THREE dependent RVs? In other words, given three RVs called X, Y, and Z, I want to derive:

E[X \cdot Y \cdot Z] = ...


Thanks!


winterfors said:
No need to look at conditional PDFs. We have that:

<br /> Cov[X,Y] = E[(X-E[X])\cdot(Y-E[Y])]
<br /> = E[X \cdot Y] - E[X \cdot E[Y]] - E[E[X] \cdot Y] + E[E[X] \cdot E[Y]]
= E[X \cdot Y] - E[X] \cdot E[Y]

Thus,

E[X \cdot Y] = Cov[X,Y] + E[X] \cdot E[Y]
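
As a quick numerical sanity check of this identity (a sketch only; NumPy plus an arbitrary pair of dependent samples, nothing specific to the problem at hand):

```python
# Monte Carlo check of E[XY] = Cov[X,Y] + E[X]E[Y]
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two dependent variables: Y is built from X plus independent noise (arbitrary choice)
x = rng.gamma(shape=2.0, scale=1.5, size=n)   # deliberately non-normal X
y = 0.7 * x + rng.normal(size=n)              # Y correlated with X

lhs = np.mean(x * y)                                           # estimate of E[XY]
rhs = np.cov(x, y, bias=True)[0, 1] + x.mean() * y.mean()      # Cov[X,Y] + E[X]E[Y]
print(lhs, rhs)   # identical up to floating-point noise, since the identity is exact
```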
 
Expand E[V \cdot Z]. Then substitute X \cdot Y for V.
 
I'm not sure if I interpreted you correctly. First, I expanded the original equation:

<br /> E(XV) &amp;=&amp; COV(X, V) + E(X) \cdot E(V) \\<br />

to get

<br /> E(XV) &amp;=&amp; E[(X-E(X)) \cdot (V-E(V))] + E(X) \cdot E(V) \\<br />.

From here, I substituted YZ for V:

<br /> E(XYZ)&amp;=&amp; E[(X-E(X)) \cdot (YZ-E(YZ))] + E(X) \cdot E(YZ) \\<br />

<br /> E(XYZ) &amp;=&amp; E[XYZ- X E(YZ) - E(X) YZ + E(X) E(YZ)]+ E(X) E(YZ) <br />,

which then reduces to:

<br /> E(XYZ) &amp;=&amp; E(XYZ) - (2 \cdot E(X) E(YZ)) + (2 \cdot E(X)E(YZ))<br />.

And thus,

<br /> E(XYZ) &amp;=&amp; E(XYZ)<br />.

This makes sense, but it gets me no closer to deriving an expression for E(XYZ) without having an E(XYZ) term on the RHS.

Any help would be appreciated. Thanks!
 
When v = y z, E[x v] = C[x, v] + E[x] E[v] = C[x, yz] + E[x] E[yz] = C[x, yz] + E[x] (C[y, z] + E[y] E[z]) (the last equality follows from E[y z] = C[y, z] + E[y] E[z]).

I can write C[x, yz] as C[1 \cdot x, y \cdot z]. In the following formula, make the substitutions x = 1, y = x, u = y, v = z:

C[xy, uv] = Ex Eu C[y, v] + Ex Ev C[y, u] + Ey Eu C[x, v] + Ey Ev C[x, u]
+ E[(Dx)(Dy)(Du)(Dv)] + Ex E[(Dy)(Du)(Dv)] + Ey E[(Dx)(Du)(Dv)]
+ Eu E[(Dx)(Dy)(Dv)] + Ev E[(Dx)(Dy)(Du)] - C[x, y] C[u, v]

where Dx = x - Ex, etc., and Ex is a shorthand for E[x].
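
Carrying that substitution through (with x = 1, every Dx term and every C[1, \cdot] term vanishes), the covariance collapses to C[x, yz] = E[y] \cdot C[x, z] + E[z] \cdot C[x, y] + E[(Dx)(Dy)(Dz)]. Combining this with E[xyz] = C[x, yz] + E[x] \cdot E[yz] from above gives, unless I have slipped somewhere, the expression the thread is after:

E[XYZ] = E[X] \cdot E[Y] \cdot E[Z] + E[X] \cdot C[Y,Z] + E[Y] \cdot C[X,Z] + E[Z] \cdot C[X,Y]
+ E[(X - E[X])(Y - E[Y])(Z - E[Z])]

A minimal Monte Carlo check of that expression (NumPy, arbitrary dependent samples; a sketch, not code from any of the papers below):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2_000_000

# Three mutually dependent, non-normal variables sharing a common factor w (arbitrary construction)
w = rng.exponential(size=n)
x = w + rng.normal(size=n)
y = 0.5 * w + rng.uniform(size=n)
z = w * rng.gamma(2.0, size=n)

def cov(a, b):
    # biased (1/N) sample covariance, consistent with the 1/N means used below
    return np.mean(a * b) - a.mean() * b.mean()

lhs = np.mean(x * y * z)                       # direct estimate of E[XYZ]
rhs = (x.mean() * y.mean() * z.mean()
       + x.mean() * cov(y, z) + y.mean() * cov(x, z) + z.mean() * cov(x, y)
       + np.mean((x - x.mean()) * (y - y.mean()) * (z - z.mean())))
print(lhs, rhs)   # identical up to floating-point noise: the identity is exact, not asymptotic
```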

See:

Bohrnstedt, G. W., and A. S. Goldberger. 1969. On the exact covariance of products of random variables. Journal of the American Statistical Association 64: 1439–1442.

Gray, G. 1999. Covariances in multiplicative estimates. Transactions of the American Fisheries Society 128: 475–482.

Goodman, L. A. 1960. On the exact variance of products. Journal of the American Statistical Association 55: 708–713.
 
I appreciate the information given in the previous responses. I am looking at the analysis of turbulent flow data, and this topic is extremely relevant. I have read and worked through the references posted. My main concern is that, in order to derive the results of Bohrnstedt and Goldberger, you seem to have to make the same assumptions for pairwise independent random variables as for normally distributed variables.
For random variables with zero expected value, the result given in Bohrnstedt and Goldberger seems to imply that E[(Dx)(Dy)(Du)(Dv)] can be factorized even when the random variables are not normally distributed. The same result, in a somewhat different form, is quoted in Gerry Gray's paper. It would be very helpful to understand how to represent the covariance of products of four random variables when only some of them are independent and they are not normally distributed. Any references or textbooks would be helpful. Thanks.
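
To make the concern concrete: for jointly normal variables the fourth central moment factorizes by Isserlis' theorem as E[(Dx)(Dy)(Du)(Dv)] = C[x,y] C[u,v] + C[x,u] C[y,v] + C[x,v] C[y,u], but outside the normal case the two sides can differ, and a quick simulation makes the gap visible. The toy samples below are an arbitrary construction for illustration only, not flow data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2_000_000

# Four dependent, deliberately non-normal variables driven by a common factor s (arbitrary toy model)
s = rng.exponential(size=n)
x = s + rng.normal(size=n)
y = s ** 2
u = 0.3 * s + rng.uniform(size=n)
v = s + s * rng.normal(size=n)

def c(a, b):
    # biased (1/N) sample covariance
    return np.mean(a * b) - a.mean() * b.mean()

dx, dy, du, dv = (t - t.mean() for t in (x, y, u, v))
fourth_moment = np.mean(dx * dy * du * dv)                       # E[(Dx)(Dy)(Du)(Dv)]
normal_theory = c(x, y) * c(u, v) + c(x, u) * c(y, v) + c(x, v) * c(y, u)
print(fourth_moment, normal_theory)   # these coincide for jointly normal data; here they do not
```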
 
Namaste & G'day Postulate: A strongly-knit team wins on average over a less knit one Fundamentals: - Two teams face off with 4 players each - A polo team consists of players that each have assigned to them a measure of their ability (called a "Handicap" - 10 is highest, -2 lowest) I attempted to measure close-knitness of a team in terms of standard deviation (SD) of handicaps of the players. Failure: It turns out that, more often than, a team with a higher SD wins. In my language, that...
Hi all, I've been a roulette player for more than 10 years (although I took time off here and there) and it's only now that I'm trying to understand the physics of the game. Basically my strategy in roulette is to divide the wheel roughly into two halves (let's call them A and B). My theory is that in roulette there will invariably be variance. In other words, if A comes up 5 times in a row, B will be due to come up soon. However I have been proven wrong many times, and I have seen some...
Back
Top