Expected value for the product of three dependent RVs

SUMMARY

The discussion focuses on deriving the expected value for the product of three dependent random variables (RVs), specifically denoted as E[X · Y · Z]. The user seeks guidance on extending existing formulas for two RVs to three, referencing the covariance formula provided by winterfors. The conversation highlights the complexity of establishing a definitive expression for E[X · Y · Z] without relying on the term itself on the right-hand side of the equation. Key references include works by Bohrnstedt and Goldberger, and Gerry Gray, which discuss covariance in products of random variables.

PREREQUISITES
  • Understanding of covariance and its mathematical representation
  • Familiarity with expected value calculations for random variables
  • Knowledge of statistical properties of dependent random variables
  • Basic proficiency in statistical literature and research methods
NEXT STEPS
  • Study the derivation of E[X · Y · Z] using covariance formulas
  • Explore the implications of Bohrnstedt and Goldberger's work on covariance
  • Investigate the treatment of non-normally distributed random variables in statistical analysis
  • Review advanced statistical texts on the covariance of products of random variables
USEFUL FOR

Statisticians, data analysts, and researchers working with dependent random variables, particularly in fields such as economics, engineering, and environmental science, will benefit from this discussion.

commish
Hi,

I want to derive an expression to compute the expected value of the product of three (potentially) dependent RVs. In a separate thread, winterfors provided the manipulation quoted at the bottom to arrive at such an expression for two RVs.

Does anybody have any guidance on how I can take this a step further and establish an analogous expression for the product of THREE dependent RVs? In other words, given three RVs called X, Y, and Z, I want to derive:

E[X \cdot Y \cdot Z] = ...


Thanks!


winterfors said:
No need to look at conditional PDFs. We have that:

Cov[X,Y] = E[(X-E[X]) \cdot (Y-E[Y])]
= E[X \cdot Y] - E[X \cdot E[Y]] - E[E[X] \cdot Y] + E[E[X] \cdot E[Y]]
= E[X \cdot Y] - E[X] \cdot E[Y]

Thus,

E[X \cdot Y] = Cov[X,Y] + E[X] \cdot E[Y]
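
As a quick numerical sanity check of that identity (a minimal NumPy sketch; the dependent pair constructed below is an arbitrary illustration, not anything from the thread):

Code:
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two dependent, non-normal random variables built from a common source.
u = rng.standard_normal(n)
x = np.exp(u)                          # lognormal
y = u**2 + rng.standard_normal(n)      # correlated with x through u

def cov(a, b):
    # Sample covariance with 1/n normalization, so the identity holds exactly
    # for the empirical distribution.
    return np.mean((a - a.mean()) * (b - b.mean()))

print(np.mean(x * y))                  # E[XY] estimated directly
print(cov(x, y) + x.mean() * y.mean())  # Cov[X,Y] + E[X]E[Y]; matches to round-off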
 
Expand E[V.Z]. Then substitute X.Y for V.
 
I'm not sure if I interpreted you correctly. First, I expanded the original equation:

E(XV) = Cov(X, V) + E(X) \cdot E(V)

to get

E(XV) = E[(X-E(X)) \cdot (V-E(V))] + E(X) \cdot E(V).

From here, I substituted YZ for V:

E(XYZ) = E[(X-E(X)) \cdot (YZ-E(YZ))] + E(X) \cdot E(YZ)

E(XYZ) = E[XYZ - X E(YZ) - E(X) YZ + E(X) E(YZ)] + E(X) E(YZ),

which then reduces to:

E(XYZ) = E(XYZ) - (2 \cdot E(X) E(YZ)) + (2 \cdot E(X) E(YZ)).

And thus,

E(XYZ) = E(XYZ).

This makes sense, but it gets me no closer to deriving an expression for E(XYZ) without having an E(XYZ) term on the RHS.

Any help would be appreciated. Thanks!
 
When v = y z, E[x v] = C[x, v] + E[x] E[v] = C[x, y z] + E[x] E[y z] = C[x, y z] + E[x] (C[y, z] + E[y] E[z]), where the last equality follows from E[y z] = C[y, z] + E[y] E[z].

I can write C[x, y z] as C[1 x, y z]. In the following formula make the substitutions x = 1, y = x, u = y, v = z:

C[xy, uv] = Ex Eu C[y, v] + Ex Ev C[y, u] + Ey Eu C[x, v] + Ey Ev C[x, u]
+ E[(Dx)(Dy)(Du)(Dv)] + Ex E[(Dy)(Du)(Dv)] + Ey E[(Dx)(Du)(Dv)]
+ Eu E[(Dx)(Dy)(Dv)] + Ev E[(Dx)(Dy)(Du)] - C[x, y]C[u, v]

where Dx = x - Ex, etc., and Ex is a shorthand for E[x].
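
Carrying that substitution through, every term containing D1 = 1 - E[1] = 0 or C[1, \cdot] = 0 drops out, so (unless I have slipped a sign) the formula reduces to

C[x, yz] = E[y] \cdot C[x, z] + E[z] \cdot C[x, y] + E[(Dx)(Dy)(Dz)]

and, combining this with E[yz] = C[y, z] + E[y] E[z],

E[x \cdot y \cdot z] = E[x] E[y] E[z] + E[x] \cdot C[y, z] + E[y] \cdot C[x, z] + E[z] \cdot C[x, y] + E[(Dx)(Dy)(Dz)].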

See:

Bohrnstedt, G. W., and A. S. Goldberger. 1969. On the exact covariance of products of random variables. Journal of the American Statistical Association 64: 1439–1442.

Gray, G. 1999. Covariances in multiplicative estimates. Transactions of the American Fisheries Society 128: 475–482.

Goodman, L. A. 1960. On the exact variance of products. Journal of the American Statistical Association 55: 708–713.
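
A numerical check of the resulting three-variable expansion (a minimal NumPy sketch; the dependent triple below is an arbitrary construction, and since the expansion is an algebraic moment identity the two numbers should agree to floating-point round-off):

Code:
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Three mutually dependent, non-normal variables sharing common sources.
u = rng.standard_normal(n)
w = rng.exponential(size=n)
x = u + 0.5 * w
y = u * w
z = np.exp(0.3 * u) + w

def cov(a, b):
    # Sample covariance with 1/n normalization.
    return np.mean((a - a.mean()) * (b - b.mean()))

direct = np.mean(x * y * z)
expansion = (x.mean() * y.mean() * z.mean()
             + x.mean() * cov(y, z)
             + y.mean() * cov(x, z)
             + z.mean() * cov(x, y)
             + np.mean((x - x.mean()) * (y - y.mean()) * (z - z.mean())))

print(direct, expansion)   # identical up to floating-point error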
 
I appreciate the information given in the previous responses. I am looking at the analysis of turbulent flow data, so this topic is extremely relevant, and I have read and worked through the references posted. My main concern is that, in order to derive the results of Bohrnstedt and Goldberger, you seem to have to make the same assumptions for pairwise independent random variables as for normally distributed variables.

For random variables with zero expected value, the result given in Bohrnstedt and Goldberger seems to imply that E[(Dx)(Dy)(Du)(Dv)] can be factorized even if the random variables are not normally distributed. The same result, in a somewhat different form, is quoted in Gerry Gray's paper. It would be very helpful to understand how to represent the covariance of products of four random variables when only some of them are independent and they are not normally distributed. Any references or textbooks would be helpful. Thanks.
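
In case it helps frame the question: the factorization that holds for zero-mean jointly normal variables is E[(Dx)(Dy)(Du)(Dv)] = C[x,y] C[u,v] + C[x,u] C[y,v] + C[x,v] C[y,u] (Isserlis' theorem), and it generally breaks down once the variables are not normal. A minimal NumPy sketch with a skewed, lognormal-style example (an arbitrary construction, just to show the gap):

Code:
import numpy as np

rng = np.random.default_rng(2)
n = 2_000_000

# Four dependent, skewed (non-normal) variables sharing a common factor s.
s = rng.standard_normal(n)
x = np.exp(s + 0.5 * rng.standard_normal(n))
y = np.exp(s + 0.5 * rng.standard_normal(n))
u = np.exp(-s + 0.5 * rng.standard_normal(n))
v = np.exp(0.5 * s + 0.5 * rng.standard_normal(n))

def d(a):                       # deviation from the mean, Da = a - E[a]
    return a - a.mean()

def cov(a, b):
    return np.mean(d(a) * d(b))

fourth_moment = np.mean(d(x) * d(y) * d(u) * d(v))
normal_theory = (cov(x, y) * cov(u, v)
                 + cov(x, u) * cov(y, v)
                 + cov(x, v) * cov(y, u))

print(fourth_moment, normal_theory)   # noticeably different here: the variables are far from normal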
 
