Expectations on the product of two dependent random variables


Discussion Overview

The discussion centers on the expectation of the product of two dependent random variables, specifically on deriving a proof that incorporates the covariance between the variables. Participants explore the implications of dependence versus independence in expectation calculations, as well as generalizations to multiple random variables.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant seeks assistance in proving the expectation of the product of two dependent random variables, suggesting a breakdown using Bayesian conditional probability.
  • Another participant notes that for independent variables, the expectation of their product equals the product of their expectations, hinting at a more complex situation for dependent variables.
  • A participant proposes that the expectation involving dependent variables should include covariance, suggesting a relationship with the joint probability distribution function.
  • A formula for the covariance is presented, showing how it relates to the expectation of the product of the two variables.
  • There is a query regarding the generalization of the problem to three random variables, with a suggestion that a formula involving covariances might exist.
  • A later reply challenges the idea of a simple covariance-based formula for three variables, stating that higher-order moments must be included for accurate representation.

Areas of Agreement / Disagreement

Participants express differing views on the treatment of dependent versus independent variables in expectation calculations. There is no consensus on the generalization to multiple random variables, with conflicting opinions on the necessity of including higher-order moments.

Contextual Notes

Participants mention the need for conditional probability functions and joint distributions, but the discussion does not resolve the assumptions or definitions required for these concepts.

Who May Find This Useful

Individuals studying for financial risk management (FRM) exams, statisticians, and those interested in probability theory and its applications in finance and statistics may find this discussion relevant.

simonkmtse
I am studying for the FRM and there is a question on this topic. I tried to start off by following the standard expectation calculation and breaking the pdf down into a Bayesian conditional probability function, but I got stuck there. Can anyone help me find a proof for it? Many thanks.
 
If the random variables X, Y are independent then

E[X \cdot Y] = E[X] \cdot E[Y]

I sense from the tone of your question something more is involved?
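The independence identity above can be checked exactly on a toy discrete example: under independence the joint pmf factors into a product, and the double sum for E[XY] factors accordingly. The distributions below are made-up illustrative values, not from the thread.

```python
import numpy as np

# Hypothetical independent discrete variables via a product joint pmf.
px = np.array([0.2, 0.5, 0.3]); xs = np.array([-1.0, 0.0, 2.0])
py = np.array([0.4, 0.6]);      ys = np.array([1.0, 3.0])
joint = np.outer(px, py)        # independence: p(x, y) = p(x) * p(y)

# E[XY] computed from the joint pmf vs. the product of marginal means.
e_xy = (joint * np.outer(xs, ys)).sum()
e_x, e_y = (px * xs).sum(), (py * ys).sum()
print(e_xy, e_x * e_y)  # equal (up to rounding) for an independent joint pmf
```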
 
Thanks Statdad.

But I want to work out a proof of Expectation that involves two dependent variables, i.e. X and Y, such that the final expression would involve the E(X), E(Y) and Cov(X,Y).

I suspect it has to do with the joint probability distribution function, and somehow I need to separate this function into a composite one that involves two single-variate probability distributions and a term involving the correlation coefficient.

I just can't get beyond that step.
 
Sorry - I'm not sure how I did it, but when I first read your message I apparently saw, or thought I saw, a reference to independence.
 
No need to look at conditional PDFs. We have that:

Cov[X,Y] = E[(X - E[X]) \cdot (Y - E[Y])]
= E[X \cdot Y] - E[X \cdot E[Y]] - E[E[X] \cdot Y] + E[E[X] \cdot E[Y]]
= E[X \cdot Y] - E[X] \cdot E[Y]

Thus,

E[X \cdot Y] = Cov[X,Y] + E[X] \cdot E[Y]

Cheers,

-Emanuel
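The identity E[XY] = Cov[X,Y] + E[X]E[Y] holds exactly for the empirical distribution of any sample, as long as the covariance is computed with the same 1/n averaging as the means. A quick numerical sketch (the dependent pair below is an arbitrary example, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
# Hypothetical dependent pair: Y is built from X plus independent noise.
x = rng.normal(0.0, 1.0, n)
y = 2.0 * x + rng.normal(0.0, 1.0, n)

def cov(a, b):
    # Population-style covariance (divide by n), matching the identity exactly.
    return ((a - a.mean()) * (b - b.mean())).mean()

lhs = (x * y).mean()
rhs = cov(x, y) + x.mean() * y.mean()
print(abs(lhs - rhs))  # zero up to floating-point rounding
```

Note that `np.cov` defaults to the 1/(n-1) sample estimator, which would leave a small O(1/n) discrepancy; the hand-rolled `cov` above avoids that.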
 
Is anybody familiar with how this problem generalizes to multiple random variables? As a steppingstone, is there a formula for three random variables X, Y, and Z such that:

E[XYZ] = E[X] * E[Y] * E[Z] + [term involving covariances]

Thanks for your help!
 
commish said:
Is anybody familiar with how this problem generalizes to multiple random variables? As a steppingstone, is there a formula for three random variables X, Y, and Z such that:

E[XYZ] = E[X] * E[Y] * E[Z] + [term involving covariances]

Thanks for your help!

There is no such formula involving only covariances; you have to include higher-order moments such as E[(X - E[X]) \cdot (Y - E[Y]) \cdot (Z - E[Z])] in the three-variable case.
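Expanding the third central moment term by term gives the exact three-variable identity E[XYZ] = E[X]E[Y]E[Z] + E[X]Cov(Y,Z) + E[Y]Cov(X,Z) + E[Z]Cov(X,Y) + E[(X-E[X])(Y-E[Y])(Z-E[Z])], which illustrates the point: the leftover term is a genuine third-order moment, not expressible via covariances alone. A numerical check over an arbitrary dependent triple (the construction below is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Hypothetical dependent triple: Y depends on X, and Z depends on both.
x = rng.normal(1.0, 1.0, n)
y = 0.5 * x + rng.normal(2.0, 1.0, n)
z = x * y + rng.normal(0.0, 1.0, n)

ex, ey, ez = x.mean(), y.mean(), z.mean()

def cov(a, b):
    # Population-style covariance (divide by n) so the identity is exact.
    return ((a - a.mean()) * (b - b.mean())).mean()

third = ((x - ex) * (y - ey) * (z - ez)).mean()  # third central cross-moment
rhs = (ex * ey * ez
       + ex * cov(y, z) + ey * cov(x, z) + ez * cov(x, y)
       + third)
lhs = (x * y * z).mean()
print(abs(lhs - rhs))  # zero up to floating-point rounding
```

The identity is an algebraic one over the empirical distribution, so the two sides agree to floating-point precision for any sample, however dependent the variables are.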
 