# Expectations on the product of two dependent random variables

1. Dec 1, 2008

### simonkmtse

I am studying for the FRM and there is a question concerning the captioned topic. I tried to start off by following the standard expectation calculation and breaking the pdf down using the conditional probability (Bayes) rule, but I got stuck there. Can anyone help me find a proof of it? Many thanks.

2. Dec 4, 2008

If the random variables $$X, Y$$ are independent then

$$E[X \cdot Y] = E[X] \cdot E[Y]$$

I sense from the tone of your question something more is involved?

3. Dec 4, 2008

### simonkmtse

But I want to work out a proof of the expectation of a product of two dependent variables, i.e. X and Y, such that the final expression involves E(X), E(Y) and Cov(X,Y).

I suspect it has to do with the joint probability distribution function, and that somehow I need to separate this function into a composite of two single-variate probability distributions and a term involving the correlation coefficient.

I just can't get beyond that step.

4. Dec 5, 2008

Sorry - I'm not sure how I did it, but when I first read your message I apparently saw, or thought I saw, a reference to independence.

5. Dec 5, 2008

### winterfors

No need to look at conditional PDFs. We have that:

$$Cov[X,Y] = E[(X-E[X])\cdot(Y-E[Y])]$$
$$= E[X \cdot Y] - E[X \cdot E[Y]] - E[E[X] \cdot Y] + E[E[X] \cdot E[Y]]$$
$$= E[X \cdot Y] - E[X] \cdot E[Y]$$

Thus,

$$E[X \cdot Y] = Cov[X,Y] + E[X] \cdot E[Y]$$

Cheers,

-Emanuel
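
The identity above can be sanity-checked numerically. This is a minimal sketch (not from the thread) using NumPy, with an arbitrary pair of dependent variables constructed as Y = X + noise:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Two dependent random variables: Y depends on X by construction
x = rng.normal(loc=2.0, size=n)
y = x + rng.normal(size=n)

# Left side: E[X*Y] estimated from the sample
lhs = np.mean(x * y)

# Right side: Cov[X,Y] + E[X]*E[Y], using the population (ddof=0) covariance
rhs = np.cov(x, y, ddof=0)[0, 1] + x.mean() * y.mean()

# The identity holds exactly for the empirical distribution,
# so the two sides agree up to floating-point error
print(abs(lhs - rhs) < 1e-8)
```

Note that `ddof=0` is used so `np.cov` computes the empirical covariance, for which the identity is exact rather than approximate.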

6. Dec 10, 2009

### commish

Is anybody familiar with how this problem generalizes to multiple random variables? As a stepping stone, is there a formula for three random variables X, Y, and Z such that:

$$E[X \cdot Y \cdot Z] = E[X] \cdot E[Y] \cdot E[Z] + \text{[terms involving covariances]}$$

There is no such formula involving only pairwise covariances; a higher-order moment such as $$E[(X-E[X]) \cdot (Y-E[Y]) \cdot (Z-E[Z])]$$ enters in the 3-variable case. Expanding that third central moment and solving for $$E[X \cdot Y \cdot Z]$$ gives

$$E[X \cdot Y \cdot Z] = E[X] \cdot E[Y] \cdot E[Z] + E[X] \cdot Cov[Y,Z] + E[Y] \cdot Cov[X,Z] + E[Z] \cdot Cov[X,Y] + E[(X-E[X]) \cdot (Y-E[Y]) \cdot (Z-E[Z])]$$
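
As with the two-variable case, the three-variable identity can be checked numerically. This sketch (not from the thread) uses an arbitrary construction in which all three variables depend on each other:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Three mutually dependent variables (hypothetical construction)
x = rng.normal(loc=1.0, size=n)
y = x + rng.normal(size=n)
z = x * y + rng.normal(size=n)

mx, my, mz = x.mean(), y.mean(), z.mean()

def cov(a, b):
    """Empirical (population) covariance of two samples."""
    return np.mean((a - a.mean()) * (b - b.mean()))

lhs = np.mean(x * y * z)
rhs = (mx * my * mz
       + mx * cov(y, z)
       + my * cov(x, z)
       + mz * cov(x, y)
       + np.mean((x - mx) * (y - my) * (z - mz)))  # third central moment

# The identity is exact for the empirical distribution
print(np.isclose(lhs, rhs))
```

Because the identity holds for any joint distribution, including the empirical one, the two sides agree up to floating-point error regardless of how the dependence is constructed.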