Expectations on the product of two dependent random variables

In summary, the thread asks how to compute the expectation of the product of two dependent random variables, and shows that the answer involves the joint probability distribution and the covariance: E[XY] = E[X]E[Y] + Cov(X,Y). The thread also touches on the generalization to three or more random variables, which requires higher-order moments.
  • #1
simonkmtse
I am studying for the FRM and there is a question concerning the captioned topic. I tried to start from the standard expectation calculation and break the joint pdf down using the conditional (Bayes) probability formula, but I got stuck there. Can anyone help me find a proof of it? Many thanks.
 
  • #2
If the random variables [tex] X, Y [/tex] are independent then

[tex]
E[X \cdot Y] = E[X] \cdot E[Y]
[/tex]

I sense from the tone of your question that something more is involved?
 
  • #3
Thanks Statdad.

But I want to work out a proof of the expectation involving two dependent variables, X and Y, such that the final expression involves E(X), E(Y) and Cov(X,Y).

I suspect it has to do with the joint probability distribution function, and that somehow I need to separate this function into a composite one involving two single-variate probability distributions and the correlation coefficient.

I just can't get beyond that step.
 
  • #4
Sorry - I'm not sure how I did it, but when I first read your message I apparently saw, or thought I saw, a reference to independence.
 
  • #5
No need to look at conditional PDFs. We have that:

[tex]
Cov[X,Y] = E[(X-E[X]) \cdot (Y-E[Y])]
[/tex]
[tex]
= E[X \cdot Y] - E[X \cdot E[Y]] - E[E[X] \cdot Y] + E[E[X] \cdot E[Y]]
[/tex]
[tex]
= E[X \cdot Y] - E[X] \cdot E[Y]
[/tex]

Thus,

[tex]E[X \cdot Y] = Cov[X,Y] + E[X] \cdot E[Y] [/tex]

Cheers,

-Emanuel
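As a quick numerical sanity check of this identity, here is a sketch (not from the thread) that assumes NumPy and an arbitrary construction of two dependent variables:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Build two dependent variables: Y shares a component with X,
# so Cov(X, Y) is roughly 0.6 by construction.
x = rng.normal(size=n)
y = 0.6 * x + rng.normal(size=n)

lhs = np.mean(x * y)                             # E[XY] estimated from samples
rhs = np.cov(x, y)[0, 1] + x.mean() * y.mean()   # Cov(X,Y) + E[X]E[Y]

print(lhs, rhs)  # the two estimates agree up to sampling/float noise
```

The two printed values differ only by the ddof convention of `np.cov` and floating-point noise, which is negligible at this sample size.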
 
  • #6
Is anybody familiar with how this problem generalizes to multiple random variables? As a steppingstone, is there a formula for three random variables X, Y, and Z such that:

E[XYZ] = E[X] * E[Y] * E[Z] + [term involving covariances]

Thanks for your help!
 
  • #7
commish said:
Is anybody familiar with how this problem generalizes to multiple random variables? As a steppingstone, is there a formula for three random variables X, Y, and Z such that:

E[XYZ] = E[X] * E[Y] * E[Z] + [term involving covariances]

Thanks for your help!

There is no such formula involving just covariances; you have to include higher-order moments such as [itex]E[(X-E[X]) \cdot (Y-E[Y]) \cdot (Z-E[Z])] [/itex] for the 3-variable case.
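For three variables, the full expansion is E[XYZ] = E[X]E[Y]E[Z] + E[X]Cov(Y,Z) + E[Y]Cov(X,Z) + E[Z]Cov(X,Y) + E[(X-E[X])(Y-E[Y])(Z-E[Z])], which shows exactly where the third central cross-moment enters. A sketch verifying this identity on the empirical distribution of simulated samples (the construction of X, Y, Z below is arbitrary, not from the thread; all moments use 1/n weights, under which the identity holds exactly):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Three dependent variables with nonzero means, coupled through u.
u = rng.normal(size=n)
x = 1.0 + u + rng.normal(size=n)
y = 2.0 + u + rng.normal(size=n)
z = 0.5 + u + rng.normal(size=n)

ex, ey, ez = x.mean(), y.mean(), z.mean()

def cov(a, b):
    # Covariance with 1/n weights, to match the empirical-distribution identity.
    return np.mean((a - a.mean()) * (b - b.mean()))

third = np.mean((x - ex) * (y - ey) * (z - ez))  # third central cross-moment

lhs = np.mean(x * y * z)
rhs = (ex * ey * ez
       + ex * cov(y, z) + ey * cov(x, z) + ez * cov(x, y)
       + third)
print(lhs, rhs)  # identical up to floating-point rounding
```

Because every moment here is computed with the same 1/n convention over the same samples, lhs and rhs agree to floating-point precision rather than merely up to sampling error.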
 

1. What are dependent random variables?

Dependent random variables are variables that are related to each other in some way, meaning the outcome of one variable can affect the outcome of the other. This relationship can be positive, negative, or nonlinear.

2. How do you calculate the expected value of two dependent random variables?

The expected value of the product of two dependent random variables is computed from their joint probability distribution: E[XY] is the sum (or integral, in the continuous case) of x·y weighted by the joint probability of (x, y). Equivalently, E[XY] = E[X]·E[Y] + Cov(X, Y), so the covariance is exactly the correction relative to the independent case.
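A concrete discrete example, using a small hypothetical joint pmf (the numbers are illustrative, not from the thread):

```python
# E[XY] from a hypothetical joint pmf over (x, y) pairs.
joint = {                      # P(X=x, Y=y); probabilities sum to 1
    (0, 0): 0.3, (0, 1): 0.2,
    (1, 0): 0.1, (1, 1): 0.4,
}

e_xy = sum(x * y * p for (x, y), p in joint.items())   # E[XY]  = 0.4
e_x  = sum(x * p for (x, y), p in joint.items())       # E[X]   = 0.5
e_y  = sum(y * p for (x, y), p in joint.items())       # E[Y]  ≈ 0.6
cov  = e_xy - e_x * e_y                                # Cov   ≈ 0.1
print(e_xy, e_x, e_y, cov)
```

Since Cov(X, Y) ≈ 0.1 ≠ 0 here, X and Y are dependent, and E[XY] differs from E[X]·E[Y] by exactly that covariance.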

3. Can two independent random variables have a product that is dependent?

No. By definition, the joint distribution of independent random variables factors into the product of their marginals, so E[XY] = E[X]·E[Y] and Cov(X, Y) = 0. Note that the converse does not hold: two variables can be uncorrelated (Cov(X, Y) = 0) without being independent.

4. How does the correlation between two dependent random variables affect their product?

If Cov(X, Y) > 0, then E[XY] exceeds E[X]·E[Y] by exactly Cov(X, Y); if Cov(X, Y) < 0, it falls short by the same amount. Strong positive dependence also means that large values of X and Y tend to occur together, which makes extreme values of the product more likely.

5. Are there any assumptions that need to be made when calculating the expected value of two dependent random variables?

The only assumption needed is that the relevant expectations exist, i.e. that E[X], E[Y], and E[XY] are finite (finite second moments suffice). No normality or linearity assumption is required: the identity E[XY] = E[X]·E[Y] + Cov(X, Y) holds for any pair of random variables with finite second moments.
