Joint expectation of two functions of a random variable

In summary: the random variables X and Y defined by X = cos(theta) and Y = sin(theta), where theta is a uniform random variable distributed over (0, 2pi), are uncorrelated. The joint pdf {f_{xy}}(x,y) does not technically exist, since (X, Y) concentrates all its mass on the unit circle, but the expectation E[XY] can be written as a single integral against the pdf of theta.
  • #1
dionysian
Ok, I am not sure if I should put this question in the homework category or here, but it's a problem from Schaum's Outline. I know the solution to it, but I don't understand the solution 100%, so maybe someone can explain this to me.
Let [tex]X[/tex] and [tex]Y[/tex] be defined by:
[tex]\begin{array}{l}
X = \cos \theta \\
Y = \sin \theta \\
\end{array}[/tex]
Where [tex]\theta [/tex] is a uniform random variable distributed over [tex](0,2\pi )[/tex]
A) Show that [tex]X[/tex] and [tex]Y[/tex] are uncorrelated
Attempt at solution:
Show [tex]{\mathop{\rm cov}} (X,Y) = 0[/tex]
[tex]\begin{array}{l}
{\mathop{\rm cov}} (X,Y) = E[XY] - E[X]E[Y] \\
E[XY] = \int\limits_{ - 1}^1 {\int\limits_{ - 1}^1 {xy\,{f_{xy}}(x,y)\,dx\,dy} } \\
\end{array}[/tex]
Now my question is how do we determine the joint pdf [tex]{{f_{xy}}(x,y)}[/tex] if we only know the marginal pdf of [tex]\theta [/tex]?
In the solution to the problem it seems that they assume that
[tex]{f_{xy}}(x,y) = {f_\theta }(\theta )[/tex]
Then the integral they use becomes
[tex]E[XY] = \int\limits_0^{2\pi } {xy\,{f_\theta }(\theta )\,d\theta } [/tex]
But why is it valid to assume that
[tex]{f_{xy}}(x,y) = {f_\theta }(\theta )[/tex]
Doesn’t the joint (and the marginal) pdf change because of the functions:
[tex]\begin{array}{l}
X = \cos \theta \\
Y = \sin \theta \\
\end{array}[/tex]
If anyone knows what I am trying to ask, please give me a little help with what is going on here.
 
  • #2
dionysian said:
... Now my question is how do we determine the joint pdf [tex]{{f_{xy}}(x,y)}[/tex] if we only know the marginal pdf of [tex]\theta [/tex]?

The joint pdf doesn't technically exist, because the random variables (X,Y) have all their mass on a 1-dimensional subset of 2d space, i.e. the circle (cos(theta),sin(theta)). The joint pdf could be written in terms of Dirac delta functions or the expectation could be written as a Stieltjes integral (using the joint cdf) but the theory gets messy and for this example it's much simpler to write the expectation as

[tex]E[XY] = E[\cos (\theta )\sin (\theta )][/tex]

which can be expressed as a single integral because theta has a pdf.
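
For completeness, since [tex]\theta[/tex] is uniform its pdf is [tex]{f_\theta }(\theta ) = 1/2\pi[/tex] on [tex](0,2\pi )[/tex], and all the integrals vanish:

[tex]\begin{array}{l}
E[XY] = \frac{1}{{2\pi }}\int\limits_0^{2\pi } {\cos \theta \sin \theta \,d\theta } = \frac{1}{{4\pi }}\int\limits_0^{2\pi } {\sin 2\theta \,d\theta } = 0 \\
E[X] = \frac{1}{{2\pi }}\int\limits_0^{2\pi } {\cos \theta \,d\theta } = 0,\quad E[Y] = \frac{1}{{2\pi }}\int\limits_0^{2\pi } {\sin \theta \,d\theta } = 0 \\
\end{array}[/tex]

so [tex]{\mathop{\rm cov}} (X,Y) = E[XY] - E[X]E[Y] = 0[/tex]: X and Y are uncorrelated, even though they are clearly not independent (they always satisfy [tex]X^2 + Y^2 = 1[/tex]).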
 

1. What is the definition of joint expectation of two functions of a random variable?

The joint expectation of two functions of a random variable is the expected value of the product of the two functions, taken under the underlying probability distribution. It is denoted as E[f(X)g(Y)].

2. How is joint expectation calculated?

Joint expectation is calculated by integrating the product of the two functions against the joint probability density of the random variables. This can be represented mathematically as E[f(X)g(Y)] = ∫∫ f(x)g(y) p(x,y) dx dy, where p(x,y) is the joint probability density function of X and Y. When X and Y are both functions of a single underlying variable, as in the thread above, the double integral collapses to a single integral against that variable's pdf.
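
As a quick numerical sanity check, the integral can also be approximated by Monte Carlo sampling. Below is a minimal Python sketch for the cos/sin example from this thread (the sample size and variable names are illustrative choices, not from the original posts); both estimates should come out near zero, matching the analytic result.

[code]
import numpy as np

rng = np.random.default_rng(0)

# Draw theta uniformly on (0, 2*pi), then form X = cos(theta), Y = sin(theta)
theta = rng.uniform(0.0, 2.0 * np.pi, size=1_000_000)
x = np.cos(theta)
y = np.sin(theta)

# Monte Carlo estimate of the joint expectation E[XY]
e_xy = np.mean(x * y)

# cov(X, Y) = E[XY] - E[X]E[Y]; should be ~0 up to sampling error (~1e-3)
cov_xy = e_xy - np.mean(x) * np.mean(y)

print(f"E[XY]    ~ {e_xy:.5f}")
print(f"cov(X,Y) ~ {cov_xy:.5f}")
[/code]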

3. What is the difference between joint expectation and individual expectations?

Joint expectation involves the expected value of the product of the two functions, while individual expectations are the expected values of each function taken separately. The joint expectation captures the dependence between the two functions: the difference E[f(X)g(Y)] − E[f(X)]E[g(Y)] is their covariance, which the individual expectations alone do not reveal.

4. Can joint expectation be used to calculate the expected value of a single function of a random variable?

Yes. Setting one of the functions to the constant 1 reduces the joint expectation to an ordinary expectation: E[f(X)·1] = E[f(X)], which is simply the expected value of the single function.

5. In what situations is joint expectation commonly used?

Joint expectation is commonly used in the fields of statistics, probability, and data analysis. It is used to calculate the expected value of the product of two random variables, which can provide insights into the relationship between the variables and their overall impact on a system or outcome.
