Joint expectation of two functions of a random variable

dionysian
OK, I am not sure whether I should put this question in the homework category or here, but it's a problem from Schaum's Outline. I know the solution to it, but I don't understand it 100%, so maybe someone can explain it to me.
Let X and Y be defined by:
X = \cos\theta, \qquad Y = \sin\theta
where \theta is a random variable uniformly distributed over (0, 2\pi).
A) Show that X and Y are uncorrelated
Attempt at solution:
Show \operatorname{cov}(X,Y) = 0.
\operatorname{cov}(X,Y) = E[XY] - E[X]E[Y]
E[XY] = \int_0^{2\pi} \int_0^{2\pi} xy\, f_{XY}(x,y)\, dx\, dy
Now my question is: how do we determine the joint pdf f_{XY}(x,y) if we only know the pdf of \theta?
In the solution to the problem it seems that they assume
f_{XY}(x,y) = f_\theta(\theta)
Then the integral they use becomes
E[XY] = \int_0^{2\pi} xy\, f_\theta(\theta)\, d\theta
But why is it valid to assume that
f_{XY}(x,y) = f_\theta(\theta)?
Doesn’t the joint (and the marginal) pdf change because of the functions:
X = \cos\theta, \qquad Y = \sin\theta\,?
If anyone knows what I am trying to ask, please give me a little help with what is going on here.
 
dionysian said:
... Now my question is: how do we determine the joint pdf f_{XY}(x,y) if we only know the pdf of \theta?

The joint pdf doesn't technically exist, because the random variables (X, Y) put all their mass on a one-dimensional subset of 2D space, namely the circle (\cos\theta, \sin\theta). The joint pdf could be written in terms of Dirac delta functions, or the expectation could be written as a Stieltjes integral (using the joint cdf), but the theory gets messy; for this example it's much simpler to write the expectation as

E[XY] = E[\cos\theta \sin\theta]

which can be expressed as a single integral because theta has a pdf.
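
To spell that step out (a quick worked check, using f_\theta(\theta) = 1/(2\pi) on (0, 2\pi)):

E[XY] = \int_0^{2\pi} \cos\theta \sin\theta \,\frac{1}{2\pi}\, d\theta = \frac{1}{4\pi}\int_0^{2\pi} \sin 2\theta \, d\theta = 0

and similarly E[X] = \frac{1}{2\pi}\int_0^{2\pi} \cos\theta\, d\theta = 0 and E[Y] = \frac{1}{2\pi}\int_0^{2\pi} \sin\theta\, d\theta = 0, so \operatorname{cov}(X,Y) = E[XY] - E[X]E[Y] = 0.

If you want a numerical sanity check, here is a minimal sketch (assuming NumPy is available; the names are just for illustration):

Code:
import numpy as np

rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, size=1_000_000)  # theta ~ Uniform(0, 2*pi)
x, y = np.cos(theta), np.sin(theta)

# sample covariance of X and Y; should be close to 0
print(np.mean(x * y) - np.mean(x) * np.mean(y))

This only approximates the covariance by Monte Carlo, but it agrees with the exact value of 0.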
 