Proof Regarding Functions of Independent Random Variables

Summary
To prove that g(X) and h(Y) are independent functions of independent random variables X and Y, one can start by using the definition of independence in terms of probabilities. The key is to show that P(g(X) ∈ A, h(Y) ∈ B) equals P(g(X) ∈ A)P(h(Y) ∈ B) for any subsets A and B. The discussion highlights the importance of expectations and suggests that while a covariance of zero does not imply independence, the multivariate characteristic function can be a useful tool in the proof. It is clarified that f and g do not need to be invertible for the proof to hold. Understanding the relationship between the joint and marginal distributions is essential for establishing the independence of the transformed variables.
SpringPhysics

Homework Statement


Let X and Y be independent random variables. Prove that g(X) and h(Y) are also independent, where g and h are functions.


Homework Equations


I did some research and stumbled upon the fact that
E(XY) = E(X)E(Y)
is important in the proof.

f(x,y) = f_X(x)f_Y(y)
F(x,y) = F_X(x)F_Y(y)
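
These facts are connected: when the joint density factors, expectations of products factor as well. A minimal derivation for the continuous case, assuming Fubini's theorem applies:

$$E[XY] = \iint xy\, f_{X,Y}(x,y)\, dx\, dy = \iint xy\, f_X(x) f_Y(y)\, dx\, dy = \left(\int x f_X(x)\, dx\right)\left(\int y f_Y(y)\, dy\right) = E[X]\, E[Y].$$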

The Attempt at a Solution


I am seriously stuck on this - I do not even know where to begin. I know there was a previous thread on this (around 7 years old), but I did not want to revive an old thread, and there wasn't much response in it anyway.

I attempted to determine F(g(x),h(y)) by integration, but then I would need f(g(x),h(y)), which (to my knowledge) is unknown unless I use a really messy transformation equation (and the question was posed before we learned about transformations).

Expectations appear to be the way to go, since they do not require the actual density/probability function of the function of the random variable; however, I do not know of any property of expectations allowing us to deduce that two random variables are independent (since a covariance of 0 does not necessarily imply independence).
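
As a side note, a standard counterexample confirming that zero covariance does not imply independence: take X uniform on {-1, 0, 1} and Y = X^2. Then E[XY] = E[X^3] = 0 and E[X] = 0, so Cov(X, Y) = 0, yet P(X = 0, Y = 0) = 1/3 while P(X = 0)P(Y = 0) = 1/9.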

If anyone could provide me with somewhere to start, that would be much appreciated.
 
Two random variables X and Y defined on the same sample space are independent if, for all subsets A and B of real numbers,

P(X ∈ A, Y ∈ B) = P(X ∈ A)P(Y ∈ B)

You need to prove the same equality for f(X) and g(Y).

Start like this:

P(f(X) ∈ A, g(Y) ∈ B) = P(X ∈ f^(-1)(A), Y ∈ g^(-1)(B))

and see if you can get it from there.
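
For reference, a sketch of how that hint could be completed, assuming A and B are such that the preimages f^(-1)(A) and g^(-1)(B) are events (e.g., f and g measurable), and using the independence of X and Y:

$$P(f(X) \in A,\ g(Y) \in B) = P(X \in f^{-1}(A),\ Y \in g^{-1}(B)) = P(X \in f^{-1}(A))\, P(Y \in g^{-1}(B)) = P(f(X) \in A)\, P(g(Y) \in B).$$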
 
Wouldn't we be assuming that f and g are invertible? Or does that have no bearing on the proof? (I always got stuck because I thought we couldn't apply f or g inverse since we would have to then assume that f and g are invertible functions.)
 
SpringPhysics said:
Wouldn't we be assuming that f and g are invertible? Or does that have no bearing on the proof? (I always got stuck because I thought we couldn't apply f or g inverse since we would have to then assume that f and g are invertible functions.)

f and g need not be invertible. f^(-1)(A) is standard notation for the preimage, i.e., the set {w : f(w) ∈ A}. For instance, with f(x) = x^2 (which is not invertible), f^(-1)([0,4]) = [-2,2]: the preimage is well defined even though no inverse function exists.

Anyway, another approach is to look at the multivariate characteristic function: two random variables Y1 and Y2 are independent iff their joint characteristic function factors, that is,

$$E\left[e^{i k_1 Y_1 + i k_2 Y_2}\right] = E\left[e^{i k_1 Y_1}\right] E\left[e^{i k_2 Y_2}\right] \quad \text{for all } (k_1, k_2),$$

where i = sqrt(-1).
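
A sketch of how that criterion could be applied here, assuming the standard fact that E[u(X)v(Y)] = E[u(X)]E[v(Y)] for independent X and Y and bounded functions u and v (which follows from the factorization of the joint distribution):

$$E\left[e^{i k_1 f(X) + i k_2 g(Y)}\right] = E\left[e^{i k_1 f(X)}\, e^{i k_2 g(Y)}\right] = E\left[e^{i k_1 f(X)}\right] E\left[e^{i k_2 g(Y)}\right],$$

so the joint characteristic function of (f(X), g(Y)) factors for every (k1, k2), which is exactly the criterion above.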

RGV
 