Decomposition of a R.V. into dependent and independent parts

mXSCNT
Given random variables X and Y, which are not independent, is it always possible to find a random variable W which is independent from X, such that Y = f(X,W), for some function f?

Example: let the joint distribution of X and Y be
Code:
       Y=0   Y=1
X=0    1/3   1/6
X=1    1/6   1/3
Then if we let W = X XOR Y (so that W has the Bernoulli distribution with p = 1/3 and is independent of X) and f(x,w) = x XOR w, we have
Y = f(X,W).
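As a sanity check on this example (a short Python sketch of my own, not part of the original post), one can enumerate all (x, w) pairs and confirm that (X, f(X,W)) reproduces the joint table exactly:

```python
import itertools

# The joint table P(X=x, Y=y) from the example above:
joint = {(0, 0): 1/3, (0, 1): 1/6, (1, 0): 1/6, (1, 1): 1/3}
p_x = {0: 1/2, 1: 1/2}   # marginal of X, read off the table
p_w = {0: 2/3, 1: 1/3}   # W ~ Bernoulli(1/3), independent of X

# Joint distribution of (X, f(X, W)) with f(x, w) = x XOR w:
constructed = {}
for x, w in itertools.product((0, 1), repeat=2):
    y = x ^ w
    constructed[(x, y)] = constructed.get((x, y), 0.0) + p_x[x] * p_w[w]

# It matches the original table exactly.
for key, p in joint.items():
    assert abs(constructed[key] - p) < 1e-12
print(constructed)
```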
 
Yes. You can prove it as follows.

Define,
g_X(y) = P(Y<=y|X)
which gives the cumulative distribution function of Y conditional on X.
Define the variable
U=g_X(Y)
As long as g_X is continuous (that is, Y has a continuous conditional distribution given X), U will be uniformly distributed on [0,1] and independent of X.

Setting f(x,u) = g_x^{-1}(u), the inverse of the conditional CDF given X = x, gives Y = f(X,U).

If g_X is not continuous then you can still set Y=f(X,U) for a r.v. U uniform on [0,1] independently of X, and X,Y will have the correct joint distribution. Note that this can require enlarging the probability space in order to introduce U.

Hopefully you can fill in the gaps in that brief proof.
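To make the construction concrete, here is a small simulation sketch (my own addition). The joint law is an illustrative choice, not from the post: X ~ N(0,1) and Y | X ~ N(X,1), so g_X(y) = Phi(y - X); the CDF and its inverse come from Python's statistics.NormalDist:

```python
import random
from statistics import NormalDist

random.seed(0)
nd = NormalDist()  # standard normal: provides cdf and inv_cdf

xs, us = [], []
for _ in range(100_000):
    x = random.gauss(0.0, 1.0)
    y = random.gauss(x, 1.0)        # Y depends on X: Y | X ~ N(X, 1)
    u = nd.cdf(y - x)               # U = g_X(Y)
    # f(x, u) = g_x^{-1}(u) = x + Phi^{-1}(u) recovers Y pathwise:
    assert abs(x + nd.inv_cdf(u) - y) < 1e-6
    xs.append(x)
    us.append(u)

# U should be Uniform[0,1] and (empirically) uncorrelated with X.
mean_u = sum(us) / len(us)
cov_xu = sum(x * (u - mean_u) for x, u in zip(xs, us)) / len(us)
print(round(mean_u, 3), round(cov_xu, 4))
```

The assertion inside the loop checks the identity Y = f(X,U) pathwise; the printed statistics check that U looks uniform on [0,1] with near-zero covariance with X.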
 
mXSCNT said:
Given random variables X and Y, which are not independent, is it always possible to find a random variable W which is independent from X, such that Y = f(X,W), for some function f?

Depends on what you mean by "=" in the above equation. You can definitely find such a W and f such that f(X,W) has the same distribution as Y (in fact, you don't even need an X here at all), or even such that (X,Y) and (X,f(X,W)) have the same joint distribution.

But if by "=" you mean "almost surely equal," then no: any such W would generally have to depend on X. For instance, you could take W = Y-X, so that f(X,W) = X+W, but Y-X is clearly NOT independent of X. Simply put, there's generally no way to replace the outcome of a random variable with the outcome of an independent random variable and hope to get the same answer all of the time.
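To see that dependence concretely, here is a quick check (my own addition) on the 2x2 example from the first post, computing the conditional law of W = Y - X given each value of X:

```python
# Joint table P(X=x, Y=y) from the first post:
joint = {(0, 0): 1/3, (0, 1): 1/6, (1, 0): 1/6, (1, 1): 1/3}

conds = {}
for x in (0, 1):
    px = sum(p for (a, _), p in joint.items() if a == x)   # P(X = x)
    law = {}
    for (a, y), p in joint.items():
        if a == x:
            w = y - x
            law[w] = law.get(w, 0.0) + p / px              # P(W = w | X = x)
    conds[x] = law

print(conds)
# The supports already differ: {0, 1} given X=0 versus {-1, 0} given X=1,
# so W = Y - X cannot be independent of X.
```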
 
For the discrete case: let A be the range of X and B the range of Y. Let W be a random variable taking values in B^A (the set of functions from A to B), independent of X, with the coordinates W[x] drawn independently and P(W[x] = y) = P(Y=y | X=x). Let f(x,w) = w[x]. Then P(f(X,W) = y | X = x) = P(W[x] = y) = P(Y=y | X=x), so (X, f(X,W)) has the same joint distribution as (X, Y).
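A simulation sketch of this construction on the 2x2 example from the first post (my own code; `draw` is a hypothetical helper for sampling a finite distribution):

```python
import random

random.seed(1)

# Conditional law P(Y=y | X=x), read off the joint table:
cond = {0: {0: 2/3, 1: 1/3},
        1: {0: 1/3, 1: 2/3}}

def draw(dist):
    """Sample a value from a finite {value: probability} distribution."""
    r = random.random()
    acc = 0.0
    for v, p in dist.items():
        acc += p
        if r < acc:
            return v
    return v

n = 100_000
counts = {}
for _ in range(n):
    x = random.randint(0, 1)                  # P(X=0) = P(X=1) = 1/2
    w = {a: draw(cond[a]) for a in (0, 1)}    # W: one independent draw per point of A
    y = w[x]                                  # f(x, w) = w[x]
    counts[(x, y)] = counts.get((x, y), 0) + 1

# Empirical joint of (X, f(X,W)) should approach the table: 1/3, 1/6, 1/6, 1/3.
freqs = {k: v / n for k, v in sorted(counts.items())}
print(freqs)
```

Note that W is drawn without looking at X, so it is independent of X by construction; the dependence of Y on X enters only through which coordinate of W gets read out.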
 