Decomposition of an R.V. into dependent and independent parts


Discussion Overview

The discussion concerns decomposing random variables into dependent and independent components: given dependent random variables X and Y, is there a random variable W, independent of X, such that Y = f(X, W) for some function f? The scope includes theoretical considerations and mathematical reasoning about joint distributions and conditional probabilities.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • Some participants propose that it is possible to find a random variable W independent of X such that Y can be expressed as Y = f(X, W) for some function f, given certain conditions on the distributions.
  • One participant provides a proof involving the cumulative distribution function of Y conditional on X, suggesting that under continuity conditions, a uniform random variable U can be used to express Y in terms of X and U.
  • Another participant questions the interpretation of the equality in the expression Y = f(X, W), arguing that while it is possible to find W and f such that they match distributions, achieving "almost surely equal" outcomes is generally not feasible without dependence on X.
  • A further contribution outlines a method for the discrete case, defining W based on the conditional probabilities of Y given X, and demonstrating that this construction aligns with the distribution of Y.

Areas of Agreement / Disagreement

Participants express differing views on the feasibility of achieving "almost surely equal" outcomes with independent random variables, indicating a lack of consensus on this aspect of the discussion. While some agree on the possibility of matching distributions, the conditions under which this can be done remain contested.

Contextual Notes

Limitations include assumptions about the continuity of the cumulative distribution function and the definitions of independence and equality in the context of random variables. The discussion does not resolve these complexities.

mXSCNT
Given random variables X and Y, which are not independent, is it always possible to find a random variable W which is independent from X, such that Y = f(X,W), for some function f?

Example: let the joint distribution of X and Y be
Code:
 Y  0  1
X+-------
0|1/3 1/6
1|1/6 1/3
Then if we let W have the Bernoulli distribution with p = 1/3 and f(x,w) = x XOR w, we have
Y = f(X,W).
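A quick simulation sketch of this example (not part of the original post): sample (X, Y) directly from the joint table, then separately sample X and an independent W ~ Bernoulli(1/3) and form X XOR W, and compare the two empirical joint distributions.

```python
import random
from collections import Counter

random.seed(0)
N = 100_000

# Sample (X, Y) from the joint table: P(0,0)=P(1,1)=1/3, P(0,1)=P(1,0)=1/6
counts = Counter()
for _ in range(N):
    u = random.random()
    if u < 1/3:
        x, y = 0, 0
    elif u < 1/2:
        x, y = 0, 1
    elif u < 2/3:
        x, y = 1, 0
    else:
        x, y = 1, 1
    counts[(x, y)] += 1

# Independently sample X and W ~ Bernoulli(1/3), and set Y = X XOR W
counts2 = Counter()
for _ in range(N):
    x = random.randint(0, 1)               # X is uniform on {0,1} (its marginal in the table)
    w = 1 if random.random() < 1/3 else 0  # W ~ Bernoulli(1/3), independent of X
    counts2[(x, x ^ w)] += 1

for xy in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(xy, counts[xy] / N, counts2[xy] / N)
```

Both columns of output should agree with the table entries (1/3, 1/6, 1/6, 1/3) up to sampling noise.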
 
yes. You can prove it as follows.

Define,
g_X(y) = P(Y<=y|X)
which gives the cumulative distribution function of Y conditional on X.
Define the variable
U=g_X(Y)
As long as g_X is continuous (i.e. the conditional distribution of Y given X is continuous) then U will be uniformly distributed on [0,1] and independent of X.

Setting f(x,u) = g_x^{-1}(u), where g_x^{-1} is the inverse of the conditional CDF given X = x, gives Y = f(X,U).

If g_X is not continuous then you can still find an f and a r.v. U, uniform on [0,1] and independent of X, such that (X, f(X,U)) has the correct joint distribution (equality in distribution rather than almost surely). Note that this can require enlarging the probability space in order to introduce U.

Hopefully you can fill in the gaps in that brief proof.
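An illustration of this construction for an assumed continuous example (my choice, not from the post): take Y = X + Z with Z standard normal independent of X, so g_x(y) = Φ(y − x), U = Φ(Y − X), and f(x,u) = x + Φ⁻¹(u). The sketch checks that U is roughly Uniform[0,1], roughly uncorrelated with X, and that f(X,U) recovers Y exactly.

```python
import random
from statistics import NormalDist

random.seed(1)
phi = NormalDist()  # standard normal; provides cdf and inv_cdf
N = 50_000

xs = [random.gauss(0, 1) for _ in range(N)]
ys = [x + random.gauss(0, 1) for x in xs]  # Y = X + Z, dependent on X

# U = g_X(Y) = Phi(Y - X); should be Uniform[0,1] and independent of X
us = [phi.cdf(y - x) for x, y in zip(xs, ys)]

# Recover Y via f(x, u) = x + Phi^{-1}(u) = g_x^{-1}(u)
ys_rec = [x + phi.inv_cdf(u) for x, u in zip(xs, us)]

max_err = max(abs(a - b) for a, b in zip(ys, ys_rec))
mean_u = sum(us) / N                                      # should be near 1/2
cov_ux = sum((u - mean_u) * x for u, x in zip(us, xs)) / N  # should be near 0
print(max_err, mean_u, cov_ux)
```

The exact recovery (max_err at floating-point precision) is what distinguishes this continuous case from the discontinuous one: here Y = f(X,U) holds pointwise, not just in distribution.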
 
mXSCNT said:
Given random variables X and Y, which are not independent, is it always possible to find a random variable W which is independent from X, such that Y = f(X,W), for some function f?

Depends on what you mean by "=" in the above equation. You can definitely find such a W and f such that f(X,W) has the same distribution as Y (in fact, you don't even need an X here at all), or even such that (X,Y) and (X,f(X,W)) have the same joint distribution.

But if by "=" you mean "almost surely equal," then no: such a W would generally have to depend on X. E.g., in the additive case you would need W = Y-X with f(X,W) = X+W, and that W is clearly NOT independent of X. Simply put, there's generally no way to replace the outcome of a random variable with the outcome of an independent random variable and hope to get the same answer all of the time.
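This distinction can be seen numerically in the XOR example from the first post (a sketch of my own, under the assumption that W is drawn fresh, independently of the pair (X, Y)): X XOR W then matches Y in distribution but disagrees with Y a sizable fraction of the time.

```python
import random

random.seed(2)
N = 100_000
agree = 0
for _ in range(N):
    # Sample (X, Y) from the joint table in the first post
    u = random.random()
    if u < 1/3:
        x, y = 0, 0
    elif u < 1/2:
        x, y = 0, 1
    elif u < 2/3:
        x, y = 1, 0
    else:
        x, y = 1, 1
    # Fresh W ~ Bernoulli(1/3), independent of (X, Y)
    w = 1 if random.random() < 1/3 else 0
    agree += (x ^ w) == y
print(agree / N)  # about 5/9, not 1: equal in distribution, but not almost surely
```

(Here P(X XOR W = Y) = P(W = X XOR Y) = (2/3)(2/3) + (1/3)(1/3) = 5/9, since X XOR Y is itself Bernoulli(1/3) and independent of W.)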
 
For the discrete case: let A be the range of X and B the range of Y. Let W be a random variable taking values in B^A (the functions from A to B), independent of X, with the coordinates W[x] independent of one another and P(W[x] = y) = P(Y=y | X=x). Let f(x,w) = w[x]. Then P(f(X,W) = y | X) = P(W[X] = y | X) = P(Y=y | X), so (X, f(X,W)) has the same joint distribution as (X, Y).
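The discrete construction above can be sketched in code for the joint table from the first post (A = B = {0, 1}; the dict-of-coordinates representation of W is my own choice): W is a random function A → B whose coordinate at x is drawn from the conditional law of Y given X = x, and f(x, w) = w[x].

```python
import random
from collections import Counter

random.seed(3)
N = 100_000

# Conditionals from the table: P(Y=1 | X=0) = 1/3, P(Y=1 | X=1) = 2/3
cond_p1 = {0: 1/3, 1: 2/3}

counts = Counter()
for _ in range(N):
    # W: a random function A -> B, with W[x] ~ P(Y=. | X=x) for each x,
    # coordinates independent of each other and of X
    w = {x: (1 if random.random() < cond_p1[x] else 0) for x in (0, 1)}
    x = random.randint(0, 1)  # X uniform on {0,1}, sampled independently of W
    y = w[x]                  # f(x, w) = w[x]
    counts[(x, y)] += 1

for xy in sorted(counts):
    print(xy, counts[xy] / N)
```

The empirical joint distribution of (X, w[X]) should reproduce the original table (1/3, 1/6, 1/6, 1/3) up to sampling noise, confirming that the construction matches (X, Y) in joint distribution.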
 
