Are Random Variables X and Y Related If Their Conditional Probabilities Change?

  • Context: Graduate
  • Thread starter: Kuma
  • Tags: Conditional, Proof
SUMMARY

Two random variables (R.V.s) X and Y are dependent if the conditional distribution of one changes with the value of the other. Specifically, if f(X=x1|Y=y1) differs from f(X=x1|Y=y2), then X and Y are dependent. Conversely, if f(Y=y1|X=x) is the same for every value x of X (and every y1), then X and Y are independent. The discussion establishes that this relationship is symmetric: X is dependent on Y if and only if Y is dependent on X.

PREREQUISITES
  • Understanding of joint probability density functions (pdfs)
  • Familiarity with conditional probabilities
  • Knowledge of discrete probability distributions
  • Basic concepts of independence and dependence in probability theory
NEXT STEPS
  • Study the properties of joint probability density functions in detail
  • Learn about the implications of conditional independence in probability theory
  • Explore the concept of symmetric relationships in random variables
  • Investigate the differences between discrete and continuous probability distributions
USEFUL FOR

Students and professionals in statistics, data science, and mathematics who are exploring the relationships between random variables and their implications in probability theory.

Kuma
1) I'm trying to prove that two R.V.s X and Y are related iff Y and X are related, assuming they are discretely distributed.

From what I've learned, two R.V.s are related if the conditional distribution of X changes as Y changes, i.e. if f(X|Y=yi) changes when i changes. From that definition, this is what I came up with.

If X and Y are related, then

f(X=x1|Y=y1) = P(X=x1 ∩ Y=y1)/P(Y=y1)

should not be the same as

f(X=x1|Y=y2) = P(X=x1 ∩ Y=y2)/P(Y=y2)

However, if Y and X are not related, then

f(Y=y1|X=x1) = P(Y=y1 ∩ X=x1)/P(X=x1)

should be the same as

f(Y=y1|X=x2) = P(Y=y1 ∩ X=x2)/P(X=x2)

But P(X=x1 ∩ Y=y1) = P(Y=y1 ∩ X=x1), thus

f(X=x1|Y=y1) = f(Y=y1|X=x1)·P(X=x1)/P(Y=y1)

So f(X=x1|Y=y1) is determined by f(Y=y1|X=x1), which suggests that if X and Y are related, then Y and X are related as well. Since f(Y=y1|X=x1) = f(Y=y1|X=x2) by assumption, I can substitute any f(Y=y1|X=xi) on the right side and the left side should remain unchanged, but that contradicts the fact that the left side has to change if X and Y are related. I'm not sure if that's the right way to prove it.
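The symmetry being argued here can be checked numerically on a small discrete joint distribution. This is only an illustrative sketch: the joint table below is a made-up dependent example, not data from the thread.

```python
# Check that dependence is symmetric for a small discrete joint pmf.
# The joint table is a made-up example in which X and Y are dependent.
joint = {
    (0, 0): 0.30, (0, 1): 0.10,
    (1, 0): 0.20, (1, 1): 0.40,
}

def marginal(joint, axis):
    """Sum the joint pmf over the other variable to get a marginal pmf."""
    m = {}
    for (x, y), p in joint.items():
        k = x if axis == 0 else y
        m[k] = m.get(k, 0.0) + p
    return m

px = marginal(joint, 0)  # P(X=x)
py = marginal(joint, 1)  # P(Y=y)

def cond_x_given_y(x, y):
    # f(X=x | Y=y) = P(X=x ∩ Y=y) / P(Y=y)
    return joint[(x, y)] / py[y]

def cond_y_given_x(y, x):
    # f(Y=y | X=x) = P(Y=y ∩ X=x) / P(X=x)
    return joint[(x, y)] / px[x]

# X depends on Y: f(X=0 | Y=0) differs from f(X=0 | Y=1)
x_dep = abs(cond_x_given_y(0, 0) - cond_x_given_y(0, 1)) > 1e-12
# and symmetrically, Y depends on X: f(Y=0 | X=0) differs from f(Y=0 | X=1)
y_dep = abs(cond_y_given_x(0, 0) - cond_y_given_x(0, 1)) > 1e-12
print(x_dep, y_dep)  # prints: True True
```

Here f(X=0|Y=0) = 0.6 while f(X=0|Y=1) = 0.2, and correspondingly f(Y=0|X=0) = 0.75 while f(Y=0|X=1) ≈ 0.33, so the dependence shows up in both directions, as the post's argument suggests it must.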
 
Random variables X and Y are independent means P(X<x and Y<y) = P(X<x)P(Y<y) for all x and y. If this relationship does not hold, they are dependent (what you call related). The equation is symmetric in X and Y.
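This factorization test is straightforward to verify for a discrete distribution. As a sketch (using ≤ rather than <, which is equivalent for testing independence, and a made-up product-form pmf that is independent by construction):

```python
import itertools

# Made-up independent joint pmf: P(X=x ∩ Y=y) = p(x) * q(y)
p = {0: 0.4, 1: 0.6}    # marginal pmf of X
q = {0: 0.25, 1: 0.75}  # marginal pmf of Y
joint = {(x, y): p[x] * q[y] for x in p for y in q}

def joint_cdf(joint, x, y):
    """P(X <= x and Y <= y), summed from the joint pmf."""
    return sum(pr for (a, b), pr in joint.items() if a <= x and b <= y)

def marg_cdf(m, v):
    """P(V <= v) for a marginal pmf m."""
    return sum(pr for k, pr in m.items() if k <= v)

# Independence: the joint cdf factors at every point -- and because the
# condition treats X and Y identically, it is symmetric; there is no
# separate "Y independent of X" check to run.
independent = all(
    abs(joint_cdf(joint, x, y) - marg_cdf(p, x) * marg_cdf(q, y)) < 1e-12
    for x, y in itertools.product(p, q)
)
print(independent)  # prints: True
```

Perturbing any single cell of `joint` (while renormalizing) breaks the factorization at some point, which is exactly the symmetric notion of dependence described above.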
 
