Independence of RVs with same distribution

  • Context: Graduate
  • Thread starter: nexp
  • Tags: Distribution
SUMMARY

The discussion centers on the independence of two Gaussian random variables, X and Y, both with zero mean and unit variance. It establishes that P(X|Y) does not equal P(X) when X and Y are not independent, specifically when X equals Y. In that case the expectation of their product, E[XY], equals E[X^2] = 1 rather than E[X]E[Y] = 0, highlighting the distinction between independence and identical distribution. The conclusion emphasizes that independence is a separate consideration from having the same distribution.

PREREQUISITES
  • Understanding of Gaussian random variables
  • Knowledge of conditional probability
  • Familiarity with expectation and joint distributions
  • Basic concepts of independence in probability theory
NEXT STEPS
  • Study the properties of Gaussian distributions and their implications on independence
  • Learn about conditional probability and its applications in statistics
  • Explore the concept of joint distributions and their factorization
  • Investigate the implications of dependence and independence in random variables
USEFUL FOR

Statisticians, data scientists, and students of probability theory looking to deepen their understanding of random variable independence and joint distributions.

nexp
[Solved] Independence of RVs with same distribution

Hey all,

Let's say we have two Gaussian random variables X, Y, each with zero mean and unit variance. Is it correct to say that [tex]P(X|Y) = P(X)[/tex]?

In other words, suppose we want to compute the expectation of their product [tex]\operatorname{E}[XY][/tex]. Is the following correct, i.e. does their joint distribution factorise?

[tex]\operatorname{E}[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x y\, p(x)\, p(y)\: dx\, dy[/tex]
[tex]= \int_{-\infty}^{\infty} x\, p(x)\: dx \int_{-\infty}^{\infty} y\, p(y)\: dy[/tex]
[tex]= \operatorname{E}[X]\operatorname{E}[Y][/tex]
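
A quick simulation seems to support the factorised form when the samples are drawn independently (a minimal NumPy sketch; the seed and sample size are arbitrary, and the independence of the draws is exactly the assumption in question):

[code=python]
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Independently drawn standard normal samples for X and Y
x = rng.standard_normal(n)
y = rng.standard_normal(n)

print(np.mean(x * y))           # approximately 0
print(np.mean(x) * np.mean(y))  # also approximately 0 = E[X]E[Y]
[/code]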

Many Thanks.

Update

I have now figured out the answer to the above questions. I'll post it here for anyone who is interested.

If X and Y have the same distribution, then we can write [tex]P(X|X) = 1 \neq P(X)[/tex].

Now looking again at expectations. From the above (taking X = Y), we have that

[tex]\operatorname{E}[XY] = \operatorname{E}[X^2][/tex]
[tex]= \int_{-\infty}^{\infty} x^2\, p(x)\: dx = 1,[/tex]

similarly giving a negative answer to the question: the joint distribution does not factorise here, since [tex]\operatorname{E}[X]\operatorname{E}[Y] = 0 \neq 1[/tex].
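
This case can be checked numerically too (same assumptions as the sketch above):

[code=python]
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)

# With X = Y the product is just X^2
print(np.mean(x * x))            # approximately 1 = Var(X) + E[X]^2
print(np.mean(x) * np.mean(x))   # approximately 0, so E[XY] != E[X]E[Y]
[/code]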
 
It looks like you are confusing two different things. X and Y are assumed to be normal with the same distribution.

However, whether or not they are independent is a completely separate question. If they are independent, then your analysis before the update is correct. On the other hand, if X = Y, then the comment after the update is valid.

These are the two extreme possibilities, with partial dependence in between.
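
For instance, if X and Y are jointly Gaussian with correlation rho, then E[XY] = rho, which interpolates between the two extremes. A minimal sketch of this (using the standard construction Y = rho*X + sqrt(1 - rho^2)*Z to sample such a pair; the seed and rho value are illustrative):

[code=python]
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
rho = 0.5  # any value in [-1, 1]; 0 and 1 recover the two extremes

# Y = rho*X + sqrt(1 - rho^2)*Z is N(0, 1) with corr(X, Y) = rho
x = rng.standard_normal(n)
z = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * z

print(np.mean(x * y))  # approximately rho
[/code]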
 
Thanks very much. I understand what's going on better now.
 
