Independence of RVs with same distribution

AI Thread Summary
The discussion centers on the independence of two Gaussian random variables, X and Y, both with zero mean and unit variance. The initial question was whether P(X|Y) equals P(X), which led to an exploration of the expectation of their product, E[XY]. The conclusion is that having the same distribution does not by itself make X and Y independent: if they are not independent, then P(X|Y) does not equal P(X), and the expectation of their product does not factorize as initially assumed. The reply emphasizes that independence is a separate question from having the same distribution, with the independent case and X = Y as the two extremes and partial dependence in between.
nexp
[Solved] Independence of RVs with same distribution

Hey all,

Let's say we have two Gaussian random variables X, Y, each with zero mean and unit variance. Is it correct to say that P(X|Y) = P(X)?

In other words, suppose that we want to compute the expectation of their product \operatorname{E}[XY]. Is the following correct? I.e. does their joint distribution factorise?

\operatorname{E}[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\, y\, p(x)\, p(y)\, dx\, dy
= \int_{-\infty}^{\infty} x\, p(x)\, dx \int_{-\infty}^{\infty} y\, p(y)\, dy
= \operatorname{E}[X]\operatorname{E}[Y]
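
As a quick numerical sanity check of that factorisation under an explicit independence assumption, here is a minimal NumPy sketch (the seed and sample size are arbitrary illustrative choices, not part of the question itself):

import numpy as np

# Minimal sketch: draw X and Y independently from N(0, 1) and compare the
# Monte Carlo estimate of E[XY] with E[X] * E[Y].
rng = np.random.default_rng(0)
n = 1_000_000

x = rng.standard_normal(n)
y = rng.standard_normal(n)       # drawn independently of x

print(np.mean(x * y))            # ~ 0
print(np.mean(x) * np.mean(y))   # ~ 0, agrees with the factorised form E[X]E[Y]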

Many Thanks.

Update

I have now figured out the answer to the above questions. I'll post it here for anyone who is interested.

If X and Y have the same distribution, then we can write P(X|X) = 1 \neq P(X).

Now, looking again at the expectation: from the above (i.e. with Y = X), we have

\operatorname{E}[XY] = \operatorname{E}[X^2]
= \int_{-\infty}^{\infty} x^2\, p(x)\, dx = 1

which is not equal to \operatorname{E}[X]\operatorname{E}[Y] = 0, so the answer to the question about the expectation of the product is likewise negative: it does not factorise.
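
To make the X = Y case concrete, here is a minimal NumPy sketch of the same comparison (again, seed and sample size are arbitrary choices):

import numpy as np

# Minimal sketch of the X = Y extreme: reuse the same samples for both
# factors, so the Monte Carlo estimate of E[XY] is really estimating E[X^2].
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)

print(np.mean(x * x))            # ~ 1 = E[X^2] = Var(X) for a standard normal
print(np.mean(x) ** 2)           # ~ 0 = E[X]E[Y]; the expectation does not factorise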
 
It looks like you are confusing two different things. X and Y are assumed to be normal with the same distribution.

However, whether or not they are independent is a completely separate question. If they are independent, then your analysis before the update is correct. On the other hand, if X = Y, then the comment after the update is valid.

These are the two extreme possibilities (partial dependence in between).
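
To illustrate the "partial dependence in between", here is a minimal NumPy sketch (the construction Y = \rho X + \sqrt{1-\rho^2} Z and the values of \rho are my own choices, not from the thread). Y is still standard normal for every \rho, and the estimate of E[XY] slides from 0 (independent) to 1 (X = Y):

import numpy as np

# Minimal sketch of partial dependence: build Y from X plus independent noise
# so that Y is still N(0, 1) but Corr(X, Y) = rho. Then E[XY] = rho.
rng = np.random.default_rng(0)
n = 1_000_000

for rho in (0.0, 0.5, 1.0):
    x = rng.standard_normal(n)
    z = rng.standard_normal(n)                  # independent of x
    y = rho * x + np.sqrt(1.0 - rho**2) * z     # still standard normal
    print(rho, np.mean(x * y))                  # Monte Carlo estimate ~ rho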
 
Thanks very much. I understand what's going on better now.
 