MHB Some considerations about conditional normal distribution....

chisigma
Gold Member
MHB
The scope of this thread is to give as complete an answer as possible to the question posed two days ago by the user simon11 on the Basic Probability and Statistics forum...

Assume two random variables X and Y are not independent. If P(X), P(Y) and P(Y|X) are all normal, must P(X|Y) also be normal, or not necessarily?...

My 'almost automatic' answer was 'yes!... P(X|Y) must necessarily be a normal distribution too...', but other members of MHB expressed criticism or doubts about that, so I intend to clarify the aspects of the problem that are not yet sufficiently clear. The first step is to recall the definition of the conditional distribution. According to...

Conditional probability distribution - Wikipedia, the free encyclopedia

... if the r.v. X has p.d.f. $f_{X} (x)$, the r.v. Y has p.d.f. $f_{Y}(y)$, and X and Y have joint density function $f_{X,Y} (x,y)$, then the conditional probability density functions of X and Y, each conditioned on the other, are...

$\displaystyle f_{Y} (y|X=x) = f_{Y|X} (x,y) = \frac{f_{X,Y} (x,y)}{f_{X}(x)}$ (1)

$\displaystyle f_{X} (x|Y=y) = f_{X|Y} (x,y) = \frac{f_{X,Y} (x,y)}{f_{Y}(y)}$ (2)
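
A step worth making explicit [this remark is mine, not part of the original post]: since (1) and (2) share the same numerator $f_{X,Y}(x,y)$, dividing one by the other gives Bayes' theorem for densities...

$\displaystyle f_{X|Y} (x,y) = \frac{f_{Y|X} (x,y)\ f_{X}(x)}{f_{Y}(y)}$

... which is the precise sense in which $f_{X|Y}$ is determined by $f_{X}$, $f_{Y}$ and $f_{Y|X}$.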

Very well!... now the basic definitions (1) and (2) give the answer to simon11's question... why?... looking at (1) and (2), their intrinsic symmetry with respect to X and Y is fully evident, so it is always possible to swap the roles of X and Y; and if $f_{X}$, $f_{Y}$ and $f_{Y|X}$ share some property, no matter which property it is, then $f_{X|Y}$ has that property too. After some marginal clarifications, simon11 seemed satisfied by the answer. One member of the MHB staff, however, was not, and required a 'formal proof'. Well!... in order to do that, the first step is to find, under the assumption that X and Y are normal r.v., the explicit expressions of $f_{X}$, $f_{Y}$ and $f_{X,Y}$, and then use (1) and (2) to obtain $f_{X|Y}$ and $f_{Y|X}$. The first two pose no difficulty, of course...

$\displaystyle f_{X}(x)= \frac{1}{\sqrt{2\ \pi}\ \sigma_{X}}\ e^{- \frac{(x-\mu_{X})^{2}}{2\ \sigma^{2}_{X}}}$ (3)

$\displaystyle f_{Y}(y)= \frac{1}{\sqrt{2\ \pi}\ \sigma_{Y}}\ e^{- \frac{(y-\mu_{Y})^{2}}{2\ \sigma^{2}_{Y}}}$ (4)

... but what about $f_{X,Y}$?... 'Monster Wolfram' helps us...

Bivariate Normal Distribution -- from Wolfram MathWorld

$\displaystyle f_{X,Y} (x,y)= \frac{1}{2\ \pi\ \sigma_{X}\ \sigma_{Y}\ \sqrt{1-\rho^{2}}}\ e^{- \frac{z}{2\ (1-\rho^{2})}}$ (5)

... where...

$\displaystyle z= \frac{(x-\mu_{X})^{2}}{\sigma^{2}_{X}} - 2\ \frac{\rho\ (x-\mu_{X})\ (y-\mu_{Y})}{\sigma_{X}\ \sigma_{Y}} + \frac{(y-\mu_{Y})^{2}}{\sigma^{2}_{Y}}$ (6)

$\displaystyle \rho= \text{cor}\ (X,Y)= \frac{V_{X,Y}}{\sigma_{X}\ \sigma_{Y}}$ (7)

Usually $\rho$ is called the 'correlation' of X and Y, and $V_{X,Y}$ is called the 'covariance' of X and Y. (6) and (7) are very interesting and 'suggestive' because of the presence of the term $\rho$. If X and Y are independent [or, more precisely, uncorrelated...], then $\rho=0$; if not [and that is the case proposed by simon11...], an 'extra term' must be taken into account. Now we are able, using (1) and (2), to compute $f_{Y|X} (x,y)$ and $f_{X|Y} (x,y)$ with a simple division...
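
Before carrying out the division, here is a quick numerical sanity check of (5)-(7). The following Python sketch [my addition; the parameter values are arbitrary] evaluates the hand-written bivariate density and compares it with SciPy's multivariate normal:

```python
# Sanity check of the bivariate normal density (5)-(7) against SciPy.
import numpy as np
from scipy.stats import multivariate_normal

mu_x, mu_y = 1.0, -2.0          # arbitrary example parameters
s_x, s_y, rho = 1.5, 0.8, 0.6

def f_xy(x, y):
    """Bivariate normal density, transcribed from (5)-(7)."""
    z = ((x - mu_x) ** 2 / s_x ** 2
         - 2 * rho * (x - mu_x) * (y - mu_y) / (s_x * s_y)
         + (y - mu_y) ** 2 / s_y ** 2)
    return np.exp(-z / (2 * (1 - rho ** 2))) / (
        2 * np.pi * s_x * s_y * np.sqrt(1 - rho ** 2))

cov = [[s_x ** 2, rho * s_x * s_y],
       [rho * s_x * s_y, s_y ** 2]]
ref = multivariate_normal(mean=[mu_x, mu_y], cov=cov)

x0, y0 = 0.3, -1.1
print(f_xy(x0, y0), ref.pdf([x0, y0]))   # the two values should agree
```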

$\displaystyle f_{Y|X} (x,y)= \frac{1}{\sqrt{2\ \pi}\ \sigma_{Y}\ \sqrt{1-\rho^{2}}}\ e^{- \frac{u}{2\ (1-\rho^{2})}}$ (8)

... where...

$\displaystyle u= \frac{(y-\mu_{Y})^{2}}{\sigma^{2}_{Y}} -2\ \rho\ \frac{(y-\mu_{Y})\ (x-\mu_{X})}{\sigma_{Y}\ \sigma_{X}} + \rho^{2}\ \frac{(x-\mu_{X})^{2}}{\sigma^{2}_{X}}$ (9)

$\displaystyle f_{X|Y} (x,y)= \frac{1}{\sqrt{2\ \pi}\ \sigma_{X}\ \sqrt{1-\rho^{2}}}\ e^{- \frac{v}{2\ (1-\rho^{2})}}$ (10)

... where...

$\displaystyle v= \frac{(x-\mu_{X})^{2}}{\sigma^{2}_{X}} -2\ \rho\ \frac{(x-\mu_{X})\ (y-\mu_{Y})}{\sigma_{X}\ \sigma_{Y}} + \rho^{2}\ \frac{(y-\mu_{Y})^{2}}{\sigma^{2}_{Y}}$ (11)
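
A remark that anticipates the computation of the conditional parameters [this addition is mine, using only standard algebra]: $u$ in (9) is a perfect square...

$\displaystyle u = \left( \frac{y-\mu_{Y}}{\sigma_{Y}} - \rho\ \frac{x-\mu_{X}}{\sigma_{X}} \right)^{2}$

... so (8) is precisely the density of a normal distribution in y, with mean and variance...

$\displaystyle \mu_{Y|X} = \mu_{Y} + \rho\ \frac{\sigma_{Y}}{\sigma_{X}}\ (x-\mu_{X}),\qquad \sigma^{2}_{Y|X} = \sigma^{2}_{Y}\ (1-\rho^{2})$

... and the same holds for (10)-(11) with the roles of X and Y swapped. A short check with SciPy [arbitrary parameter values, as above]:

```python
# Check that (8) is a normal density in y with the parameters above.
import numpy as np
from scipy.stats import norm

mu_x, mu_y, s_x, s_y, rho = 1.0, -2.0, 1.5, 0.8, 0.6
x0, y0 = 0.3, -1.1

u = ((y0 - mu_y) / s_y - rho * (x0 - mu_x) / s_x) ** 2
f8 = np.exp(-u / (2 * (1 - rho ** 2))) / (
    np.sqrt(2 * np.pi) * s_y * np.sqrt(1 - rho ** 2))

m = mu_y + rho * (s_y / s_x) * (x0 - mu_x)      # conditional mean
s = s_y * np.sqrt(1 - rho ** 2)                 # conditional std dev
print(f8, norm.pdf(y0, loc=m, scale=s))         # should agree
```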

So 'finally' we have arrived at explicit expressions for $f_{Y|X}$ and $f_{X|Y}$ in the general case where X and Y are not independent. As I said before, $f_{Y|X}$ is obtained from $f_{X|Y}$ by swapping the roles of X and Y... of course!... now, by integration, one can compute, if desired, $\mu_{Y|X}$, $\mu_{X|Y}$, $\sigma^{2}_{Y|X}$, $\sigma^{2}_{X|Y}$ and other interesting parameters... now I'm a little tired, and that will be done, if needed, in a subsequent post... Kind regards $\chi$ $\sigma$
 
chisigma said:
[quoted post above in full]

You seem to be assuming that normal marginals imply joint normality. This is false.

If you could prove that both marginals being normal plus one conditional being normal implies that the joint distribution is normal, you would be done. But this requires that the variance of the conditional be independent of the conditioning value, which would have to be justified.

CB
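
To make the first point concrete, here is a standard counterexample [my addition, not part of CB's reply]: both marginals are N(0,1), yet the pair is not jointly normal. Note that it does not settle the original question as posed, since here Y|X is a two-point distribution rather than normal; it only shows that joint normality, which equation (5) above takes for granted, needs separate justification. If (X, Y) were jointly normal, every linear combination such as X + Y would be normal, hence continuous; here X + Y has an atom of mass 1/2 at zero.

```python
# Counterexample: normal marginals without joint normality.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.standard_normal(n)                 # X ~ N(0,1)
w = rng.choice([-1.0, 1.0], size=n)        # independent Rademacher sign
y = w * x                                  # Y ~ N(0,1) by symmetry

# If (X, Y) were jointly normal, X + Y would be normally distributed.
# Here X + Y = 0 exactly when w = -1, i.e. with probability 1/2.
print("P(X + Y = 0) ~", np.mean(x + y == 0.0))   # ~0.5, impossible for a joint normal
```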
 