Conditional normal distribution

SUMMARY

The discussion centers on conditional distributions for two dependent random variables, X and Y. One participant argues that if the marginal distributions P(X) and P(Y), along with the conditional distribution P(Y|X), are all normal, then P(X|Y) must also be normal, appealing to the symmetry of the relations that link the conditional densities to the joint probability density function. Another participant challenges this argument, pointing out that the stated conditions do not obviously force the joint distribution to be bivariate normal, and calls for a formal proof, highlighting the need for rigorous justification in statistical claims.

PREREQUISITES
  • Understanding of normal distributions and their properties
  • Familiarity with conditional probability and joint distributions
  • Knowledge of probability density functions (PDFs)
  • Basic concepts of statistical independence and dependence
NEXT STEPS
  • Study the properties of bivariate normal distributions
  • Learn about the derivation of conditional distributions in probability theory
  • Explore the concept of symmetry in statistical distributions
  • Investigate formal proofs of conditional distribution properties in statistics
USEFUL FOR

Statisticians, data scientists, and researchers involved in probability theory and statistical modeling, particularly those working with conditional distributions and normality assumptions.

learner928
Assume two random variables X and Y are not independent,

if P(X), P(Y) and P(Y|X) are all normal, must P(X|Y) also be normal, or not necessarily?

thanks.
 
simon11 said:
Assume two random variables X and Y are not independent,

if P(X), P(Y) and P(Y|X) are all normal, must P(X|Y) also be normal, or not necessarily?

thanks.


Since $\displaystyle f_{Y} (y|X=x) = \frac{f_{X,Y} (x,y)}{f_{X} (x)}$ and $\displaystyle f_{X} (x|Y=y) = \frac{f_{X,Y} (x,y)}{f_{Y}(y)}$, if $f_{X}$, $f_{Y}$ and $f_{Y} (y|X=x)$ are all normal, then by symmetry $f_{X} (x|Y=y)$ is also normal... Kind regards $\chi$ $\sigma$
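To make the relation above concrete, here is a minimal numerical sketch that checks $\displaystyle f_{X}(x|Y=y)= f_{Y}(y|X=x)\ \frac{f_{X}(x)}{f_{Y}(y)}$ in the one case where every distribution involved is known to be normal, namely a jointly bivariate normal pair (X, Y); the parameter values are illustrative assumptions, not taken from the thread.

import numpy as np
from scipy.stats import norm, multivariate_normal

# Illustrative (assumed) parameters of a jointly bivariate normal pair (X, Y).
mu_x, mu_y = 1.0, -0.5
sigma_x, sigma_y, rho = 2.0, 1.5, 0.6

cov = [[sigma_x**2, rho * sigma_x * sigma_y],
       [rho * sigma_x * sigma_y, sigma_y**2]]
joint = multivariate_normal(mean=[mu_x, mu_y], cov=cov)

x, y = 0.3, 1.1  # arbitrary evaluation point

# Right-hand side: f_{Y|X}(y|x) * f_X(x) / f_Y(y), with the conditional
# obtained from the joint via f_{Y|X}(y|x) = f_{X,Y}(x,y) / f_X(x).
f_x = norm.pdf(x, loc=mu_x, scale=sigma_x)
f_y = norm.pdf(y, loc=mu_y, scale=sigma_y)
f_y_given_x = joint.pdf([x, y]) / f_x
rhs = f_y_given_x * f_x / f_y

# Left-hand side: the known normal conditional of X given Y = y for a
# bivariate normal pair.
mu_x_given_y = mu_x + rho * sigma_x / sigma_y * (y - mu_y)
sigma_x_given_y = sigma_x * np.sqrt(1.0 - rho**2)
lhs = norm.pdf(x, loc=mu_x_given_y, scale=sigma_x_given_y)

print(lhs, rhs)  # the two values agree up to floating-point error

Note that this check assumes a bivariate normal joint from the start; whether the hypotheses of the original question force that joint form is exactly the point debated later in the thread.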
 
chisigma said:
Since $\displaystyle f_{Y} (y|X=x) = \frac{f_{X,Y} (x,y)}{f_{X} (x)}$ and $\displaystyle f_{X} (x|Y=y) = \frac{f_{X,Y} (x,y)}{f_{Y}(y)}$, if $f_{X}$, $f_{Y}$ and $f_{Y} (y|X=x)$ are all normal, then by symmetry $f_{X} (x|Y=y)$ is also normal... Kind regards $\chi$ $\sigma$
Thank you very much, but what do you mean by symmetry, and why does it need to be symmetrical?

Thanks
 
simon11 said:
Thank you very much, but what do you mean by symmetry, and why does it need to be symmetrical?

Thanks

'Symmetry' means that if $f_{X}(x)$, $f_{Y}(y)$ and $f_{Y}(y|X=x)$ are gaussian, then $\displaystyle f_{X}(x|Y=y)= f_{Y}(y|X=x)\ \frac{f_{X}(x)}{f_{Y}(y)}$ is also gaussian... Kind regards $\chi$ $\sigma$
 
chisigma said:
'Symmetry' means that if $f_{X}(x)$, $f_{Y}(y)$ and $f_{Y}(y|X=x)$ are gaussian, then $\displaystyle f_{X}(x|Y=y)= f_{Y}(y|X=x)\ \frac{f_{X}(x)}{f_{Y}(y)}$ is also gaussian... Kind regards $\chi$ $\sigma$
Oh, I didn't know that, thanks.

I only knew that $\displaystyle \frac{f_{X}(x)}{f_{Y}(y)}$ is a Cauchy distribution,
so does the product of a Cauchy distribution and a normal distribution have to get you back to a normal distribution?
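A short side calculation may clarify the Cauchy remark: the Cauchy law arises as the distribution of the ratio $X/Y$ of two independent centred normal random variables, whereas the ratio of the two marginal densities written above is simply another exponential of a quadratic,

\[ \frac{f_{X}(x)}{f_{Y}(y)} = \frac{\sigma_{y}}{\sigma_{x}}\ \exp\!\left[ \frac{(y-\mu_{y})^{2}}{2\ \sigma^{2}_{y}} - \frac{(x-\mu_{x})^{2}}{2\ \sigma^{2}_{x}} \right], \]

so multiplying it by a gaussian density only changes the quadratic form in the exponent.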

 
chisigma said:
Since $\displaystyle f_{Y} (y|X=x) = \frac{f_{X,Y} (x,y)}{f_{X} (x)}$ and $\displaystyle f_{X} (x|Y=y) = \frac{f_{X,Y} (x,y)}{f_{Y}(y)}$, if $f_{X}$, $f_{Y}$ and $f_{Y} (y|X=x)$ are all normal, then by symmetry $f_{X} (x|Y=y)$ is also normal... Kind regards $\chi$ $\sigma$

You seem to be assuming rather more than I am happy with. You will need to prove, or give a reference to a proof, that normality of the conditional distribution of Y given X, together with normality of the marginals of X and Y, implies that the conditional distribution of X given Y is normal.

It may be true; I can't find a counterexample.

CB
 
Last edited:
chisigma said:
'Symmetry' means that if $f_{X}(x)$, $f_{Y}(y)$ and $f_{Y}(y|X=x)$ are gaussian, then $\displaystyle f_{X}(x|Y=y)= f_{Y}(y|X=x)\ \frac{f_{X}(x)}{f_{Y}(y)}$ is also gaussian... Kind regards $\chi$ $\sigma$

Perhaps, as required by the 'regulations', a formal proof has to be supplied. Very well!... if $f_{X}(x)$, $f_{Y}(y)$ and $f_{Y}(y|X=x)$ are gaussian, that means that...

$\displaystyle f_{X}(x)= \frac{1}{\sigma_{x}\ \sqrt{2\ \pi}}\ e^{- \frac{(x-\mu_{x})^{2}}{2\ \sigma^{2}_{x}}}$ (1)

$\displaystyle f_{Y}(y)= \frac{1}{\sigma_{y}\ \sqrt{2\ \pi}}\ e^{- \frac{(y-\mu_{y})^{2}}{2\ \sigma^{2}_{y}}}$ (2)

$\displaystyle f_{X}(x|Y=y)= A\ e^{[a\ (x-\mu_{x})^{2} + 2\ b\ (x-\mu_{x})\ (y-\mu_{y}) + c\ (y-\mu_{y})^{2}]}$ (3)

Now the p.d.f. of X conditioned on Y is...

$\displaystyle f_{X}(x|Y=y)= f_{Y}(y|X=x)\ \frac{f_{X}(x)}{f_{Y}(y)}$ (4)

... and clearly it has the same form as (3), i.e. it is gaussian... Kind regards $\chi$ $\sigma$
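To spell out the 'clearly' step: reading (3), as the listed hypotheses suggest, as the assumed gaussian form of $f_{Y}(y|X=x)$, and substituting (1), (2) and (3) into (4), one gets

$\displaystyle f_{X}(x|Y=y) = A\ \frac{\sigma_{y}}{\sigma_{x}}\ \exp\!\Big[ \Big(a - \frac{1}{2\ \sigma^{2}_{x}}\Big)(x-\mu_{x})^{2} + 2\ b\ (x-\mu_{x})(y-\mu_{y}) + \Big(c + \frac{1}{2\ \sigma^{2}_{y}}\Big)(y-\mu_{y})^{2} \Big]$

so that, for each fixed $y$, the exponent is quadratic in $x$ and the conditional density is gaussian in $x$ provided the coefficient of $x^{2}$ is negative. Whether the hypotheses actually force the quadratic form (3) in the first place is the point questioned below.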
 
chisigma said:
Perhaps, as required by the 'regulations', a formal proof has to be supplied. Very well!... if $f_{X}(x)$, $f_{Y}(y)$ and $f_{Y}(y|X=x)$ are gaussian, that means that...

$\displaystyle f_{X}(x)= \frac{1}{\sigma_{x}\ \sqrt{2\ \pi}}\ e^{- \frac{(x-\mu_{x})^{2}}{2\ \sigma^{2}_{x}}}$ (1)

$\displaystyle f_{Y}(y)= \frac{1}{\sigma_{y}\ \sqrt{2\ \pi}}\ e^{- \frac{(y-\mu_{y})^{2}}{2\ \sigma^{2}_{y}}}$ (2)

$\displaystyle f_{X}(x|Y=y)= A\ e^{[a\ (x-\mu_{x})^{2} + 2\ b\ (x-\mu_{x})\ (y-\mu_{y}) + c\ (y-\mu_{y})^{2}]}$ (3)

Now the p.d.f. of X conditioned on Y is...

$\displaystyle f_{X}(x|Y=y)= f_{Y}(y|X=x)\ \frac{f_{X}(x)}{f_{Y}(y)}$ (4)

... and clearly it has the same form as (3), i.e. it is gaussian... Kind regards $\chi$ $\sigma$

(3) is wrong; it should contain the conditional mean and variance of X (conditioned on Y), which may be functions of y.

\[ f_{X|Y=y}(x) = \frac{1}{\sqrt{2 \pi}\sigma_{X|Y=y}} e^{-(x-\mu_{X|Y=y})^2/(2\sigma_{X|Y=y}^2)} \]

CB
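For reference, in the jointly bivariate normal case these conditional parameters take the familiar form

\[ \mu_{X|Y=y} = \mu_{X} + \rho\ \frac{\sigma_{X}}{\sigma_{Y}}\ (y-\mu_{Y}), \qquad \sigma^{2}_{X|Y=y} = \sigma^{2}_{X}\ (1-\rho^{2}), \]

i.e. the conditional mean is linear in $y$ and the conditional variance happens to be constant; in general, normality of $f_{X|Y=y}$ only says that some mean and variance, possibly depending on $y$, exist.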
 
Last edited:
CaptainBlack said:
(3) is wrong; it should contain the conditional mean and variance of X (conditioned on Y), which may be functions of y.

\[ f_{X|Y=y}(x) = \frac{1}{\sqrt{2 \pi}\sigma_{X|Y=y}} e^{-(x-\mu_{X|Y=y})^2/(2\sigma_{X|Y=y}^2)} \]

CB

One hypothesis is that $\displaystyle f_{X}(x|Y=y)$ is normal, so that in any case it can be written as...

$\displaystyle f_{X} (x|Y=y) = A\ e^{[a\ (x-x_{0})^{2} + b\ (x-x_{0})\ (y-y_{0}) + c\ (y-y_{0})^{2}]}$ (1)

... where $A$, $a$, $b$, $c$, $x_{0}$ and $y_{0}$ are constants... Kind regards $\chi$ $\sigma$
 
  • #10
chisigma said:
One hypothesis is that $\displaystyle f_{X}(x|Y=y)$ is normal, so that in any case it can be written as...

$\displaystyle f_{X} (x|Y=y) = A\ e^{[a\ (x-x_{0})^{2} + b\ (x-x_{0})\ (y-y_{0}) + c\ (y-y_{0})^{2}]}$ (1)

... where $A$, $a$, $b$, $c$, $x_{0}$ and $y_{0}$ are constants... Kind regards $\chi$ $\sigma$

But that form is not forced by the conditions in the problem; or rather, it may be, but if so it must be demonstrated (without evidence I don't believe it is forced). The normality of the conditional distribution of X does not by itself entail such a result.

CB
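One way to make this objection concrete: the hypotheses only determine the joint density through

\[ f_{X,Y}(x,y) = f_{X}(x)\ \frac{1}{\sqrt{2\ \pi}\ \sigma_{Y|X=x}}\ e^{-(y-\mu_{Y|X=x})^{2}/(2\ \sigma^{2}_{Y|X=x})}, \]

where $\mu_{Y|X=x}$ and $\sigma_{Y|X=x}$ may be arbitrary functions of $x$; only when the conditional mean is linear in $x$ and the conditional variance is constant does this reduce to the bivariate normal joint that the quadratic-exponent form assumes.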
 
  • #11
CaptainBlack said:
But that form is not forced by the conditions in the problem; or rather, it may be, but if so it must be demonstrated (without evidence I don't believe it is forced). The normality of the conditional distribution of X does not by itself entail such a result.

CB

In the original post one hypothesis establishes that $\displaystyle \varphi(x,y)=f_{X} (x|Y=y)$ is normal in (x,y)... well!... what does it mean for a $\displaystyle \varphi(x,y)$ to be normal in (x,y)?...

Kind regards

$\chi$ $\sigma$
 
  • #12
chisigma said:
In the original post one hypothesis establishes that $\displaystyle \varphi(x,y)=f_{X} (x|Y=y)$ is normal in (x,y)... well!... what does it mean for a $\displaystyle \varphi(x,y)$ to be normal in (x,y)?...

Kind regards

$\chi$ $\sigma$

Already answered in post #8

CB
 
  • #14
chisigma said:
It seems that 'Monster Wolfram' doesn't agree with You...

Bivariate Normal Distribution -- from Wolfram MathWorld

Kind regards

$\chi$ $\sigma$

We don't necessarily have a bivariate normal distribution. That is the mistake you have been making all along: assuming that we do. It would have to be proven.

CB
 
Last edited:
  • #15

Durante degli Alighieri, better known as Dante (1 June 1265 - 13 September 1321), was an Italian Florentine poet. His greatest work, La divina commedia (The Divine Comedy), is considered one of the greatest literary statements produced in medieval Europe and is the basis of the modern Italian language. From the Divine Comedy, Purgatorio, Canto III, line 78: '... ché perder tempo a chi più sa più spiace...' ('... because the more you know, the less you like wasting time...')

I'm sure that all friends of MHB now understand why I have decided to stop this boring and time-wasting discussion... Kind regards $\chi$ $\sigma$
 

  • #16
chisigma said:
I'm sure that all friends of MHB now understand why I have decided to stop this boring and time-wasting discussion... Kind regards $\chi$ $\sigma$

That would be because you can't justify your claim!

CB
 
  • #17
A definitive and [I hope...] fully exhaustive answer to the problem proposed by simon11 can be found in...

http://www.mathhelpboards.com/f19/some-considerations-about-conditional-normal-distribution-2189/

... a thread I opened for this purpose at the request of the Administration. It is fully evident that the level of the discussion exceeds the knowledge required in the 'Basic Probability and Statistics' forum, but of course that is a minor problem compared with giving correct and appropriate answers in any case... Now I suggest closing this discussion with a smile, spending a few words on the 'infamous and insulting' verse...

'... ché perder tempo a chi più sa più spiace...'

This sentence is incised on a tablet on the wall of the porter's lodge of the 'Collegio Ghislieri' of Pavia...

Ghislieri College - Wikipedia, the free encyclopedia

... of which I was a student in the distant years of university. The pictures attached to this post show the tablet and myself beneath it at a meeting of some years ago. The tablet has a long history because over the centuries it has been lost and regained several times in competition with the other colleges of Pavia. In some sense the sentence is in the DNA of the 'Ghislierians'... Kind regards $\chi$ $\sigma$

[Attached photos: the Ghislieri tablet, and the author beneath it]
 

