MHB Are components in a Gaussian mixture independent?

Summary: The discussion revolves around the use of the Expectation-Maximization (EM) algorithm to approximate a probability density function (pdf) with a mixture of Gaussians. The poster wants to square the pdf but is unsure whether the components are independent, which would allow a simplification. They cite the factorization condition for independence of joint distributions and are uncertain how to determine the joint distribution of their normal components. A reply suggests testing for independence between two continuous variables and links to useful resources. The exchange highlights the challenge of proving independence in Gaussian mixtures.
ConfusedCat
Hello all,

I have used the Expectation-Maximization (EM) algorithm to approximate a probability density function (pdf) with a mixture of Gaussians.

I need to square the pdf of this mixture (a weighted sum of normal densities with non-identical parameters). Rather than square the sum of Gaussians directly, I thought that if the constituent normal pdfs of the mixture were independent, they could be summed into a single Gaussian, which I could then square.

I don't know how to prove (or disprove) the independence of the components. For a joint distribution f(x,y), independence holds if f(x,y) = f(x)f(y). Here, I have a series of real values for each of the normal components of the mixture. How do I find the joint distribution?

Any thoughts would be welcome.

Cheers
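For reference, the square of a mixture density expands into pairwise products of Gaussian densities, and the product of two Gaussian pdfs is itself a scaled Gaussian pdf (a standard identity, not something specific to this thread):

$$\left(\sum_{i=1}^{K} w_i\,\mathcal{N}(x;\mu_i,\sigma_i^2)\right)^{2} = \sum_{i=1}^{K}\sum_{j=1}^{K} w_i w_j\,\mathcal{N}(x;\mu_i,\sigma_i^2)\,\mathcal{N}(x;\mu_j,\sigma_j^2),$$

where each pairwise product is again a scaled Gaussian density:

$$\mathcal{N}(x;\mu_i,\sigma_i^2)\,\mathcal{N}(x;\mu_j,\sigma_j^2) = \mathcal{N}\!\big(\mu_i;\mu_j,\sigma_i^2+\sigma_j^2\big)\;\mathcal{N}\!\left(x;\;\frac{\sigma_j^2\mu_i+\sigma_i^2\mu_j}{\sigma_i^2+\sigma_j^2},\;\frac{\sigma_i^2\sigma_j^2}{\sigma_i^2+\sigma_j^2}\right).$$

A minimal numerical sketch of this expansion in Python; the mixture parameters below are hypothetical placeholders standing in for whatever an EM fit returns:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical mixture parameters (weights, means, standard deviations),
# standing in for the output of an EM fit.
w = np.array([0.3, 0.7])
mu = np.array([-1.0, 2.0])
sd = np.array([0.8, 1.5])

def mixture_pdf(x):
    """Weighted sum of Gaussian densities."""
    return sum(wi * norm.pdf(x, mi, si) for wi, mi, si in zip(w, mu, sd))

def squared_mixture_pdf(x):
    """Square of the mixture pdf via the pairwise Gaussian product identity."""
    total = np.zeros_like(x)
    for i in range(len(w)):
        for j in range(len(w)):
            var_sum = sd[i] ** 2 + sd[j] ** 2
            # Scale factor N(mu_i; mu_j, sigma_i^2 + sigma_j^2)
            scale = norm.pdf(mu[i], mu[j], np.sqrt(var_sum))
            var_star = sd[i] ** 2 * sd[j] ** 2 / var_sum
            mu_star = (sd[j] ** 2 * mu[i] + sd[i] ** 2 * mu[j]) / var_sum
            total += w[i] * w[j] * scale * norm.pdf(x, mu_star, np.sqrt(var_star))
    return total

x = np.linspace(-5.0, 8.0, 1001)
# The closed-form expansion matches squaring the pdf directly.
assert np.allclose(squared_mixture_pdf(x), mixture_pdf(x) ** 2)
```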
 
I like Serena said:
Hi ConfusedCat! Welcome to MHB! ;)

We would need to test for independence of two continuous variables.
For instance, this link gives a couple of possible approaches.
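Since the linked approaches are not reproduced in the thread, here is one simple option, sketched under the assumption that SciPy is available: Spearman's rank correlation, which tests for monotonic dependence between two continuous samples (it will not detect every form of dependence):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical data: replace with the two series of real values
# associated with two mixture components.
x = rng.normal(loc=0.0, scale=1.0, size=500)
y = rng.normal(loc=2.0, scale=0.5, size=500)

# Spearman's rank correlation tests for monotonic dependence.
rho, p_value = stats.spearmanr(x, y)
print(f"Spearman rho = {rho:.3f}, p-value = {p_value:.3g}")

# A small p-value is evidence of (monotonic) dependence; a large
# p-value fails to reject independence but does not prove it.
```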
 
ConfusedCat said:
Thank you for that link - it does look very useful.
 
