MHB Are components in a Gaussian mixture independent?

Summary
The discussion revolves around the use of the Expectation Maximization algorithm to approximate a probability density function (pdf) using a mixture of Gaussians. The user seeks to square the pdf but is unsure if the components are independent, which would allow for simplification. They reference the condition for independence in joint distributions and express uncertainty about determining the joint distribution for their normal components. A response suggests testing for independence between continuous variables and provides a link to useful resources. The conversation highlights the challenge of proving independence in Gaussian mixtures.
ConfusedCat
Hello all,

I have used the Expectation-Maximization (EM) algorithm to approximate a probability density function (pdf) with a mixture of Gaussians.

I need to square the pdf of the mixture (a weighted sum of normal densities with non-identical parameters). Rather than square the sum of Gaussians directly, I thought that if the constituent normal pdfs of the mixture were independent, they could be summed into a single Gaussian, which I could then square.

I don't know how to prove (or disprove) the independence of the components. For a joint distribution f(x,y), independence holds exactly when f(x,y) = f(x)f(y). Here, I have a series of real values for each of the normal components in the mixture. How do I find the joint distribution?

Any thoughts would be welcome.

Cheers
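[Editor's note: the square of a mixture pdf can also be expanded term by term without any independence assumption, because the product of two Gaussian pdfs is itself a scaled Gaussian pdf. A minimal numerical sketch of that identity, with hypothetical two-component parameters chosen purely for illustration:]

```python
import numpy as np
from scipy.stats import norm

# Hypothetical two-component mixture (parameters are illustrative, not from the thread)
w  = np.array([0.4, 0.6])     # mixture weights
mu = np.array([-1.0, 2.0])    # component means
sd = np.array([0.8, 1.5])     # component standard deviations

def mix_pdf(x):
    """Mixture pdf p(x) = sum_i w_i N(x; mu_i, sd_i)."""
    return sum(wi * norm.pdf(x, mi, si) for wi, mi, si in zip(w, mu, sd))

def mix_pdf_squared(x):
    """p(x)^2 expanded as sum_{i,j} w_i w_j N_i(x) N_j(x), using the identity
    that the product of two Gaussian pdfs is a scaled Gaussian pdf."""
    total = 0.0
    for i in range(len(w)):
        for j in range(len(w)):
            v1, v2 = sd[i] ** 2, sd[j] ** 2
            v = 1.0 / (1.0 / v1 + 1.0 / v2)           # variance of the product Gaussian
            m = v * (mu[i] / v1 + mu[j] / v2)         # mean of the product Gaussian
            scale = norm.pdf(mu[i], mu[j], np.sqrt(v1 + v2))  # scale factor
            total += w[i] * w[j] * scale * norm.pdf(x, m, np.sqrt(v))
    return total

# The expansion matches the direct square at arbitrary evaluation points.
for x in (-2.0, 0.0, 0.5, 3.0):
    assert np.isclose(mix_pdf(x) ** 2, mix_pdf_squared(x))
```

The squared mixture is therefore again a weighted sum of Gaussian pdfs (with cross-term weights), so no single-Gaussian reduction is required.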
 
I like Serena

Hi ConfusedCat! Welcome to MHB! ;)

We would need to test for independence of two continuous variables.
For instance, this link gives a couple of possible approaches.
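[Editor's note: the linked approaches are not reproduced in the thread. As one generic sketch of testing two continuous samples for independence, here is a chi-square test of independence on binned data, using synthetic data and an illustrative bin count:]

```python
import numpy as np
from scipy.stats import chi2_contingency

# Synthetic example only -- the thread's actual data are not shown.
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = 0.5 * x + rng.normal(size=2000)   # y depends on x by construction

# Bin both samples and run a chi-square test of independence on the 2-D histogram.
counts, _, _ = np.histogram2d(x, y, bins=5)
chi2, p, dof, expected = chi2_contingency(counts)
# A small p-value is evidence against independence of x and y.
```

The bin count is a tuning choice: too few bins can hide structure, and too many leave bins with very low expected counts, which weakens the chi-square approximation.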
 
ConfusedCat

Thank you for that link - it does look very useful.
 