Are components in a Gaussian mixture independent?

SUMMARY

The discussion centers on the independence of components in a Gaussian mixture model when using the Expectation Maximization (EM) algorithm to approximate a probability density function (pdf). The user seeks to determine if the normal distributions within the mixture can be treated as independent to simplify the squaring of the pdf. The independence can be assessed using joint distribution properties, specifically verifying if f(x,y) equals f(x)f(y). The conversation highlights the need for methods to test the independence of continuous variables.

PREREQUISITES
  • Expectation Maximization algorithm
  • Gaussian mixture models
  • Joint distribution concepts
  • Statistical independence testing for continuous variables
NEXT STEPS
  • Research methods for testing independence in continuous variables
  • Explore joint distribution calculations for Gaussian mixtures
  • Learn about the properties of Gaussian distributions
  • Investigate statistical techniques for approximating probability density functions
USEFUL FOR

Data scientists, statisticians, and machine learning practitioners interested in Gaussian mixture models and statistical independence testing will benefit from this discussion.

ConfusedCat
Hello all,

I have used the Expectation Maximization (EM) algorithm to approximate a probability density function (pdf) with a mixture of Gaussians.

I need to square the pdf corresponding to the mixture of Gaussians (it is a weighted sum of normal distributions with non-identical parameters). Rather than square a sum of Gaussians, I thought that if the constituent normal pdfs of the mixture were independent, they could be summed into a single Gaussian, which I could then square.

I don't know how to prove (or disprove) the independence of the components. For a regular joint distribution f(x,y), independence can be proved by checking whether f(x,y) = f(x)f(y). Here, I have a series of real values for each of the normal components in the mixture. How do I find the joint distribution?

Any thoughts would be welcome.

Cheers
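As a side note for readers: the square of a mixture pdf can also be computed directly by expanding the double sum, since the product of two Gaussian pdfs is itself a scaled Gaussian pdf. A minimal numerical sketch of that identity follows; the component weights, means, and standard deviations below are illustrative and not taken from the thread.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical mixture parameters (weights, means, std devs) -- not from the thread.
w = np.array([0.3, 0.5, 0.2])
mu = np.array([-1.0, 0.0, 2.0])
sd = np.array([0.5, 1.0, 0.8])

x = np.linspace(-5, 6, 1001)

# Mixture pdf and its square, computed directly.
pdf = sum(wi * norm.pdf(x, mi, si) for wi, mi, si in zip(w, mu, sd))
direct_sq = pdf ** 2

# Expansion: (sum_i w_i phi_i)^2 = sum_{i,j} w_i w_j phi_i(x) phi_j(x),
# where phi_i(x) * phi_j(x) is a *scaled* Gaussian pdf with
#   scale   S_ij  = N(mu_i; mu_j, sd_i^2 + sd_j^2)
#   var_ij        = 1 / (1/sd_i^2 + 1/sd_j^2)
#   mean_ij       = var_ij * (mu_i/sd_i^2 + mu_j/sd_j^2)
expanded = np.zeros_like(x)
for i in range(len(w)):
    for j in range(len(w)):
        var_ij = 1.0 / (1.0 / sd[i] ** 2 + 1.0 / sd[j] ** 2)
        mean_ij = var_ij * (mu[i] / sd[i] ** 2 + mu[j] / sd[j] ** 2)
        scale_ij = norm.pdf(mu[i], mu[j], np.sqrt(sd[i] ** 2 + sd[j] ** 2))
        expanded += w[i] * w[j] * scale_ij * norm.pdf(x, mean_ij, np.sqrt(var_ij))

# The term-by-term expansion matches the direct square on the grid.
assert np.allclose(direct_sq, expanded)
```

No independence assumption is needed for this route: the expansion is purely algebraic, valid for any weighted sum of Gaussian densities.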
 
ConfusedCat said:
I don't know how to prove (or disprove) the independence of the components. [...] How do I find the joint distribution?

Hi ConfusedCat! Welcome to MHB! ;)

We would need to test for the independence of two continuous variables.
For instance, this link gives a couple of possible approaches.
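One common approach along these lines is to discretize both samples into quantile bins and run a chi-square test of independence on the resulting contingency table. A hedged sketch with synthetic data; the variable names, sample size, and bin count are illustrative choices, not from the linked page.

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)

# Synthetic stand-ins for two observed real-valued series.
x = rng.normal(size=5000)
y_indep = rng.normal(size=5000)            # generated independently of x
y_dep = x + 0.5 * rng.normal(size=5000)    # clearly dependent on x

def binned_chi2_pvalue(a, b, bins=10):
    """Chi-square test of independence on a binned 2-D histogram.

    Uses quantile bin edges so every row and column of the
    contingency table has roughly equal counts, keeping the
    expected cell counts large enough for the chi-square test.
    """
    edges_a = np.quantile(a, np.linspace(0, 1, bins + 1))
    edges_b = np.quantile(b, np.linspace(0, 1, bins + 1))
    table, _, _ = np.histogram2d(a, b, bins=[edges_a, edges_b])
    return chi2_contingency(table)[1]  # the p-value

p_indep = binned_chi2_pvalue(x, y_indep)
p_dep = binned_chi2_pvalue(x, y_dep)

# A small p-value is evidence against independence.
assert p_dep < p_indep
```

Note that this test only detects dependence that survives binning; subtle dependence may need finer bins or a different statistic (e.g. distance correlation).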
 
I like Serena said:
For instance this link gives a couple of possible approaches.

Thank you for that link - it does look very useful.
 
