Hello, all.

I'm wondering about matrix-variate normal distributions. I know they are usually defined for an n x p random matrix X with an n x n row covariance matrix Omega and a p x p column covariance matrix Sigma, but I'm wondering how the probability density function changes if X is constrained to be a square, symmetric matrix that still has covariances among its rows and columns.
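For concreteness, here is the standard (unconstrained) density I have in mind, sketched as a small NumPy function. This is just a direct transcription of the usual trace-form PDF; the helper name `matrix_normal_pdf` and the argument order are my own choices.

```python
import numpy as np

def matrix_normal_pdf(X, M, Omega, Sigma):
    """Density of the matrix-variate normal MN(M, Omega, Sigma).

    X and M are n x p; Omega (n x n) is the row covariance and
    Sigma (p x p) is the column covariance.
    """
    n, p = X.shape
    D = X - M
    # Quadratic form: tr(Sigma^{-1} D^T Omega^{-1} D)
    quad = np.trace(np.linalg.solve(Sigma, D.T) @ np.linalg.solve(Omega, D))
    # Normalizing constant: (2*pi)^(np/2) |Sigma|^(n/2) |Omega|^(p/2)
    norm = ((2 * np.pi) ** (n * p / 2)
            * np.linalg.det(Sigma) ** (n / 2)
            * np.linalg.det(Omega) ** (p / 2))
    return np.exp(-0.5 * quad) / norm
```

The determinant exponents n/2 and p/2 in the denominator are the ones the later question about "multiplying the determinant by itself" refers to.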

Can it be shown that even given these covariances, the matrix can be vectorized without loss of information?
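As a sanity check on the vectorization question: for the unconstrained matrix normal, the standard identity is vec(X) ~ N(vec(M), Sigma ⊗ Omega), with vec stacking columns, so vectorizing loses nothing. A quick numerical confirmation (the square case n = p, with randomly generated positive-definite covariances; all variable names are my own). Note that if X is additionally constrained to be symmetric, vec(X) repeats each off-diagonal entry, so one would typically keep only the n(n+1)/2 distinct entries (the half-vectorization vech) instead.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3  # square case: rows and columns have the same dimension
A = rng.standard_normal((n, n))
Omega = A @ A.T + n * np.eye(n)   # row covariance (made positive definite)
B = rng.standard_normal((n, n))
Sigma = B @ B.T + n * np.eye(n)   # column covariance
M = rng.standard_normal((n, n))
X = rng.standard_normal((n, n))

# vec() stacks columns, i.e. Fortran-order flattening
vec = lambda Z: Z.flatten(order="F")

# Matrix-normal density via the trace form
D = X - M
quad_mat = np.trace(np.linalg.solve(Sigma, D.T) @ np.linalg.solve(Omega, D))
const_mat = ((2 * np.pi) ** (n * n / 2)
             * np.linalg.det(Sigma) ** (n / 2)
             * np.linalg.det(Omega) ** (n / 2))
pdf_mat = np.exp(-0.5 * quad_mat) / const_mat

# Same density via vec(X) ~ N(vec(M), Sigma kron Omega)
K = np.kron(Sigma, Omega)
d = vec(D)
pdf_vec = (np.exp(-0.5 * d @ np.linalg.solve(K, d))
           / np.sqrt((2 * np.pi) ** (n * n) * np.linalg.det(K)))
```

The two computed densities agree to floating-point precision, which is exactly the "no information lost by vectorizing" statement for the unconstrained case.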

Is it the case, for this constrained example, that the row and column covariance matrices are equal, so that the determinant in the normalizing constant outside the exponential is effectively multiplied by itself?
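If one does assume Omega = Sigma (which is an extra assumption on top of symmetry of X, and the two matrices are in any case only identified up to a reciprocal scalar), the determinant factor |Omega|^(p/2) |Sigma|^(n/2) does collapse to |Sigma|^n, i.e. the same determinant appears squared. A numerical check of the underlying Kronecker determinant identity |S ⊗ S| = |S|^(2n):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)  # one shared covariance for rows and columns

# Kronecker determinant identity: |S (x) T| = |S|^n * |T|^n for n x n S, T.
# With equal row and column covariances, the density's determinant factor
# |Omega|^(n/2) * |Sigma|^(n/2) becomes |Sigma|^n -- the determinant squared
# relative to a single |Sigma|^(n/2) factor.
det_kron = np.linalg.det(np.kron(Sigma, Sigma))
det_squared = np.linalg.det(Sigma) ** (2 * n)
```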

I'm interested in maximum likelihood estimation under this simplified example.
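For reference, the usual route to the matrix-normal MLE is an alternating ("flip-flop") scheme: with Omega fixed, Sigma has a closed-form update, and vice versa. A minimal sketch for the unconstrained case (the function name, the fixed iteration count, and the trace normalization used to pin down the scale indeterminacy are my own choices; a symmetry-constrained version would need further work):

```python
import numpy as np

def flip_flop_mle(Xs, iters=50):
    """Alternating MLE for MN(M, Omega, Sigma) from samples Xs of shape (N, n, p).

    Omega and Sigma are only identified up to a reciprocal scalar,
    so Omega is rescaled to have trace n after every update.
    """
    N, n, p = Xs.shape
    M = Xs.mean(axis=0)          # closed-form MLE of the mean
    D = Xs - M
    Omega = np.eye(n)
    Sigma = np.eye(p)
    for _ in range(iters):
        Oinv = np.linalg.inv(Omega)
        # Sigma-update: (1 / (N n)) * sum_i D_i^T Omega^{-1} D_i
        Sigma = np.einsum("ika,kl,ilb->ab", D, Oinv, D) / (N * n)
        Sinv = np.linalg.inv(Sigma)
        # Omega-update: (1 / (N p)) * sum_i D_i Sigma^{-1} D_i^T
        Omega = np.einsum("iak,kl,ibl->ab", D, Sinv, D) / (N * p)
        Omega *= n / np.trace(Omega)  # fix the scale indeterminacy
    return M, Omega, Sigma
```

On data with i.i.d. standard-normal entries (so both true covariances are the identity), the estimates come back close to the identity, which is a useful smoke test of the updates.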

Thanks!

**Physics Forums | Science Articles, Homework Help, Discussion**


# Constraints on matrix-variate normal distributions
