Question about sample covariance matrix

SUMMARY

The discussion focuses on the derivation of the sample covariance matrix \(\hat{\Sigma}\) for mutually independent random vectors \(X_1, X_2, \ldots, X_N\). The formula used is \(\hat{\Sigma} = \frac{1}{N}\sum_{i=1}^{N} (X_i - \hat{M})(X_i - \hat{M})^T\), where \(\hat{M}\) is the sample mean. The user questions the validity of an equality in a transformation of this formula, particularly the vanishing of certain cross terms and the sign change of \((\hat{M} - M)(\hat{M} - M)^T\). The discussion highlights the importance of understanding linear transformations and expectations in covariance calculations.
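As a concrete illustration, here is a minimal NumPy sketch of the formula above, using synthetic data invented purely for this example:

import numpy as np

# Minimal sketch of the sample covariance formula from the summary:
#   Sigma_hat = (1/N) * sum_i (X_i - Mhat)(X_i - Mhat)^T
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))   # 100 synthetic samples of a 2-dimensional vector
Mhat = X.mean(axis=0)           # sample mean
Sigma_hat = (X - Mhat).T @ (X - Mhat) / len(X)

# np.cov with bias=True applies the same 1/N normalization
assert np.allclose(Sigma_hat, np.cov(X, rowvar=False, bias=True))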

PREREQUISITES
  • Understanding of sample covariance matrices
  • Familiarity with linear transformations in vector spaces
  • Knowledge of statistical expectations and their properties
  • Proficiency in matrix operations and notation
NEXT STEPS
  • Study the properties of covariance matrices in multivariate statistics
  • Learn about linear transformations and their effects on matrix operations
  • Explore the concept of expectations in probability theory
  • Investigate the implications of independence in random variables on covariance
USEFUL FOR

Statisticians, data scientists, and researchers involved in multivariate analysis, particularly those working with covariance matrices and linear algebra in statistical contexts.

sanctifier
Suppose \(X_1, X_2, \ldots, X_N\) are mutually independent random vectors (I mean that each \(X_i\) is a sample of the same random vector \(X\), whose components are random variables indexed by their position in the vector). The sample mean is \(\hat{M} = \frac{1}{N}\sum_{i=1}^{N} X_i\), and the true mean of the \(X_i\) is \(M\). Then to estimate the covariance matrix of the \(X_i\), we employ the following formula:
\[
\begin{aligned}
\hat{\Sigma} &= \frac{1}{N}\sum_{i=1}^{N} (X_i - \hat{M})(X_i - \hat{M})^T \\
&= \frac{1}{N}\sum_{i=1}^{N} \left((X_i - M) - (\hat{M} - M)\right)\left((X_i - M) - (\hat{M} - M)\right)^T \\
&= \frac{1}{N}\sum_{i=1}^{N} (X_i - M)(X_i - M)^T - (\hat{M} - M)(\hat{M} - M)^T
\end{aligned}
\]
My question is: how does the equality hold in the last step?
I did some work on this question. First I note that the transpose is a linear transformation, i.e., for two vectors \(V\) and \(U\), \((V + U)^T = V^T + U^T\), so the following expansion is valid:
\((V - U)(V - U)^T = VV^T - VU^T - UV^T + UU^T\)
Let \(V = (X_i - M)\) and \(U = (\hat{M} - M)\). The terms missing in the last step of \(\hat{\Sigma}\) are \(-VU^T\) and \(-UV^T\). I know the entries of \(E[VU^T]\) are actually covariances of \(X_i\) and \(\hat{M}\), and if I assume these are all zero, then the terms \(-VU^T\) and \(-UV^T\) would indeed disappear upon taking the expectation of \(\hat{\Sigma}\). But in the last step they vanished before any expectation was taken! Why?
Finally, I also notice that the sign of \(UU^T = (\hat{M} - M)(\hat{M} - M)^T\) has changed from + to −. How does this happen?
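To make sure I am not misreading the algebra, here is a minimal NumPy check (the data below are made up purely for illustration) confirming that the equality is exact for any sample, not only in expectation:

import numpy as np

# Check that
#   (1/N) sum_i (X_i - Mhat)(X_i - Mhat)^T
#     == (1/N) sum_i (X_i - M)(X_i - M)^T - (Mhat - M)(Mhat - M)^T
# holds exactly for arbitrary data, i.e. before any expectation is taken.
rng = np.random.default_rng(0)
N, d = 50, 3                       # sample size and vector dimension (arbitrary)
X = rng.normal(size=(N, d))        # rows are the sample vectors X_i
M = np.zeros(d)                    # true mean of the generating distribution
Mhat = X.mean(axis=0)              # sample mean

lhs = sum(np.outer(x - Mhat, x - Mhat) for x in X) / N
rhs = sum(np.outer(x - M, x - M) for x in X) / N - np.outer(Mhat - M, Mhat - M)
print(np.allclose(lhs, rhs))       # prints True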
 
OK, if \(1/N\) can be viewed as an approximate probability attached to each entry of \(VU^T\), that might explain the vanishing of \(-VU^T\) and \(-UV^T\) without taking an expectation, but how to explain the sign change of \(UU^T\) in the last step?
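Writing the expansion out term by term, using the identity \(\frac{1}{N}\sum_{i=1}^{N}(X_i - M) = \hat{M} - M = U\) (note that \(U\) is the same for every term of the sum), accounts for both effects at once:
\[
\begin{aligned}
\frac{1}{N}\sum_{i=1}^{N}(X_i-\hat{M})(X_i-\hat{M})^T
&= \frac{1}{N}\sum_{i=1}^{N}\left[(X_i-M)(X_i-M)^T - (X_i-M)U^T - U(X_i-M)^T + UU^T\right] \\
&= \frac{1}{N}\sum_{i=1}^{N}(X_i-M)(X_i-M)^T - UU^T - UU^T + UU^T \\
&= \frac{1}{N}\sum_{i=1}^{N}(X_i-M)(X_i-M)^T - UU^T.
\end{aligned}
\]
So the cross terms do not vanish individually: each of them averages to \(UU^T\), and together with the averaged \(+UU^T\) term the net contribution is \(-2UU^T + UU^T = -UU^T\). That net \(-UU^T\) is exactly the sign change, and it arises by pure algebra, before any expectation is taken.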
 
