Trace of a matrix and an expansion of eigenvectors

CuppoJava
Hi,
I'm trying to derive the Kullback-Leibler divergence between two multivariate Gaussian distributions, and I need the following property. Is there a simple way to understand it?

Prove that:
Given that E has orthonormal eigenvectors u_{i} with corresponding eigenvalues \lambda_{i},

Then:
trace(A E) = \sum_{i} \lambda_{i} \, u_{i}^{T} A \, u_{i}
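Before proving it, the identity is easy to sanity-check numerically. The sketch below (my own check, not from the thread; it uses NumPy and builds a symmetric E so that its eigenvectors are guaranteed orthonormal) compares both sides on a random example:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))

# Build a symmetric E so its eigenvectors form an orthonormal set
M = rng.standard_normal((n, n))
E = M + M.T

# eigh returns eigenvalues lam[i] and orthonormal eigenvectors U[:, i]
lam, U = np.linalg.eigh(E)

lhs = np.trace(A @ E)
rhs = sum(lam[i] * U[:, i] @ A @ U[:, i] for i in range(n))

print(np.isclose(lhs, rhs))  # True
```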

I'm not quite sure how to start. I suspected it could be proven by looking at block matrices, but I didn't get anywhere with that. Thanks a lot for your help.
-Patrick
 
Hi Patrick! :smile:

(have a sigma: ∑ and a lambda: λ and try using the X2 tag just above the Reply box :wink:)

If the u_i are orthonormal, then they can be used as a basis, and in that basis E is diagonal, so tr(AE) = ∑_i λ_i A_ii

Then transform back to a general basis, and you get the result given.

(there's probably a more direct way of doing it, also! :wink:)
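(One more direct way, for the record: since the u_i are orthonormal eigenvectors, E has the spectral decomposition E = ∑_i λ_i u_i u_i^T, and the result falls out of linearity of the trace:)

```latex
E = \sum_i \lambda_i \, u_i u_i^T
\quad\Longrightarrow\quad
\operatorname{tr}(AE)
= \sum_i \lambda_i \operatorname{tr}\!\bigl(A \, u_i u_i^T\bigr)
= \sum_i \lambda_i \, u_i^T A \, u_i,
```

where the last step uses the cyclic property of the trace, \operatorname{tr}(x y^T) = y^T x.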
 
Thanks Tiny_Tim. It took me a while to figure out the details, but I finally got it. Your advice worked perfectly!

PS: And thanks for the generous Greek letters. =)
 