# Trace of a matrix and an expansion of eigenvectors

1. Sep 16, 2009

### CuppoJava

Hi,
I'm trying to derive the Kullback-Leibler divergence between two multivariate Gaussian distributions, and I need the following property. Is there a simple way to understand this?

Prove that:
Given that E has orthonormal eigenvectors $$u_{i}$$ with eigenvalues $$\lambda_{i}$$,

Then:
$$\operatorname{trace}(AE) = \sum_{i}\lambda_{i}\,u^{T}_{i}A\,u_{i}$$

I'm not quite sure how to start. I suspected that it can be proven by looking at block matrices but I didn't get anywhere with that. Thanks a lot for your help.
-Patrick
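
For what it's worth, the claimed identity is easy to sanity-check numerically. The sketch below (not from the thread; it assumes E is symmetric, so that it really does have a full set of orthonormal eigenvectors) compares both sides on random matrices:

```python
import numpy as np

# Check: trace(A E) = sum_i lambda_i * u_i^T A u_i
# Assumption: E is symmetric, so np.linalg.eigh gives orthonormal eigenvectors.
rng = np.random.default_rng(0)
n = 4

A = rng.standard_normal((n, n))   # A can be any n x n matrix
S = rng.standard_normal((n, n))
E = S + S.T                       # symmetrize to guarantee orthonormal eigenvectors

eigvals, eigvecs = np.linalg.eigh(E)   # columns of eigvecs are the u_i

lhs = np.trace(A @ E)
rhs = sum(lam * (u @ A @ u) for lam, u in zip(eigvals, eigvecs.T))

print(np.isclose(lhs, rhs))
```

Agreement up to floating-point error on random inputs is good evidence the identity is stated correctly before setting out to prove it.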

2. Sep 17, 2009

### tiny-tim

Hi Patrick!

(have a sigma: ∑ and a lambda: λ and try using the X2 tag just above the Reply box )

If the $$u_{i}$$ are orthonormal, then they can be used as a basis, and in that basis $$\operatorname{tr}(AE) = \sum_{i}\lambda_{i}A_{ii}$$

Then transform back to a general basis, and you get the result given.

(there's probably a more direct way of doing it, also! )
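
One such direct route (a sketch, assuming E is symmetric so the $$u_{i}$$ are orthonormal) goes through the spectral decomposition of E:

$$E = \sum_{i}\lambda_{i}\,u_{i}u^{T}_{i}$$

so by linearity of the trace and the cyclic property $$\operatorname{trace}(A\,u_{i}u^{T}_{i}) = u^{T}_{i}A\,u_{i}$$,

$$\operatorname{trace}(AE) = \sum_{i}\lambda_{i}\operatorname{trace}(A\,u_{i}u^{T}_{i}) = \sum_{i}\lambda_{i}\,u^{T}_{i}A\,u_{i}$$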

3. Sep 17, 2009

### CuppoJava

Thanks tiny-tim. It took me a while to figure out the details, but I finally got it. Your advice worked perfectly!

PS: And thanks for the generous greek letters. =)