On the definition of Von Neumann entropy

1. Jul 12, 2009

univector

I am confused by the definition of the Von Neumann entropy. In Nielsen and Chuang's book, page 510, the Von Neumann entropy is defined as
$$S (\rho) = - tr(\rho \log \rho)$$
where $$\rho$$ is the density matrix. What is the definition of the logarithm of a matrix? Is it some series expansion of the matrix, or an element-by-element logarithm?

Thanks.

2. Jul 12, 2009

Fredrik

Staff Emeritus
Note that

$$\frac{d}{dx}\log(1+x)=\frac{1}{1+x}=1-x+x^2-x^3+\cdots$$

Integrate.

$$\log(1+x)=x-\frac{x^2}{2}+\frac{x^3}{3}-\frac{x^4}{4}+\cdots$$

Now set y=1+x.

$$\log y=(y-1)-\frac{(y-1)^2}{2}+\cdots=\sum_{k=1}^\infty(-1)^{k+1}\frac{(y-1)^k}{k}$$

This suggests that if A is a matrix, we can define log A by

$$\log A=\sum_{k=1}^\infty(-1)^{k+1}\frac{(A-I)^k}{k}$$

for all matrices A such that the series converges (roughly, when the eigenvalues of A − I are small enough in magnitude).

Last edited: Jul 12, 2009
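As a quick sanity check (not from the thread itself), the series definition above can be compared numerically against the eigendecomposition route for a small symmetric matrix whose eigenvalues sit close to 1, so the series converges. The matrix `A` and the helper names here are illustrative choices, not anything from the original posts.

```python
import numpy as np

def log_series(A, terms=200):
    """log A = sum_{k>=1} (-1)^{k+1} (A - I)^k / k, valid when the
    spectral radius of A - I is below 1."""
    I = np.eye(A.shape[0])
    X = A - I
    total = np.zeros_like(A, dtype=float)
    power = I.astype(float)
    for k in range(1, terms + 1):
        power = power @ X                       # now holds (A - I)^k
        total += ((-1) ** (k + 1) / k) * power
    return total

def log_eig(A):
    """For a symmetric A with positive eigenvalues:
    log A = U diag(log w_i) U^T, with A = U diag(w_i) U^T."""
    w, U = np.linalg.eigh(A)
    return U @ np.diag(np.log(w)) @ U.T

# Eigenvalues of A are roughly 1 +/- 0.11, so the series converges quickly.
A = np.array([[1.10, 0.05],
              [0.05, 0.90]])
print(np.allclose(log_series(A), log_eig(A)))
```

The two routes agree, which is exactly the point of the next post: for a diagonalizable matrix, the series applied to the whole matrix reduces to the scalar series applied to each eigenvalue.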
3. Jul 13, 2009

univector

Hi Fredrik,

Thanks for the explanation. If the matrix $\rho$ is not diagonal, summing the matrix power series directly is not trivial. Instead, if we first apply a similarity transformation to diagonalize $\rho$, things become easier. Let the diagonal matrix be $D$. Then, applying the series expansion forward (for the matrix) and backward (for each scalar eigenvalue), we arrive at
$$S(\rho) = -tr (D M),$$
where
$$M = \left( \begin{array}{cccc} \log \lambda_1 & 0 & \cdots & 0 \\ 0 & \log \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \log \lambda_n \end{array} \right)$$
where $\lambda_i$ is the $i$th eigenvalue of the matrix $\rho$ (or $D$), and $n$ is the number of rows (and columns) of $\rho$.

Is this the way people calculate $S(\rho)$?
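This diagonalize-then-take-scalar-logs recipe can be sketched in a few lines of Python; the function name and the eigenvalue cutoff are illustrative assumptions, not from the thread. Note the log base is a convention: the natural log is used below, while quantum information texts such as Nielsen and Chuang typically use base 2.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -tr(rho log rho) = -sum_i lambda_i log lambda_i,
    computed by diagonalizing rho. Zero eigenvalues contribute
    nothing, since x log x -> 0 as x -> 0."""
    w = np.linalg.eigvalsh(rho)   # eigenvalues of a Hermitian rho
    w = w[w > 1e-12]              # drop numerically-zero eigenvalues
    return float(-np.sum(w * np.log(w)))

# Maximally mixed qubit: eigenvalues 1/2, 1/2, so S = log 2.
rho_mixed = np.eye(2) / 2
print(von_neumann_entropy(rho_mixed))

# Pure state: one eigenvalue 1, the rest 0, so S = 0.
rho_pure = np.array([[1.0, 0.0], [0.0, 0.0]])
print(von_neumann_entropy(rho_pure))
```

Diagonalizing first is indeed the practical route: it replaces the matrix series with ordinary scalar logarithms of the eigenvalues.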

4. Jul 13, 2009

Yes.