# Function of operators. Series, matrices.

1. May 26, 2014

### LagrangeEuler

Is it possible to write the series $\ln x=\sum_n a_n x^n$? I am asking because this would be a Taylor series around zero, and $\ln 0$ is not defined.

And if $A$ is a matrix, is it possible to write
$\ln A=\sum_n a_n A^n$? Thanks for the answer!

2. May 26, 2014

### mathman

The answer to the first question is no, which forces the answer to the second question to be no as well.

If you want a series for $\ln$, you can expand $\ln(1+x)$, $\ln(1-x)$, or $\ln((1+x)/(1-x))$ instead.
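
One reason the $\ln((1+x)/(1-x))$ form is convenient is that it reaches arguments like 2 with a small, fast-converging $x$. Here is a minimal numerical sketch of both series (the function names are my own, not from the thread):

```python
import math

def log1p_series(x, terms=60):
    # ln(1+x) = sum_{n>=1} (-1)^(n+1) x^n / n, valid for |x| < 1
    return sum((-1) ** (n + 1) * x ** n / n for n in range(1, terms + 1))

def atanh_series(x, terms=60):
    # ln((1+x)/(1-x)) = 2 * sum_{k>=0} x^(2k+1) / (2k+1), valid for |x| < 1
    return 2 * sum(x ** (2 * k + 1) / (2 * k + 1) for k in range(terms))

# To get ln(2), the second form only needs x = 1/3,
# since (1 + 1/3)/(1 - 1/3) = 2.
print(log1p_series(0.5), math.log(1.5))
print(atanh_series(1 / 3), math.log(2))
```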

3. May 26, 2014

### LagrangeEuler

Ok, thanks. But it is ok to write
$\ln x=\sum_n a_n(x-1)^n$,
right?
Or
$\ln A=\sum_n a_n(A-1)^n$?

4. May 26, 2014

### HallsofIvy

Yes, of course. In fact, it is well known (Taylor's series) that
$$\ln(x)= \sum_{n=1}^\infty \frac{(-1)^{n+1}}{n}(x-1)^n,$$
which converges for $0 < x \le 2$.
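
As a quick numerical check of this expansion about $x=1$ (a small Python sketch; the function name is mine):

```python
import math

def log_series_about_1(x, terms=200):
    # ln(x) = sum_{n>=1} (-1)^(n+1) (x-1)^n / n, converges for 0 < x <= 2
    return sum((-1) ** (n + 1) * (x - 1) ** n / n for n in range(1, terms + 1))

# Compare the truncated series with math.log inside the interval of convergence.
for x in (0.5, 1.0, 1.5):
    print(x, log_series_about_1(x), math.log(x))
```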

5. May 27, 2014

### LagrangeEuler

Ok, thanks. Is this true for operators as well?

6. May 27, 2014

### micromass

Yes, but not for all operators, only those within the radius of convergence. Since the series is in powers of $A-1$, here it converges for all operators with $\|A-I\|<1$.

For a proof, see *The Structure and Geometry of Lie Groups* by Hilgert and Neeb, Chapter 3, although most functional analysis texts cover this as well.
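
For matrices close enough to the identity for the series to converge, the same expansion can be applied term by term. A minimal sketch assuming NumPy is available (the helper name is hypothetical; a diagonal matrix is used so the exact answer is known in closed form):

```python
import numpy as np

def logm_series(A, terms=100):
    # ln(A) = sum_{n>=1} (-1)^(n+1) (A - I)^n / n
    n_dim = A.shape[0]
    X = A - np.eye(n_dim)            # series variable: A - I
    term = np.eye(n_dim)             # running power of X
    result = np.zeros_like(A, dtype=float)
    for n in range(1, terms + 1):
        term = term @ X
        result += (-1) ** (n + 1) * term / n
    return result

# For a diagonal matrix, ln acts entrywise on the diagonal.
A = np.diag([0.6, 1.3])
print(logm_series(A))
print(np.diag(np.log([0.6, 1.3])))
```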

7. May 28, 2014

### LagrangeEuler

Thanks a lot. My problem is to calculate $\mathrm{Tr}(A\ln A)$, where $A$ is a matrix. That is why I thought of writing $\ln A$ as a power series, if that is possible. But I cannot do it in such an easy way here, right?

8. May 28, 2014

### micromass

If $\|A-I\|<1$, then you can do it this way. Otherwise, you will need to tell us what your definition of $\ln A$ is.

9. May 28, 2014

### DeIdeal

Is $A$ by any chance a density matrix (i.e., are you calculating the von Neumann entropy)? If so, its properties guarantee that you can use its spectral decomposition to write the trace as $\mathrm{Tr}(\rho \ln \rho) = \sum_i \lambda_i \ln\lambda_i$ for the eigenvalues $\lambda_i$ of the density matrix, which is often a useful form.
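
If $A$ really is a density matrix, the trace can be computed from the eigenvalues alone. A small sketch assuming NumPy, with a made-up $2\times 2$ density matrix (Hermitian, positive, unit trace):

```python
import numpy as np

# Von Neumann entropy S = -Tr(rho ln rho) = -sum_i lambda_i ln(lambda_i).
rho = np.array([[0.7, 0.1],
                [0.1, 0.3]])           # example density matrix (assumption)
eigvals = np.linalg.eigvalsh(rho)      # eigenvalues of a Hermitian matrix
S = -sum(l * np.log(l) for l in eigvals if l > 0)   # 0*ln(0) -> 0 by convention
print(S)
```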

10. May 28, 2014

### LagrangeEuler

When can I use the spectral decomposition? Is it valid only for Hermitian matrices, or not? Also, what happens in the case of a degenerate spectrum?

11. May 28, 2014

### DeIdeal

It exists for all normal operators, so in particular Hermitian matrices are fine. For a degenerate spectrum, you can use the general spectral decomposition $A=\sum_{i=1}^{n}\sum_{d=1}^{D(i)} \lambda_i |\lambda_i,d\rangle\langle\lambda_i,d|$, where $D(i)$ is the degeneracy of the eigenvalue $\lambda_i$ and the vectors $|\lambda_i,d\rangle$ form an orthonormal basis of the corresponding $D(i)$-dimensional eigenspace.

Note that the other properties of a density matrix keep the logarithm under control: the operator norm of a bounded normal operator equals its spectral radius, and for a density operator the eigenvalues are non-negative and satisfy $\sum_i \lambda_i = 1$, so they all lie in $[0,1]$. The norm condition for the series is therefore satisfied as long as every eigenvalue is strictly positive. If some eigenvalue vanishes (as for a pure state, where one eigenvalue equals one and the rest are zero), those terms contribute nothing to the trace anyway, since $\lambda\ln\lambda \to 0$ as $\lambda \to 0$; for a pure state the von Neumann entropy vanishes.
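
The degenerate case can be checked numerically: summing $\lambda\,|v\rangle\langle v|$ over an orthonormal eigenbasis recovers the operator even when eigenvalues repeat. A sketch assuming NumPy, with a made-up matrix whose eigenvalue 2 is doubly degenerate:

```python
import numpy as np

A = np.diag([2.0, 2.0, 5.0])            # eigenvalue 2 is doubly degenerate
eigvals, eigvecs = np.linalg.eigh(A)    # columns of eigvecs are orthonormal
reconstructed = sum(
    lam * np.outer(v, v)                # lam * |v><v| over the whole basis
    for lam, v in zip(eigvals, eigvecs.T)
)
print(np.allclose(reconstructed, A))    # prints True
```

Within each degenerate eigenspace the basis vectors are not unique, but the projector onto the eigenspace (and hence the sum) is.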

12. May 28, 2014

### LagrangeEuler

In the case of degeneracy I cannot find the eigenvectors in a unique way, so I think I cannot reconstruct the operator uniquely. For example, consider the case in $\mathbb{R}^3$ with $\lambda_1=\lambda_2=\lambda_3=1$ and eigenvectors
$x_1^T=(1 \quad 0 \quad 0)$, $x_2^T=(0 \quad 0 \quad 0)$, $x_3^T=(0 \quad 0 \quad 0)$. How do I find the matrix in this case?

13. May 28, 2014

### DeIdeal

Eigenvectors cannot be zero vectors.

(In addition, if we're still considering density matrices, that's not going to be a valid density matrix since the sum of eigenvalues is greater than 1.)

14. May 28, 2014

### WWGD

15. Jun 3, 2014

### HallsofIvy

Yes, most texts specify that eigenvectors must be non-zero. But that leaves us having to say "the set of all eigenvectors of a linear transformation, together with the zero vector, forms a subspace" and "the set of all eigenvectors corresponding to a given eigenvalue, together with the zero vector, forms a subspace."

A few texts say "$\lambda$ is an eigenvalue of linear transformation A if and only if there is a non-zero vector, v, such that $Av= \lambda v$" and then "v is an eigenvector of A, corresponding to eigenvalue $\lambda$, if and only if $Av= \lambda v$" which does NOT require that an eigenvector be non-zero. That allows us to say simply "the set of all eigenvectors of a linear transformation form a subspace" without having to add the zero vector. A small point but I see no reason to refuse to allow the zero vector as an eigenvector.

16. Jun 5, 2014

### DeIdeal

Really? Mind giving a reference? I'm not really skeptical, just curious, since I have never seen that convention used in a text (either in linear algebra or in functional analysis), and this is the second time someone has commented about it on a post I made. I guess the main difference is that you give up the uniqueness of the eigenvalue of a given eigenvector, although I wouldn't be surprised if more exceptions arise.

Last edited: Jun 5, 2014
17. Jun 5, 2014

### micromass

I have never seen a math text that allows zero as an eigenvector. I would like to see some references for this too.

18. Jun 5, 2014

### pasmith

In my usage, an eigenvector corresponding to an eigenvalue $\lambda$ is a non-zero $v \in V$ such that $Av = \lambda v$. The eigenspace corresponding to the eigenvalue $\lambda$ is the set of all $v \in V$ such that $Av = \lambda v$; the eigenspace is, as the name suggests, a subspace of $V$.