Function of operators. Series, matrices.

  • #1
LagrangeEuler
Is it possible to write the series ##\ln x=\sum_n a_n x^n##? I am asking because this would be a Taylor series around zero, and ##\ln 0## is not defined.

And if ##A## is a matrix, is it possible to write
##\ln A=\sum_n a_n A^n##? Thanks for the answer!
 
  • #2
The answer to the first question is no: ##\ln x## has no Taylor series around zero, precisely because ##\ln 0## is not defined. This forces the second question to have the answer no as well.

If you want a series for the logarithm, you can instead expand ##\ln(1+x)##, ##\ln(1-x)##, or ##\ln\left(\frac{1+x}{1-x}\right)## around ##x=0##.
 
  • #3
Ok. Tnx. But it is ok to write
##\ln x=\sum_n a_n(x-1)^n##,
right? Or
##\ln A=\sum_n a_n(A-I)^n##?
 
  • #4
Yes, of course. In fact, it is the well-known Taylor series
[tex]\ln(x)= \sum_{n=1}^\infty \frac{(-1)^{n+1}}{n}(x- 1)^n[/tex]
which converges for ##0<x\le 2##.
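As a quick numerical sanity check (the function name and number of terms are my own choices), the partial sums of this series can be compared against `math.log`:

```python
import math

def ln_series(x, terms=200):
    """Partial sum of the Taylor series of ln x around x = 1:
    ln x = sum_{n>=1} (-1)^(n+1) (x - 1)^n / n, convergent for 0 < x <= 2."""
    u = x - 1.0
    return sum((-1) ** (n + 1) * u ** n / n for n in range(1, terms + 1))

# Inside the interval of convergence the partial sums match math.log closely.
print(ln_series(1.5), math.log(1.5))
print(ln_series(0.5), math.log(0.5))
```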
 
  • #5
Ok. Tnx. Is it true also for operators?
 
  • #6
LagrangeEuler said:
Ok. Tnx. Is it true also for operators?

Yes, but not for all operators, only those within the radius of convergence. Here, writing ##A=I+B##, the series for ##\ln(I+B)## converges for all operators with ##\|B\|=\|A-I\|<1##.

For a proof, see "The Structure and Geometry of Lie Groups" by Hilgert and Neeb, Chapter 3, although most functional analysis texts should have this as well.
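A minimal numerical sketch of this (helper names are mine, with plain nested lists standing in for a matrix type): summing ##\sum_{n\ge 1}\frac{(-1)^{n+1}}{n}(A-I)^n## for a matrix close to the identity, and checking it against the known logarithm of a diagonal matrix.

```python
import math

def mat_mul(X, Y):
    """Product of two 2x2 matrices stored as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_log(A, terms=100):
    """ln(A) via the series sum_{n>=1} (-1)^(n+1) (A - I)^n / n,
    which converges when ||A - I|| < 1."""
    B = [[A[0][0] - 1.0, A[0][1]], [A[1][0], A[1][1] - 1.0]]
    result = [[0.0, 0.0], [0.0, 0.0]]
    power = [[1.0, 0.0], [0.0, 1.0]]  # holds B^(n-1), starts at B^0 = I
    for n in range(1, terms + 1):
        power = mat_mul(power, B)     # now B^n
        c = (-1) ** (n + 1) / n
        for i in range(2):
            for j in range(2):
                result[i][j] += c * power[i][j]
    return result

# For a diagonal matrix the matrix log is just the log of each diagonal entry.
logA = mat_log([[1.1, 0.0], [0.0, 0.9]])
print(logA[0][0], math.log(1.1))
print(logA[1][1], math.log(0.9))
```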
 
  • #7
Thanks a lot. My problem is to calculate ##Tr(A\ln A)##, where ##A## is a matrix. Because of that I thought to write ##\ln A## as a power series, if that is possible. But here I cannot do it in such an easy way, right?
 
  • #8
LagrangeEuler said:
Thanks a lot. My problem is to calculate ##Tr(A\ln A)##, where ##A## is a matrix. Because of that I thought to write ##\ln A## as a power series, if that is possible. But here I cannot do it in such an easy way, right?

If ##\|A\|<1##, then you can do it this way. Otherwise, you will need to tell us what your definition of ##\ln A## is.
 
  • #9
LagrangeEuler said:
Thanks a lot. My problem is to calculate ##Tr(A\ln A)##, where ##A## is a matrix. Because of that I thought to write ##\ln A## as a power series, if that is possible. But here I cannot do it in such an easy way, right?

Is ##A## by any chance a density matrix (i.e., are you calculating the von Neumann entropy)? If so, its properties guarantee that you can use its spectral decomposition to write the trace as [itex]\mathrm{Tr}\, \rho \ln \rho = \sum_i \lambda_i \ln{\lambda_i}[/itex] for the eigenvalues [itex]\lambda_i[/itex] of the density matrix, which is often a useful form.
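A cross-check of this formula on a hypothetical 2×2 density matrix (all names and numbers here are my own; the series-based matrix log is included so the snippet is self-contained, and is valid only because this ##\rho## has full rank):

```python
import math

def mat_mul(X, Y):
    """Product of two 2x2 matrices stored as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_log(A, terms=400):
    """ln(A) via the series in (A - I); fine here since every eigenvalue
    of rho lies strictly between 0 and 1, so ||A - I|| < 1."""
    B = [[A[0][0] - 1.0, A[0][1]], [A[1][0], A[1][1] - 1.0]]
    result = [[0.0, 0.0], [0.0, 0.0]]
    power = [[1.0, 0.0], [0.0, 1.0]]
    for n in range(1, terms + 1):
        power = mat_mul(power, B)
        c = (-1) ** (n + 1) / n
        for i in range(2):
            for j in range(2):
                result[i][j] += c * power[i][j]
    return result

rho = [[0.6, 0.2], [0.2, 0.4]]  # Hermitian, trace 1, positive definite

# Tr(rho ln rho) directly from the matrix product ...
P = mat_mul(rho, mat_log(rho))
trace_direct = P[0][0] + P[1][1]

# ... and from the eigenvalues (1 +/- sqrt(0.2)) / 2 of this particular rho.
lam1 = (1 + math.sqrt(0.2)) / 2
lam2 = (1 - math.sqrt(0.2)) / 2
trace_spectral = lam1 * math.log(lam1) + lam2 * math.log(lam2)
print(trace_direct, trace_spectral)
```

The von Neumann entropy is then ##S=-\mathrm{Tr}\,\rho\ln\rho##, the negative of the quantity computed above.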
 
  • #10
DeIdeal said:
Is A by any chance a density matrix (ie are you calculating the von Neumann entropy)? If so, its properties guarantee that you can use its spectral decomposition to write the trace as [itex]\mathrm{Tr} \rho \ln \rho = \sum_i \lambda_i \ln{\lambda_i}[/itex] for the eigenvalues [itex]\lambda_i[/itex] of the density matrix, which is often a useful form.

When can I use the spectral decomposition? Is it true only for Hermitian matrices, or not? Also, what happens in the case of a degenerate spectrum?
 
  • #11
LagrangeEuler said:
When can I use the spectral decomposition? Is it true only for Hermitian matrices, or not? Also, what happens in the case of a degenerate spectrum?

It exists for all normal operators, so any Hermitian matrix is fine. For a degenerate spectrum, you can use the general spectral decomposition [itex]A=\sum_{i=1}^{n}\sum_{d=1}^{D(i)} \lambda_i |\lambda_i,d\rangle\langle\lambda_i,d|[/itex], where D(i) is the degeneracy of the eigenvalue [itex]\lambda_i[/itex], so that the vectors [itex]|\lambda_i,d\rangle[/itex] form an orthonormal basis of the corresponding D(i)-dimensional eigenspace.
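As an illustrative sketch of that decomposition (a hypothetical 3×3 example with a doubly degenerate eigenvalue; all names are mine), a function of the matrix can be assembled as ##f(A)=\sum_i f(\lambda_i)P_i## with one projector ##P_i## per eigenspace:

```python
import math

# A = diag(2, 2, 5): eigenvalue 2 with degeneracy D = 2, eigenvalue 5 with D = 1.
P2 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 0.0]]  # projector, lambda = 2
P5 = [[0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]]  # projector, lambda = 5

def apply_f(f, eigen_projectors):
    """Build f(A) = sum_i f(lambda_i) P_i from (eigenvalue, projector) pairs."""
    n = len(eigen_projectors[0][1])
    out = [[0.0] * n for _ in range(n)]
    for lam, P in eigen_projectors:
        flam = f(lam)
        for i in range(n):
            for j in range(n):
                out[i][j] += flam * P[i][j]
    return out

logA = apply_f(math.log, [(2.0, P2), (5.0, P5)])
print(logA[0][0], math.log(2))  # diagonal entries are ln 2, ln 2, ln 5
```

The individual eigenvectors inside a degenerate eigenspace are not unique, but the projector onto that eigenspace is, so ##f(A)## does not depend on which orthonormal basis is chosen.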

Note that the other properties of the density matrix guarantee that the logarithm is well-defined: the operator norm of a bounded normal operator equals its spectral radius, and if this is indeed a density operator, its eigenvalues are non-negative and satisfy [itex]\sum_i \lambda_i = 1[/itex]. So the convergence condition micromass mentioned is fulfilled, unless the density matrix describes a pure state, that is, one of the eigenvalues of ##A## equals one, in which case the von Neumann entropy vanishes.
 
  • #12
DeIdeal said:
It exists for all normal operators, so any Hermitian matrix is fine. For a degenerate spectrum, you can use the general spectral decomposition [itex]A=\sum_{i=1}^{n}\sum_{d=1}^{D(i)} \lambda_i |\lambda_i,d\rangle\langle\lambda_i,d|[/itex], where D(i) is the degeneracy of the eigenvalue [itex]\lambda_i[/itex], so that the vectors [itex]|\lambda_i,d\rangle[/itex] form an orthonormal basis of the corresponding D(i)-dimensional eigenspace.
In the case of degeneracy I cannot find the eigenvectors in a unique form, so I think that I cannot find the operator in a unique form. For example, look at the case in ##\mathbb{R}^3## with ##\lambda_1=\lambda_2=\lambda_3=1## and eigenvectors
##x_1^T=(1 \quad 0 \quad 0)##, ##x_2^T=(0 \quad 0 \quad 0)##, ##x_3^T=(0 \quad 0 \quad 0)##. How do I find the matrix in this case?
 
  • #13
Eigenvectors cannot be zero vectors.

(In addition, if we're still considering density matrices, that's not going to be a valid density matrix since the sum of eigenvalues is greater than 1.)
 
  • #15
DeIdeal said:
Eigenvectors cannot be zero vectors.

(In addition, if we're still considering density matrices, that's not going to be a valid density matrix since the sum of eigenvalues is greater than 1.)
Yes, most texts specify that eigenvectors must be non-zero. But that leaves us having to say "the set of all eigenvectors of a linear transformation, together with the zero vector, forms a subspace" and "the set of all eigenvectors corresponding to a given eigenvalue, together with the zero vector, forms a subspace."

A few texts say "[itex]\lambda[/itex] is an eigenvalue of linear transformation A if and only if there is a non-zero vector, v, such that [itex]Av= \lambda v[/itex]" and then "v is an eigenvector of A, corresponding to eigenvalue [itex]\lambda[/itex], if and only if [itex]Av= \lambda v[/itex]" which does NOT require that an eigenvector be non-zero. That allows us to say simply "the set of all eigenvectors of a linear transformation form a subspace" without having to add the zero vector. A small point but I see no reason to refuse to allow the zero vector as an eigenvector.
 
  • #16
HallsofIvy said:
Yes, most texts specify that eigenvectors must be non-zero. But that leaves us having to say "the set of all eigenvectors of a linear transformation, together with the zero vector, forms a subspace" and "the set of all eigenvectors corresponding to a given eigenvalue, together with the zero vector, forms a subspace."

A few texts say "[itex]\lambda[/itex] is an eigenvalue of linear transformation A if and only if there is a non-zero vector, v, such that [itex]Av= \lambda v[/itex]" and then "v is an eigenvector of A, corresponding to eigenvalue [itex]\lambda[/itex], if and only if [itex]Av= \lambda v[/itex]" which does NOT require that an eigenvector be non-zero. That allows us to say simply "the set of all eigenvectors of a linear transformation form a subspace" without having to add the zero vector. A small point but I see no reason to refuse to allow the zero vector as an eigenvector.

Really? Mind giving a reference? I'm not really skeptical, just curious, since I have never seen that convention used in a text (either in linear algebra or in functional analysis), and this is the second time someone has commented about that on a post I made. I guess the main difference is that you give up the uniqueness of the eigenvalue associated with a given eigenvector, since the zero vector satisfies [itex]Av=\lambda v[/itex] for every [itex]\lambda[/itex], although I wouldn't be surprised if more exceptions arise.
 
  • #17
I have never seen a math text that allows zero as an eigenvector. I would like to see some references for this too.
 
  • #18
In my usage, an eigenvector corresponding to an eigenvalue [itex]\lambda[/itex] is a non-zero [itex]v \in V[/itex] such that [itex]Av = \lambda v[/itex]. The eigenspace corresponding to the eigenvalue [itex]\lambda[/itex] is the set of all [itex]v \in V[/itex] such that [itex]Av = \lambda v[/itex]; the eigenspace is, as the name suggests, a subspace of [itex]V[/itex].
 

