# Eigenvalue of the sum of two non-orthogonal (in general) ket-bras

Jufa
TL;DR Summary
In a maths course the following statement is claimed to be self-evident, and I don't find it so.
We have a matrix ##M = \ket{\psi^{\perp}}\bra{\psi^{\perp}} + \ket{\varphi^{\perp}}\bra{\varphi^{\perp}}##

The claim is that the eigenvalues of such a matrix are ##\lambda_{\pm}= 1\pm |\bra{\psi}\ket{\varphi}|##

Can someone prove this claim? I have been told it is self-evident, but I've already been struggling with it for a couple of days.
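For what it's worth, the claim can be sanity-checked numerically. A minimal numpy sketch, assuming both kets are normalised vectors in ##\mathbb{C}^2## (as is confirmed later in the thread); all variable names here are mine:

```python
import numpy as np

rng = np.random.default_rng(0)

def normalised(v):
    return v / np.linalg.norm(v)

# Random unit vectors in C^2.
psi = normalised(rng.normal(size=2) + 1j * rng.normal(size=2))
phi = normalised(rng.normal(size=2) + 1j * rng.normal(size=2))

# In C^2 the orthogonal complement of (a, b) is spanned by (-conj(b), conj(a)),
# so each perpendicular ket is unique up to a global phase.
psi_perp = np.array([-np.conj(psi[1]), np.conj(psi[0])])
phi_perp = np.array([-np.conj(phi[1]), np.conj(phi[0])])

# M = |psi_perp><psi_perp| + |phi_perp><phi_perp|
M = np.outer(psi_perp, np.conj(psi_perp)) + np.outer(phi_perp, np.conj(phi_perp))

overlap = abs(np.vdot(psi, phi))   # |<psi|phi>|
eigs = np.linalg.eigvalsh(M)       # M is Hermitian; eigvalsh sorts ascending

print(eigs)                        # agrees with [1 - overlap, 1 + overlap]
```

Changing the seed gives different random kets but the same pattern, which at least makes the statement plausible before attempting a proof.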

Homework Helper
Gold Member
2022 Award
I'm not sure it's self-evident. You could look for an eigenvector of the form ##|\psi \rangle + \beta |\phi \rangle## and deduce an expression for ##\lambda - 1##.

PS That works!

Homework Helper
Gold Member
2022 Award
Another idea is to use the orthonormal basis generated by the standard Gram-Schmidt process, where you replace ##|\phi \rangle## by the normalised vector in the direction of
$$|\phi \rangle - |\psi\rangle \langle \psi | \phi \rangle$$

PS or simply express the matrix in the ##|\psi \rangle, |\phi \rangle## basis and generate the characteristic equation. I think that's the simplest way.
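That characteristic-equation route can be sketched symbolically. A sympy sketch under the assumption that the overlap ##c = \langle \psi | \phi \rangle## is real and non-negative (a relative phase can be absorbed into one of the kets); in the Gram-Schmidt basis one has ##|\psi\rangle \to (1, 0)## and ##|\phi\rangle \to (c, s)## with ##s = \sqrt{1 - c^2}##:

```python
import sympy as sp

c = sp.symbols('c', nonnegative=True)   # c = |<psi|phi>|
s = sp.sqrt(1 - c**2)

# |psi><psi| + |phi><phi| written in the Gram-Schmidt basis:
# psi -> (1, 0), phi -> (c, s).
M = sp.Matrix([[1, 0], [0, 0]]) + sp.Matrix([c, s]) * sp.Matrix([[c, s]])

lam = sp.symbols('lam')
char_poly = sp.expand((M - lam * sp.eye(2)).det())
print(char_poly)                 # reduces to lam**2 - 2*lam + 1 - c**2
print(sp.solve(char_poly, lam))  # roots are 1 - c and 1 + c
```

The characteristic equation collapses to ##(\lambda - 1)^2 = c^2##, which is exactly the claimed ##\lambda_\pm = 1 \pm |\bra{\psi}\ket{\varphi}|##.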

Jufa
Another idea is to use the orthonormal basis generated by the standard Gram-Schmidt process, where you replace ##|\phi \rangle## by the normalised vector in the direction of
$$|\phi \rangle - |\psi\rangle \langle \psi | \phi \rangle$$

PS or simply express the matrix in the ##|\psi \rangle, |\phi \rangle## basis and generate the characteristic equation. I think that's the simplest way.
I guess when you say ##\ket{\phi}## you mean ##\ket{\varphi}##. I don't think it makes sense to compute the entries of the matrix in a non-orthonormal basis, and I don't think the associated characteristic equation has anything to do with the one you get when an orthonormal basis is considered.

Homework Helper
Gold Member
2022 Award
I don't think the associated characteristic equation has anything to do with the one you get when an orthonormal basis is considered.
Are you sure?

Homework Helper
Gold Member
2022 Award
Here's a simple proof that eigenvalues are basis-invariant.

First note that the definition of an eigenvalue for a linear transformation does not refer to a basis:
$$Tv = \lambda v$$In order for this to be well-defined, it must be the case that if ##T## is represented by the matrix ##M## in some basis and ##v## by the column vector ##x## in that basis, then the following must hold:$$Mx = \lambda x$$Now, if we apply a change of basis with transformation matrix ##S##, such that$$M' = S^{-1}MS \quad \text{and} \quad x' = S^{-1}x,$$where ##M'## and ##x'## are the matrix and vector expressed in the new basis, then:
$$M'x' = S^{-1}Mx = \lambda S^{-1}x = \lambda x',$$which confirms that eigenvalues must be basis-invariant.
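The invariance is also easy to check numerically. A small numpy sketch with an arbitrary (made-up) symmetric matrix and a random change of basis:

```python
import numpy as np

rng = np.random.default_rng(1)

A = rng.normal(size=(3, 3))
M = A + A.T                            # symmetric, so the spectrum is real
S = rng.normal(size=(3, 3))            # random change-of-basis matrix
assert abs(np.linalg.det(S)) > 1e-9    # almost surely invertible

M_prime = np.linalg.inv(S) @ M @ S     # the same linear map, new basis

eig = np.sort_complex(np.linalg.eigvals(M))
eig_prime = np.sort_complex(np.linalg.eigvals(M_prime))
print(np.allclose(eig, eig_prime))     # the two spectra coincide
```

Note that ##M'## is generally not symmetric even though ##M## is, yet the eigenvalues are unchanged, as the proof above requires.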

Jufa
Here's a simple proof that eigenvalues are basis-invariant.

First note that the definition of an eigenvalue for a linear transformation does not refer to a basis:
$$Tv = \lambda v$$In order for this to be well-defined, it must be the case that if ##T## is represented by the matrix ##M## in some basis and ##v## by the column vector ##x## in that basis, then the following must hold:$$Mx = \lambda x$$Now, if we apply a change of basis with transformation matrix ##S##, such that$$M' = S^{-1}MS \quad \text{and} \quad x' = S^{-1}x,$$where ##M'## and ##x'## are the matrix and vector expressed in the new basis, then:
$$M'x' = S^{-1}Mx = \lambda S^{-1}x = \lambda x',$$which confirms that eigenvalues must be basis-invariant.
Oh yes. You are definitely right, I am sorry for my confusion. I will work on your idea then. Thank you very much.

Homework Helper
Gold Member
2022 Award
Oh yes. You are definitely right, I am sorry for my confusion. I will work on your idea then. Thank you very much.
You perhaps ought to post one of your attempts. The eigenvalues can be calculated in a few lines, starting with the general form of the eigenvector ##\alpha |\psi \rangle + \beta |\varphi \rangle##.

First, we can see that ##\alpha, \beta \ne 0## and therefore may take ##\alpha = 1## (as any scalar multiple of an eigenvector is still an eigenvector).

Next we solve the equation:
$$(\ket{\psi}\bra{\psi} + \ket{\varphi}\bra{\varphi})(|\psi \rangle + \beta |\varphi \rangle) = \lambda(|\psi \rangle + \beta |\varphi \rangle),$$and the result follows after a few lines of algebra.
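For completeness, those few lines of algebra can be sketched in sympy. Assuming the overlap ##a = \langle\psi|\varphi\rangle## is real and positive (a phase can be absorbed into ##|\varphi\rangle##) and that ##|\psi\rangle## and ##|\varphi\rangle## are linearly independent, matching the coefficients of the two kets gives a two-equation system:

```python
import sympy as sp

a = sp.symbols('a', positive=True)   # a = <psi|phi>, taken real and positive
beta, lam = sp.symbols('beta lam')

# Coefficients of |psi> and |phi> in
#   (|psi><psi| + |phi><phi|)(|psi> + beta|phi>) = lam (|psi> + beta|phi>):
eq_psi = sp.Eq(1 + beta * a, lam)      # |psi> component
eq_phi = sp.Eq(a + beta, lam * beta)   # |phi> component

sols = sp.solve([eq_psi, eq_phi], [beta, lam], dict=True)
print({s[lam] for s in sols})          # the two eigenvalues, 1 - a and 1 + a
```

Eliminating ##\lambda## gives ##a\beta^2 = a##, so ##\beta = \pm 1## and ##\lambda = 1 \pm a##, matching the claimed result.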

That said, I'm not sure what the perpendicular symbol means in this context. It works out assuming the matrix is built from the outer products of the vectors themselves, which I assumed to be normalised.

Jufa
Maybe I should have mentioned that both ket vectors belong to ##\mathbb{C}^2##, and thus the perpendicular vectors are well defined up to a global phase. My attempt is the following:

In the basis ## \ket{\psi}, \ket{\varphi}## the matrix looks like a diagonal one, namely:

##M = \mathrm{diag}\Big( |\bra{\psi}\ket{\varphi^\perp}|^2, |\bra{\varphi}\ket{\psi^\perp}|^2 \Big)##

Therefore the eigenvalues are nothing but these diagonal entries. This result does not seem to have anything to do with:

##\lambda_{\pm} = 1 \pm |\bra{\psi} \ket{\varphi}|##

Homework Helper
Gold Member
2022 Award
Maybe I should have mentioned that both ket vectors belong to ##\mathbb{C}^2##, and thus the perpendicular vectors are well defined up to a global phase. My attempt is the following:

In the basis ## \ket{\psi}, \ket{\varphi}## the matrix looks like a diagonal one, namely:

##M = \mathrm{diag}\Big( |\bra{\psi}\ket{\varphi^\perp}|^2, |\bra{\varphi}\ket{\psi^\perp}|^2 \Big)##

Therefore the eigenvalues are nothing but these diagonal entries. This result does not seem to have anything to do with:

##\lambda_{\pm} = 1 \pm |\bra{\psi} \ket{\varphi}|##
I must admit I don't understand what is meant by the perpendicular vectors in this context.

In any case, I don't see how you have diagonalised ##M##.

Jufa
I must admit I don't understand what is meant by the perpendicular vectors in this context.

In any case, I don't see how you have diagonalised ##M##.
Just an orthogonal vector.

Homework Helper
Gold Member
2022 Award
Just an orthogonal vector.
There is no such thing. "Orthogonal" is a property of two vectors.

Your fundamental problem here may be that you do not understand what the original statement means. You must be able to give me a clear and unambiguous definition of ##\ket{\varphi^\perp}##. A good test of whether you understand a problem is whether you can explain it to someone else.

Jufa
There is no such thing. "Orthogonal" is a property of two vectors.

Your fundamental problem here may be that you do not understand what the original statement means. You must be able to give me a clear and unambiguous definition of ##\ket{\varphi^\perp}##. A good test of whether you understand a problem is whether you can explain it to someone else.
Yes: namely between ##\ket{\psi}## and ##\ket{\psi^\perp}##; these are the two vectors involved. I reckon I am stating the problem clearly.