# Find the normalized eigenspinors of the 2×2 matrix

Is my work correct so far? I'm not sure how to solve for the + eigenspinor.

Since n is a unit vector, I took the dot product of n and sigma to be one, since the sigma components are each one. This implies lambda is ±1.

http://i111.photobucket.com/albums/n149/camarolt4z28/2010-11-20181327.jpg?t=1290298911

http://i111.photobucket.com/albums/n149/camarolt4z28/2010-11-20181405.jpg?t=1290298924

http://i111.photobucket.com/albums/n149/camarolt4z28/2010-11-20181422.jpg?t=1290298935

Sorry for the poor quality.


It can be done simply with some little tricks appropriate to your particular problem. First notice that your matrix is hermitian with trace 0 and determinant -1. The sum of the eigenvalues is 0 and their product is -1, so the two eigenvalues are 1 and -1. Call your matrix U.
Then $$U^2=I$$. Therefore U(I-U)=-(I-U) and U(I+U)=I+U. It follows that any vector of the form (I+U)x will be an eigenvector to the eigenvalue +1, and any vector of the form (I-U)x will be an eigenvector to the eigenvalue -1. Now choose the simplest possible x and normalize. Work done.

I guess the author of this problem was expecting you to do it in a hard way. But you can also do it in the above easy way. It's fun!
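If you want to convince yourself numerically, here is a short numpy sketch of the trick. The matrix is taken to be σ·n in the spherical-angle form that appears later in this thread; the angles alpha and beta are sample values of my own choosing, not from the problem.

```python
# Numeric sanity check of the (I +/- U) trick for U = sigma·n.
# alpha and beta are arbitrary sample angles, not given in the problem.
import numpy as np

alpha, beta = 0.7, 1.3
U = np.array([[np.cos(alpha), np.sin(alpha) * np.exp(-1j * beta)],
              [np.sin(alpha) * np.exp(1j * beta), -np.cos(alpha)]])

assert np.allclose(U @ U, np.eye(2))      # U^2 = I, so eigenvalues are +/-1

x = np.array([1.0, 0.0])                  # any x with (I +/- U)x nonzero works
v_plus = (np.eye(2) + U) @ x              # candidate eigenvector for +1
v_minus = (np.eye(2) - U) @ x             # candidate eigenvector for -1
v_plus /= np.linalg.norm(v_plus)          # normalize
v_minus /= np.linalg.norm(v_minus)

print(np.allclose(U @ v_plus, v_plus))    # U v = +v  -> True
print(np.allclose(U @ v_minus, -v_minus)) # U v = -v  -> True
```

Changing x to any other vector with a nonzero projection gives the same eigenvectors up to a phase.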

Okay. I worked out the Trace and Determinant. I guess there's a rule relating the eigenvalues to the trace and determinant in this case. I'm a bit rusty on linear algebra.

After that, I'm not following you. What do I make U and importantly x?

Each hermitian matrix A can be diagonalized by a unitary matrix U so that UAU*=D is diagonal. On the diagonal you then have the eigenvalues d1,d2,...,dn (for an nxn matrix). But
det(UAU*)=det(A) and Tr(UAU*)=Tr(A). Now det(D)=d1···dn and Tr(D)=d1+...+dn. It follows that det(A) is the product of the eigenvalues of A and Tr(A) is the sum of its eigenvalues. For a 2x2 matrix this way we can calculate the product d1d2=det(A) and the sum d1+d2=Tr(A). These two equations suffice (I repeat: for a 2x2 matrix) to calculate d1 and d2.
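Concretely, for a 2x2 matrix the eigenvalues are the roots of x² - Tr(A)x + det(A) = 0. A small numeric sketch, using the σ·n matrix from this thread with sample angles of my own choosing:

```python
# Eigenvalues of a 2x2 hermitian matrix from its trace and determinant:
# they are the roots of x^2 - Tr(A) x + det(A) = 0.
# A is sigma·n with sample angles (my choice, not from the problem).
import numpy as np

alpha, beta = 0.7, 1.3
A = np.array([[np.cos(alpha), np.sin(alpha) * np.exp(-1j * beta)],
              [np.sin(alpha) * np.exp(1j * beta), -np.cos(alpha)]])

t = np.trace(A).real           # d1 + d2
d = np.linalg.det(A).real      # d1 * d2
disc = np.sqrt(t**2 - 4 * d)   # discriminant of the quadratic
d1, d2 = (t + disc) / 2, (t - disc) / 2
print(round(d1, 10), round(d2, 10))  # 1.0 -1.0
```

Here t = 0 and d = -1, so the quadratic is x² - 1 = 0 and the roots are ±1, exactly as the trace/determinant argument predicts.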

As for the rest: you want to calculate the eigenvectors of your matrix, and you already know that the eigenvalues are 1 and -1. You take any vector; I wrote x, but that notation can mislead you, so call it $$\psi$$ and try, for instance,

$$\psi=\begin{pmatrix}1\\0\end{pmatrix}$$

Then apply I+U to it, where U is your matrix. The result, if non-zero, will be an eigenvector to the eigenvalue +1. If you apply (I-U) to it, the result will be an eigenvector to the eigenvalue -1. You will have to normalize them, but that should be easy.

But this particular trick works because your particular matrix has unusual properties: it is both hermitian and unitary, U=U* and UU*=I. Therefore U^2=I. In fact, that its eigenvalues are +1 and -1 follows from these properties alone, together with the fact that your matrix is not ±I.

I know that if you are not so comfortable with algebra, all of the above may still raise questions. But there is nothing difficult in it, and it is worth learning: it will save you work in the future. So, if you have further questions, if you are lost - ask for the details.

That's the next problem. It wants us to use the matrix U of the +/- eigenvectors from this problem and diagonalize the original hermitian matrix.

Do you know how to do it using the two explicit linear equations?

Did you find your eigenvectors? Do it first. And because you want to find the diagonalizing matrix, which you call U, do not call your original matrix U (as I did). Call it, for instance, V.

Anyway - find your eigenvectors first, make sure by a direct check, that they are eigenvectors, make sure that they are normalized.

From these eigenvectors you will build the diagonalizing matrix.
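That last step can be sketched in numpy: stack the two normalized eigenvectors as the columns of the diagonalizing matrix U, and U*VU comes out diagonal. V is the σ·n matrix from this thread; the angles are sample values of my own choosing.

```python
# Diagonalize V = sigma·n with the matrix built from its eigenvectors.
# alpha and beta are sample angles, not from the problem.
import numpy as np

alpha, beta = 0.7, 1.3
V = np.array([[np.cos(alpha), np.sin(alpha) * np.exp(-1j * beta)],
              [np.sin(alpha) * np.exp(1j * beta), -np.cos(alpha)]])
I = np.eye(2)

xi1 = (I + V) @ np.array([1.0, 0.0])  # eigenvector for +1
xi2 = (I - V) @ np.array([0.0, 1.0])  # eigenvector for -1
xi1 /= np.linalg.norm(xi1)            # normalize both
xi2 /= np.linalg.norm(xi2)

U = np.column_stack([xi1, xi2])       # eigenvectors as columns
D = U.conj().T @ V @ U                # U* V U
print(np.allclose(D, np.diag([1.0, -1.0])))  # True
```

Because V is hermitian with distinct eigenvalues, the two eigenvectors are automatically orthogonal, so U is unitary once its columns are normalized.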

Finding the eigenvectors is where I'm stuck on the paper (third link). It's exactly where your method took me. The book's eigenvectors are found using that system of equations.

The +1 eigenvector:

$$(I+V)\psi=\begin{pmatrix}1+\cos\, \alpha&\sin\, \alpha\,e^{-i\beta}\cr\sin\,\alpha\,e^{i\beta}&1-\cos\,\alpha\end{pmatrix}\begin{pmatrix}1\\0\end{pmatrix}$$

$$=\begin{pmatrix}1+\cos\, \alpha\cr\sin\,\alpha\,e^{i\beta}\end{pmatrix}$$

$$||(I+V)\psi||^2=(1+\cos\,\alpha)^2+\sin^2\,\alpha=2(1+\cos\,\alpha)=4\cos^2\,\alpha/2$$

$$||(I+V)\psi||=2|\cos\,\alpha/2|$$

$$\xi_1=\frac{1}{2|\cos\,\alpha/2|}\begin{pmatrix}1+\cos\, \alpha\cr\sin\,\alpha\,e^{i\beta}\end{pmatrix}$$
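A quick numeric check of the norm simplification above, with sample angles of my own choosing:

```python
# Check that ||(I+V)psi||^2 = (1+cos a)^2 + sin^2 a = 4 cos^2(a/2)
# numerically. alpha and beta are sample values, not from the problem.
import numpy as np

alpha, beta = 0.7, 1.3
v = np.array([1 + np.cos(alpha), np.sin(alpha) * np.exp(1j * beta)])
norm_sq = np.vdot(v, v).real   # squared norm of (I+V)psi
print(np.isclose(norm_sq, 4 * np.cos(alpha / 2)**2))  # True
```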

Is psi a "test function"? And are (I + V) and (I - V) derived from U^2 = I?

Your matrix, which I now call V rather than U, has the properties V=V*, VV*=I. It follows that P=(I+V)/2 has the property P=P*=PP. These are the properties of an "orthogonal projection". It projects the whole space onto the eigenspace of V with the eigenvalue +1. You can check, by pure algebra, that VP=P. That is, VP psi = P psi for any vector psi, so any vector of the form P psi is an eigenvector of V with eigenvalue +1. The vector psi (there is no reason to call it a test function) can be arbitrary; the only thing we need to care about is that P psi is not zero. So I chose a particularly simple psi. You can try

$$\psi=\begin{pmatrix}0\\1\end{pmatrix}$$

and convince yourself that the end result will be the same. Or you can choose

$$\psi=\begin{pmatrix}1\\1\end{pmatrix}.$$

Again, the end result will be the same. P projects onto a 1-dimensional subspace, so any two projections will be proportional. After normalizing they will be the same up to a constant multiplicative factor of modulus 1.
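One way to see the "same up to a phase" claim numerically: project two different choices of psi and check that the normalized results have an overlap of modulus 1. V is the σ·n matrix from this thread; the angles are sample values of my own choosing.

```python
# Different psi's projected by P = (I+V)/2 give the same normalized
# eigenvector up to a phase. Sample angles, not from the problem.
import numpy as np

alpha, beta = 0.7, 1.3
V = np.array([[np.cos(alpha), np.sin(alpha) * np.exp(-1j * beta)],
              [np.sin(alpha) * np.exp(1j * beta), -np.cos(alpha)]])
P = (np.eye(2) + V) / 2

def projected(psi):
    v = P @ psi                    # project onto the +1 eigenspace
    return v / np.linalg.norm(v)   # normalize

a = projected(np.array([1.0, 0.0]))
b = projected(np.array([1.0, 1.0]))
# |<a, b>| = 1 exactly when a and b agree up to a phase factor
print(np.isclose(abs(np.vdot(a, b)), 1.0))  # True
```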

It all follows from the simple algebra that you can prove:

Let P=(I+V)/2. Then P=P*=P^2 if and only if V=V*, VV*=I.
Let Q=(I-V)/2. Then Q=Q*=Q^2 if and only if V=V*, VV*=I.

In this case

P+Q=I
PQ=QP=0
VP=P
VQ=-Q
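All of these identities can be checked mechanically. A numpy sketch, with V = σ·n and sample angles of my own choosing:

```python
# Verify P=P*=P^2, P+Q=I, PQ=QP=0, VP=P, VQ=-Q for V = sigma·n.
# alpha and beta are sample angles, not from the problem.
import numpy as np

alpha, beta = 0.7, 1.3
V = np.array([[np.cos(alpha), np.sin(alpha) * np.exp(-1j * beta)],
              [np.sin(alpha) * np.exp(1j * beta), -np.cos(alpha)]])
I = np.eye(2)
P = (I + V) / 2
Q = (I - V) / 2

checks = [
    np.allclose(P, P.conj().T),   # P hermitian
    np.allclose(P, P @ P),        # P idempotent
    np.allclose(P + Q, I),        # complementary projections
    np.allclose(P @ Q, 0),        # orthogonal to each other
    np.allclose(V @ P, P),        # V acts as +1 on ran(P)
    np.allclose(V @ Q, -Q),       # V acts as -1 on ran(Q)
]
print(all(checks))  # True
```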

And then the textbook observation: P is an orthogonal projection if and only if P=P*=P^2 (a hermitian idempotent).

With some little algebra experience all the above properties are evident, but when you see them the first time, they look like magic.

Another observation: since V has eigenvalues +1 and -1, P=(I+V)/2 will have eigenvalues (1+1)/2 and (1-1)/2, that is, 1 and 0, so it is a projection. Then Q=I-P is the complementary projection.

Now you can take my results for the normalized eigenvector and compare with the one you get by solving the linear equations etc. - the hard way. They should agree up to a multiplicative constant; there is always this freedom.