Find the normalized eigenspinors of the 2 x 2 matrix

  • #1
Shackleford
Is my work correct so far? I'm not sure how to solve for the + eigenspinor.

Since n is a unit vector, I took the dot product of n and sigma to be one, since the sigma components are each one. This implies lambda is +/-1.

http://i111.photobucket.com/albums/n149/camarolt4z28/2010-11-20181327.jpg?t=1290298911

http://i111.photobucket.com/albums/n149/camarolt4z28/2010-11-20181405.jpg?t=1290298924

http://i111.photobucket.com/albums/n149/camarolt4z28/2010-11-20181422.jpg?t=1290298935

Sorry for the poor quality.
 
Last edited by a moderator:
  • #2
It can be done simply with some little tricks appropriate to your particular problem. First notice that your matrix is Hermitian with trace 0 and determinant -1. The sum of the eigenvalues is 0 and their product is -1, so the two eigenvalues are +1 and -1. Call your matrix U.
Then [tex]U^2=I[/tex]. Therefore U(I-U)=-(I-U) and U(I+U)=I+U. It follows that any vector of the form (I+U)x is an eigenvector with eigenvalue +1, and any vector of the form (I-U)x is an eigenvector with eigenvalue -1. Now choose the simplest possible x and normalize. Work done.
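Spelled out, the algebra behind that last step is just

[tex]U\big[(I+U)x\big]=(U+U^2)x=(U+I)x=+1\cdot(I+U)x,\qquad U\big[(I-U)x\big]=(U-U^2)x=-(I-U)x.[/tex]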

I guess the author of this problem was expecting you to do it in a hard way. But you can also do it in the above easy way. It's fun!
 
  • #3
arkajad said:
It can be done simply with some little tricks appropriate to your particular problem. First notice that your matrix is Hermitian with trace 0 and determinant -1. The sum of the eigenvalues is 0 and their product is -1, so the two eigenvalues are +1 and -1. Call your matrix U.
Then [tex]U^2=I[/tex]. Therefore U(I-U)=-(I-U) and U(I+U)=I+U. It follows that any vector of the form (I+U)x is an eigenvector with eigenvalue +1, and any vector of the form (I-U)x is an eigenvector with eigenvalue -1. Now choose the simplest possible x and normalize. Work done.

I guess the author of this problem was expecting you to do it in a hard way. But you can also do it in the above easy way. It's fun!

Okay. I worked out the trace and determinant. I guess there's a rule relating the eigenvalues to the trace and the determinant in this case. I'm a bit rusty on linear algebra.

After that, I'm not following you. What do I make U, and more importantly, what is x?
 
  • #4
Each Hermitian matrix A can be diagonalized by a unitary matrix U so that UAU*=D is diagonal. On the diagonal you then have the eigenvalues d1, d2, ..., dn (for an nxn matrix). But
det(UAU*)=det(A) and Tr(UAU*)=Tr(A). Now det(D)=d1...dn and Tr(D)=d1+...+dn. It follows that det(A) is the product of the eigenvalues of A and Tr(A) is the sum of its eigenvalues. For a 2x2 matrix this gives the product d1d2=det(A) and the sum d1+d2=Tr(A). These two equations suffice (I repeat: for a 2x2 matrix) to calculate d1 and d2.
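In your case this gives, concretely,

[tex]d_1+d_2=\mathrm{Tr}(A)=0,\qquad d_1 d_2=\det(A)=-1,[/tex]

so the eigenvalues solve [tex]\lambda^2-1=0[/tex], i.e. they are +1 and -1.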

As for the rest: you want to calculate the eigenvectors of your matrix, and you already know that the eigenvalues are 1 and -1. Take any vector; I wrote x, but that may mislead you, so call it [tex]\psi[/tex] and try, for instance

[tex]\psi=\begin{pmatrix}1\\0\end{pmatrix}[/tex]

Then apply I+U to it, where U is your matrix. The result, if non-zero, will be an eigenvector with eigenvalue +1. If you apply (I-U) to it, the result will be an eigenvector with eigenvalue -1. You will have to normalize them, but that should be easy.

But this particular trick works because your particular matrix has unusual properties - it is both Hermitian and unitary: U=U* and UU*=I. Therefore U^2=I. In fact, that its eigenvalues are +1 and -1 follows from these properties alone, together with the fact that your matrix is neither plus nor minus the identity matrix.
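If you want a quick numerical sanity check of this recipe before doing the algebra by hand, here is a minimal sketch in Python with numpy. It assumes the explicit form of the matrix that is written out in post #8 below (there called V), with sample angles:

[code]
import numpy as np

# Sample angles; V is the explicit matrix written out in post #8.
alpha, beta = 0.7, 1.3
V = np.array([[np.cos(alpha),                 np.sin(alpha)*np.exp(-1j*beta)],
              [np.sin(alpha)*np.exp(1j*beta), -np.cos(alpha)]])
I = np.eye(2)

# Hermitian, unitary, trace 0, determinant -1
assert np.allclose(V, V.conj().T)
assert np.allclose(V @ V.conj().T, I)
print(np.trace(V).real, np.linalg.det(V).real)   # ~0.0  ~-1.0

psi = np.array([1.0, 0.0])
for lam in (+1, -1):
    chi = (I + lam*V) @ psi               # (I + V)psi or (I - V)psi
    chi = chi / np.linalg.norm(chi)       # normalize
    assert np.allclose(V @ chi, lam*chi)  # eigenvector with eigenvalue lam
    print(lam, chi)
[/code]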

I know that if you are not so comfortable with the algebra, all of the above may still raise questions. But there is nothing difficult in it, and it is worth learning - it will save you work in the future. So if you have further questions, or if you get lost, ask for the details.
 
  • #5
That's the next problem. It wants us to use the matrix U of the +/- eigenvectors from this problem and diagonalize the original hermitian matrix.

Do you know how to do it using the two explicit linear equations?
 
Last edited:
  • #6
Shackleford said:
That's the next problem. It wants us to use the matrix U of the +/- eigenvectors from this problem and diagonalize the original hermitian matrix.

Do you know how to do it using the two explicit linear equations?

Did you find your eigenvectors? Do it first. And because you want to find the diagonalizing matrix, which you call U, do not call your original matrix U (as I did). Call it, for instance, V.

Anyway - find your eigenvectors first, make sure by a direct check, that they are eigenvectors, make sure that they are normalized.

From these eigenvectors you will build the diagonalizing matrix.
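Schematically (this is just the standard recipe, not anything specific to your textbook): if W is the 2x2 matrix whose columns are the two normalized eigenvectors, then VW = W diag(1,-1), and therefore

[tex]W^\dagger V W=\begin{pmatrix}1&0\cr 0&-1\end{pmatrix}.[/tex]

W is unitary because the two eigenvectors are orthonormal, so the diagonalizing matrix U in the UAU*=D convention of post #4 is U=W^\dagger.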
 
  • #7
arkajad said:
Did you find your eigenvectors? Do it first. And because you want to find the diagonalizing matrix, which you call U, do not call your original matrix U (as I did). Call it, for instance, V.

Anyway - find your eigenvectors first, make sure by a direct check, that they are eigenvectors, make sure that they are normalized.

From these eigenvectors you will build the diagonalizing matrix.

Finding the eigenvectors is where I'm stuck on the paper (third link). It's exactly where your method took me. The book's eigenvectors are found using that system of equations.
 
  • #8
The +1 eigenvector:

[tex](I+V)\psi=\begin{pmatrix}1+\cos\, \alpha&\sin\, \alpha\,e^{-i\beta}\cr\sin\,\alpha\,e^{i\beta}&1-\cos\,\alpha\end{pmatrix}\begin{pmatrix}1\cr 0\end{pmatrix}[/tex]

[tex]=\begin{pmatrix}1+\cos\, \alpha\cr\sin\,\alpha\,e^{i\beta}\end{pmatrix}[/tex]

[tex]||(I+V)\psi||^2=(1+\cos\,\alpha)^2+\sin^2\,\alpha=2(1+\cos\,\alpha)=4\cos^2\,\alpha/2[/tex]

[tex]||(I+V)\psi||=2|\cos\,\alpha/2|[/tex]

[tex]\xi_1=\frac{1}{2|\cos\,\alpha/2|}\begin{pmatrix}1+\cos\, \alpha\cr\sin\,\alpha\,e^{i\beta}\end{pmatrix}[/tex]
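Using 1+cos(alpha) = 2cos^2(alpha/2) and sin(alpha) = 2 sin(alpha/2) cos(alpha/2), this reduces (for cos(alpha/2) > 0) to the familiar half-angle form

[tex]\xi_1=\begin{pmatrix}\cos\,\alpha/2\cr\sin\,\alpha/2\,e^{i\beta}\end{pmatrix}.[/tex]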
 
  • #9
arkajad said:
The +1 eigenvector:

[tex](I+V)\psi=\begin{pmatrix}1+\cos\, \alpha&\sin\, \alpha\,e^{-i\beta}\cr\sin\,\alpha\,e^{i\beta}&1-\cos\,\alpha\end{pmatrix}\begin{pmatrix}1\cr 0\end{pmatrix}[/tex]

[tex]=\begin{pmatrix}1+\cos\, \alpha\cr\sin\,\alpha\,e^{i\beta}\end{pmatrix}[/tex]

[tex]||(I+V)\psi||^2=(1+\cos\,\alpha)^2+\sin^2\,\alpha=2(1+\cos\,\alpha)=4\cos^2\,\alpha/2[/tex]

[tex]||(I+V)\psi||=2|\cos\,\alpha/2|[/tex]

[tex]\xi_1=\frac{1}{2|\cos\,\alpha/2|}\begin{pmatrix}1+\cos\, \alpha\cr\sin\,\alpha\,e^{i\beta}\end{pmatrix}[/tex]

Is psi a "test function"? And are the (I + V) and (I - V) derived from U^2 = I?
 
  • #10
Your matrix, which I now call V rather than U, has the properties V=V* and VV*=I. It follows that P=(I+V)/2 satisfies P=P*=PP. These are the properties of an "orthogonal projection". It projects the whole space onto the eigenspace of V with eigenvalue +1. You can check, by pure algebra, that VP=P, that is, VP psi = P psi for any vector psi. So any vector of the form P psi is an eigenvector of V with eigenvalue +1. The vector psi - there is no reason to call it a test vector - can be arbitrary; the only thing we need to care about is that P psi is not zero. So I have chosen a particularly simple psi. You can try

[tex]\psi=\begin{pmatrix}0\cr 1\end{pmatrix}[/tex]

and convince yourself that the end result will be the same. Or you can choose

[tex]\psi=\begin{pmatrix}1\cr 1\end{pmatrix}.[/tex]

Again, the end result will be the same. P projects onto a 1-dimensional subspace, so any two projections will be proportional. After normalizing they will be the same up to a constant multiplicative factor of modulus 1.
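For instance, with the first alternative choice above (and assuming 0 < alpha < pi so that nothing vanishes):

[tex](I+V)\begin{pmatrix}0\cr 1\end{pmatrix}=\begin{pmatrix}\sin\,\alpha\,e^{-i\beta}\cr 1-\cos\,\alpha\end{pmatrix}\quad\longrightarrow\quad\frac{1}{2\,\sin\,\alpha/2}\begin{pmatrix}\sin\,\alpha\,e^{-i\beta}\cr 1-\cos\,\alpha\end{pmatrix}=e^{-i\beta}\begin{pmatrix}\cos\,\alpha/2\cr\sin\,\alpha/2\,e^{i\beta}\end{pmatrix},[/tex]

which is exactly the xi_1 of post #8 multiplied by the phase e^{-i beta}.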

It all follows from the simple algebra that you can prove:

Let P=(I+V)/2. Then P=P*=P^2 if and only if V=V* and VV*=I.
Let Q=(I-V)/2. Then Q=Q*=Q^2 if and only if V=V* and VV*=I.

In this case

P+Q=I
PQ=QP=0
VP=P
VQ=-Q

And then the textbook observation: P is an orthogonal projection if and only if P=P*=P^2 (hermitian idempotent)

With some little algebra experience all the above properties are evident, but when you see them the first time, they look like magic.

Another observation: since V has eigenvalues +1 and -1, P=(I+V)/2 has eigenvalues
(1+1)/2 and (1-1)/2, that is, 1 and 0 - so it is a projection. Then Q=I-P is the complementary projection.

Now you can take my result for the normalized eigenvector and compare it with the one you get by solving the linear equations etc. - the hard way. They should agree up to a multiplicative constant of modulus one; there is always that freedom.
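For completeness, the same recipe with (I-V) acting on the same psi gives the -1 eigenspinor (assuming sin(alpha/2) > 0, so that nothing vanishes):

[tex](I-V)\psi=\begin{pmatrix}1-\cos\,\alpha\cr -\sin\,\alpha\,e^{i\beta}\end{pmatrix},\qquad ||(I-V)\psi||=2\,\sin\,\alpha/2,\qquad \xi_2=\begin{pmatrix}\sin\,\alpha/2\cr -\cos\,\alpha/2\,e^{i\beta}\end{pmatrix},[/tex]

and you can check directly that xi_2 is orthogonal to xi_1, as it must be.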
 

FAQ: Find the normalized eigenspinors of the 2 x 2 matrix

What is the definition of eigenspinors?

Eigenspinors are the eigenvectors of a spin operator written as a matrix: two-component spinors that the matrix maps onto a multiple of themselves, the multiple being the corresponding eigenvalue.

Why is it important to find the normalized eigenspinors of a matrix?

Normalized eigenspinors provide a complete, orthonormal basis for the spinor space on which the matrix acts, which is crucial in many applications such as quantum mechanics and linear transformations.

How do you calculate the normalized eigenspinors of a 2 x 2 matrix?

To find the normalized eigenspinors, first calculate the eigenvalues of the matrix A. Then, for each eigenvalue lambda, solve the system of linear equations (A - lambda*I)psi = 0 for the eigenvector psi. Finally, normalize the resulting eigenvectors to obtain the normalized eigenspinors.
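For a quick numerical cross-check, here is a minimal sketch using numpy; the matrix below is just an illustrative example, not the one from the thread:

[code]
import numpy as np

# Illustrative Hermitian 2 x 2 matrix (sigma_x); any Hermitian matrix works here.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# numpy.linalg.eigh returns the eigenvalues in ascending order and
# already-normalized eigenvectors as the columns of the second output.
eigenvalues, eigenvectors = np.linalg.eigh(A)
print(eigenvalues)    # [-1.  1.]
print(eigenvectors)   # each column is a normalized eigenspinor
[/code]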

Can a matrix have more than two normalized eigenspinors?

A 2 x 2 matrix has at most two linearly independent eigenspinors, each determined only up to an overall phase. More generally, an n x n Hermitian matrix has n orthonormal eigenvectors, one for each eigenvalue counted with multiplicity.

How can the normalized eigenspinors of a matrix be used in practical applications?

The normalized eigenspinors can be used to find the diagonal form of a matrix, which simplifies calculations and allows for easier interpretation of the matrix's properties. They are also important in solving differential equations and analyzing the behavior of quantum systems.
