Are Unitary Operator Eigenvalues Always Modulus 1 and Eigenvectors Orthogonal?

Lawrencel2

Homework Statement


I know that unitary operators act similarly to hermitian operators.
I want to prove that the eigenvalues of unitary operators are complex numbers of modulus 1, and that unitary operators have orthogonal eigenvectors.

Homework Equations


##U^\dagger U = I##
##U^{-1} = U^\dagger##
##\lambda = e^{i\phi}## (eigenvalue form)


The Attempt at a Solution


For the eigenvalues having modulus 1, I wasn't sure if I started far enough back, but I have:
  • I read in Sakurai that eigenvalues of unitary operators have the form ##\lambda = e^{i\phi}##

##U|a\rangle = \lambda|a\rangle##
##\langle a|U^\dagger = \lambda^*\langle a|##
##\langle a|U^\dagger U|a\rangle = \lambda^*\lambda\langle a|a\rangle##
##\langle a|I|a\rangle = |\lambda|^2\langle a|a\rangle##
  • But then I get to the point where I am already assuming orthonormality, and not showing that it is complex.
##1 = |\lambda|^2## where ##\langle a|a\rangle = 1##
  • The only other way I can see to show this is by asserting that the form IS ##\lambda = e^{i\phi}##, which I can then expand with Euler's formula and use the basic modulus-squared form. This gives me 1. At this point I don't know where to start when trying to show orthogonality...
  • We used this argument in lecture to show orthogonality of a hermitian operator's eigenvectors:

##\langle a'|A|a\rangle - \langle a'|A|a\rangle = (\lambda^* - \lambda)\langle a'|a\rangle##
And since A is self-adjoint, the LHS equals zero.
Any tips on how I should be viewing this?
 
Lawrencel2 said:

Homework Statement


I know that unitary operators act similarly to hermitian operators.
I want to prove that the eigenvalues of unitary operators are complex numbers of modulus 1, and that unitary operators have orthogonal eigenvectors.

Homework Equations


##U^\dagger U = I##
##U^{-1} = U^\dagger##
##\lambda = e^{i\phi}## (eigenvalue form)

The Attempt at a Solution


For the eigenvalues having modulus 1, I wasn't sure if I started far enough back, but I have:
  • I read in Sakurai that eigenvalues of unitary operators have the form ##\lambda = e^{i\phi}##
##U|a\rangle = \lambda|a\rangle##
##\langle a|U^\dagger = \lambda^*\langle a|##
##\langle a|U^\dagger U|a\rangle = \lambda^*\lambda\langle a|a\rangle##
##\langle a|I|a\rangle = |\lambda|^2\langle a|a\rangle##
  • But then I get to the point where I am already assuming orthonormality, and not showing that it is complex.
No: so far you have used ##\ U^\dagger = U^{-1}## to conclude ##\lambda^*\lambda = 1##
##1 = |\lambda|^2## where ##\langle a|a\rangle = 1##
  • The only other way I can see to show this is by asserting that the form IS ##\lambda = e^{i\phi}##, which I can then expand with Euler's formula and use the basic modulus-squared form. This gives me 1. At this point I don't know where to start when trying to show orthogonality...
  • We used this argument in lecture to show orthogonality of a hermitian operator's eigenvectors:
##\langle a'|A|a\rangle - \langle a'|A|a\rangle = (\lambda^* - \lambda)\langle a'|a\rangle##
And since A is self-adjoint, the LHS equals zero.
Any tips on how I should be viewing this?
(with a magnifying glass, all these [SUP] tags! :) )

Corny, sorry.

Your left-hand side is zero, your right-hand side is ##(\lambda'^* - \lambda)\langle a'|a\rangle## (not ##(\lambda^* - \lambda)\ldots## ).
So one of the two factors must be zero.
From there, you can proceed to demonstrating orthogonality​
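Spelled out for the hermitian case: evaluating ##\langle a'|A|a\rangle## once with ##A## acting on the ket and once with ##A = A^\dagger## acting on the bra gives
$$0 = (\lambda'^* - \lambda)\langle a'|a\rangle ,$$
so with real (hermitian) eigenvalues and ##\lambda' \neq \lambda##, the factor ##\langle a'|a\rangle## is the one that must vanish.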
 
Sorry, I have no clue how all those superscripts got in there. I tried to remove them.

As per your note, I see now that I have used that to show the modulus squared is one, but I think I have failed to show that it is complex. The only other way I can imagine showing that is assuming it has the form ##e^{i\phi}## and using Euler's formula to expand it.
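For reference, that expansion is just
$$|\lambda|^2 = |e^{i\phi}|^2 = \cos^2\phi + \sin^2\phi = 1 .$$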

I was trying to start with the same argument we used to show that hermitian operators have orthogonal eigenvectors (where I used operator A).

I tried to follow the same steps and I arrive at the same basic equation:
##\langle a'|U|a\rangle = \lambda\langle a'|a\rangle## ; ##\langle a'|U|a\rangle = \lambda'^*\langle a'|a\rangle##
Subtracting the two:
##\langle a'|U|a\rangle - \langle a'|U|a\rangle = (\lambda'^* - \lambda)\langle a'|a\rangle##
Now, because I know they ARE orthogonal, I should be able to make an argument about the LHS equaling zero, but I can't say why it is...
 
First, eigenvalues are in general complex - you only need to prove that the modulus is one, which you have. For the orthogonality of the eigen-basis, just do the exact same thing you did to show that the modulus of the eigenvalues is one - except use two different eigenvectors. You get a relation of the form x = ( # different than 1)*x, and there is only one value of x for which this holds true.

edit: of course, this is assuming non-degeneracy
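In symbols, a sketch of that relation, with ##|a\rangle## and ##|a'\rangle## eigenvectors belonging to distinct eigenvalues ##\lambda \neq \lambda'##:
$$\langle a'|a\rangle = \langle a'|U^\dagger U|a\rangle = \lambda'^*\lambda\,\langle a'|a\rangle .$$
This has the form ##x = (\lambda'^*\lambda)\,x## with ##x = \langle a'|a\rangle##, and since ##|\lambda'| = 1## gives ##\lambda'^* = 1/\lambda'##, the prefactor equals ##\lambda/\lambda' \neq 1##.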
 
DelcrossA said:
First, eigenvalues are in general complex - you only need to prove that the modulus is one, which you have. For the orthogonality of the eigen-basis, just do the exact same thing you did to show that the modulus of the eigenvalues is one - except use two different eigenvectors. You get a relation of the form x = ( # different than 1)*x, and there is only one value of x for which this holds true.

edit: of course, this is assuming non-degeneracy
Only thing is, what you call a "# different than 1" isn't different from 1: ##\lambda'^*\lambda## works out to be one due to its Euler-type form.
 
Lawrencel2 said:
Only thing is, what you call a "# different than 1" isn't different from 1: ##\lambda'^*\lambda## works out to be one due to its Euler-type form.

How? The two eigenvalues are different, and so the phases are different.
 
DelcrossA said:
How? The two eigenvalues are different, and so the phases are different.
Right, so when you multiply one ##\cos(\,)+i\sin(\,)## by the other, the sum-difference formulas just combine everything into a new angle - or you just add the exponents while it's in exponential form.
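That is, writing ##\lambda = e^{i\phi}## and ##\lambda' = e^{i\phi'}##,
$$\lambda'^*\lambda = e^{-i\phi'}\,e^{i\phi} = e^{i(\phi - \phi')} ,$$
a single new phase.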
 
Ooooo, but you make a point that I glanced over. Let's see if it helps. What I said was correct, but what you said was also correct, so thank you. I'll see if it works.
 
So you are saying that exp(iθ) is always equal to 1 regardless of θ? That isn't true
 
DelcrossA said:
So you are saying that exp(iθ) is always equal to 1 regardless of θ? That isn't true
No, that the modulus of ##e^{i\theta}## is always 1. But I worded some things wrong and convinced myself of the error I had written. Thank you for that enlightenment. I don't know how I convinced myself of what I had on my paper. I blame it on the caffeine.
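In other words,
$$\left|e^{i\theta}\right| = 1 \ \text{ for every real } \theta , \qquad \text{while} \qquad e^{i\theta} = 1 \ \text{ only when } \theta \text{ is a multiple of } 2\pi .$$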
 
No problem - mix ups like that happen to me all the time.
 