
Homework Help: Linear algebra, orthogonal matrix proof

  Aug 14, 2011 #1

    fluidistic

    Gold Member

    1. The problem statement, all variables and given/known data
    Demonstrate that the following propositions hold if A is an n×n real and orthogonal matrix:
    1) If [itex]\lambda[/itex] is a real eigenvalue of A, then [itex]\lambda =1[/itex] or [itex]\lambda =-1[/itex].
    2) If [itex]\lambda[/itex] is a complex eigenvalue of A, then the conjugate of [itex]\lambda[/itex] is also an eigenvalue of A.


    2. Relevant equations
    For part 1) I used the fact that A orthogonal implies [itex]A^{-1}=A^T[/itex], and also that [itex]\det A = \frac{1}{\det A^{-1}}[/itex] and [itex]\det A = \det A^T[/itex]. I didn't demonstrate the latter two relations; I just assumed them to be true.


    3. The attempt at a solution
    I've done part 1); I'm just too lazy to write down the detailed proof. I made use of the relevant equations I've written.
    However, for 2) I'm totally stuck at even setting up the problem. I started by writing [itex]\lambda = a+ib[/itex] and its conjugate [itex]\overline \lambda = a-ib[/itex], but this led me nowhere and I got stuck immediately.
    I think I read on Wikipedia yesterday that the eigenvalues of an orthogonal matrix all have modulus 1. If I remember well, A is diagonalizable, and in that form I should "see that all eigenvalues of A have modulus 1".
    If this is the simpler approach, please let me know. I could demonstrate that A is diagonalizable first and then try to go further.
    In the demonstration in 1), at one point I reach the fact that [itex]\lambda A^T = I = \lambda A[/itex]. So clearly A is... diagonal (this is not necessarily true, so this looks like a mistake... damn it.)

    My proof for 1):
    I must show that [itex]Ax=\lambda x \Rightarrow \lambda = \pm 1 \forall x \in \mathbb{R}^n[/itex].
    I multiply both sides by the inverse of A: [itex]A^{-1}Ax= A ^{-1} \lambda x \Rightarrow x= \lambda A^{-1}x[/itex].
    [itex]\Rightarrow \lambda A^{-1}=I \Rightarrow \det A^{-1}=\frac{1}{\lambda }[/itex]. But we also have that [itex]\lambda A^T=I[/itex] because A is orthogonal. Since A is a square matrix, [itex]\det A =\det A^T[/itex] too. Thus we have that [itex]\det A =\det A^T \Rightarrow \det (\lambda A ^T )=1 \Rightarrow \det (\lambda A )=1 \Rightarrow \det A = \frac{1}{\lambda}[/itex].
    [itex]\Rightarrow \det A = \det A^{-1}[/itex]. But for any invertible matrix A, [itex]\det A = \frac{1}{\det A ^{-1}}[/itex].
    So that if [itex]a = \det A[/itex], then [itex]a= \frac{1}{a} \Rightarrow a^2=1 \Rightarrow a = \pm 1[/itex]. And since [itex]\lambda = \frac{1}{\det A}[/itex], I reach [itex]\lambda = \pm 1[/itex].
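    (As a quick numerical sanity check of both claims, not a proof, one could run something like the following sketch, assuming NumPy is available; it builds a random orthogonal matrix via a QR factorization and inspects its eigenvalues.)

[code]
# Sketch (assumes NumPy): eigenvalues of a real orthogonal matrix should all
# have modulus 1, and the complex ones should come in conjugate pairs.
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))  # Q^T Q = I (numerically)

eigvals = np.linalg.eigvals(Q)
print(np.abs(eigvals))                            # all entries ~ 1.0
for lam in eigvals:                               # conjugate is also present
    assert np.min(np.abs(eigvals - np.conj(lam))) < 1e-10
[/code]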
    Any tip/help will be appreciated. Thanks.
     
  Aug 14, 2011 #2

    I like Serena

    Homework Helper

    Hi fluidistic! :smile:

    Let's start with your proof for 1).

    You infer [itex]x= \lambda A^{-1}x \Rightarrow \lambda A^{-1} = I[/itex].
    I'm afraid this is not generally true.
    The implication would be that A is a diagonal matrix, which it obviously doesn't have to be.

    As a hint for 1): what is |Ax|?

    As a hint for 2): suppose [itex]Av = \lambda v[/itex], what is the conjugate of (Av)?
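    (Spelled out, a sketch of where this hint leads: since A has real entries, conjugation passes through A entrywise, so
    [tex]\overline{Av} = \bar{A}\,\bar{v} = A\bar{v} \quad\text{and}\quad \overline{Av} = \overline{\lambda v} = \bar{\lambda}\,\bar{v} \;\Rightarrow\; A\bar{v} = \bar{\lambda}\,\bar{v},[/tex]
    i.e. [itex]\bar{\lambda}[/itex] is an eigenvalue of A with eigenvector [itex]\bar{v} \neq 0[/itex].)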
     
  Aug 14, 2011 #3
    [itex] \lambda A^{-1} [/itex] mapping all eigenvectors of A to themselves does not imply [itex] \lambda A^{-1} = I. [/itex]
     
  Aug 14, 2011 #4

    fluidistic

    Gold Member

    Hi and thanks to both of you guys.
    Oh, I see. I really wasn't aware of this, so I'm quite surprised. So basically I have a vector x in R^n and it equals a matrix times that exact same vector x. If this matrix isn't necessarily the identity, can you give me an example of such a matrix? Say, a 2x2 matrix. I just tried to find such an example myself and failed (probably due to some algebra mistake; I don't see what I did wrong).

    For 1), is it [itex]|\lambda x|[/itex]?
    For 2), here I'm confused. A is a real matrix, which I think means that all its entries are real. I realize that it can have complex eigenvalues though. Also, I thought that x was in R^n. It seems like x can be in C^n? Because otherwise I don't see how to reach [itex]\lambda x[/itex] with [itex]\lambda[/itex] being complex valued.
    I think I need to take a nap. I'm extremely tired, so I feel I'm not thinking as hard as I should right now, unlike when I started the problem.
    Thanks for any further push.
     
  Aug 14, 2011 #5

    I like Serena

    Homework Helper

    Try [itex]\begin{pmatrix}0 & 1 \\ 1 & 0\end{pmatrix}[/itex].


    Noooo, I didn't say x was an eigenvector. :rolleyes:
    Try to use the properties of an orthogonal matrix.


    Your problem doesn't say that x has to be real.
    And anyway, if lambda is complex, the corresponding eigenvector has to be complex too (can you prove that?)
    As you can see I avoided using x here.
    I'm thinking of x as any real valued vector, and I'm thinking of v as a specific eigenvector, which may be complex valued.
     
  Aug 14, 2011 #6

    vela

    Staff Emeritus
    Science Advisor
    Homework Helper
    Education Advisor

    Such a vector is called an eigenvector of the matrix. In your attempt, I noticed you said Ax=λx for all x in R^n, but that's not correct. It only holds for certain vectors, the eigenvectors of A.
    ILS wants you to use the definition of the norm of a vector and apply it to the vector Ax.
     
  Aug 14, 2011 #7

    fluidistic

    Gold Member

    Well I reach [itex]\begin {bmatrix} x_1 \\ x_2 \end{bmatrix}=\begin {bmatrix} x_2 \\ x_1 \end{bmatrix}[/itex]. So unless I'm mistaken this matrix doesn't work.


    Ah, I see! If I remember well something I read somewhere (I don't remember where), orthogonal matrices preserve lengths, so that |Ax|=|x|.




    I think I can prove it. I was having problems because I assumed x was in R^n.
    If lambda is complex while A has real entries and x also has real entries, then there's no way that multiplying/summing real numbers gives a complex number. So x has to be complex valued. (I have in mind the picture Ax = lambda x.)

    You mean x can be any vector, complex or real valued?

    Oh, thanks for pointing this out. I had a doubt about this for a moment and made an error with it. I assumed for some reason that having infinitely many eigenvectors couldn't be possible, while of course it is. Only the direction matters, not the length.

    [itex]||Ax||=\sqrt {<Ax,Ax>}[/itex] where <,> denotes the inner product. Or do you mean a specific norm?
     
  Aug 14, 2011 #8

    vela

    Staff Emeritus
    Science Advisor
    Homework Helper
    Education Advisor

    But it will work for specific vectors:
    [tex]
    \begin{align*}
    \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix}1 \\ 1\end{pmatrix} &= (1)\begin{pmatrix} 1 \\ 1\end{pmatrix} \\
    \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix}1 \\ -1\end{pmatrix} &= (-1)\begin{pmatrix} 1 \\ -1\end{pmatrix}
    \end{align*}
    [/tex]
    If you demand that every vector x in R^n satisfies Ax=λx, then you're right that A is a multiple of the identity matrix.
    I'm not sure what you're saying here. :)
    Now using the fact that A is orthogonal, you can show |Ax|=|x|, which is what I think ILS was trying to get you to see. Now suppose x is an eigenvector of A.
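    (The same computation is easy to sanity-check numerically; a minimal sketch, assuming NumPy is available:)

[code]
# Sketch (assumes NumPy): the swap matrix is orthogonal and has
# eigenvalues +1 and -1, matching the hand computation above.
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(A.T @ A)                        # identity, so A is orthogonal
print(sorted(np.linalg.eigvals(A)))   # [-1.0, 1.0]
print(A @ np.array([1.0, 1.0]))       # equals  1 * (1, 1)
print(A @ np.array([1.0, -1.0]))      # equals -1 * (1, -1)
[/code]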
     
  Aug 15, 2011 #9

    I like Serena

    Homework Helper

    I see vela has already given the answers that you need to move on.

    I'll just give a couple of additional comments. :smile:


    Yep! That's it. :)

    Yep!

    Whatever.

    Yes, I meant this one.
    When not specified, this one is always meant.
     
  Aug 15, 2011 #10

    fluidistic

    Gold Member

    Thanks once again guys!
    Well I'm stuck at showing that [itex]||Ax||=||x||[/itex]. I know I have to use the fact that [itex]||Ax||=\sqrt {<Ax,Ax>}[/itex] and that A is orthogonal ([itex]A^T=A^{-1}[/itex]).
    I'm thinking about using some properties of the inner product but I can't find any interesting for this case.
     
  Aug 15, 2011 #11

    I like Serena

    Homework Helper

    Ah, well, I didn't really intend for you to prove it.
    As far as I'm concerned it's simply a property of an orthogonal matrix.

    But if you want to prove it, you can use: ||Ax||^2 = (Ax)^T(Ax).
    Do you know how to simplify that?
     
  Aug 15, 2011 #12

    fluidistic

    Gold Member

    Yeah I'd rather prove it, otherwise I feel like I'm assuming what I want to prove. :)
    Oh, bright idea. I think I do know how to simplify it: [itex](Ax)^T(Ax)=x^TA^TAx=x^Tx=||x||^2[/itex]. Since [itex]||v|| \geq 0[/itex] for any vector v, we reach [itex]||Ax||=||x||[/itex]. I'm going to think about how to proceed further. Will post here as soon as I get results or get stuck.
    Thanks :)
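    (For what it's worth, the same identity is easy to sanity-check numerically; a minimal sketch, assuming NumPy:)

[code]
# Sketch (assumes NumPy): ||Qx|| == ||x|| for an orthogonal Q and any x.
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # random orthogonal Q
x = rng.standard_normal(4)
print(np.linalg.norm(Q @ x), np.linalg.norm(x))   # the two values agree
[/code]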
     
  Aug 15, 2011 #13

    fluidistic

    Gold Member

    What I get: Let x be an eigenvector associated with the eigenvalue lambda. [itex]Ax= \lambda x \Rightarrow |Ax|= |\lambda x | =|x| =|\lambda || x|\Rightarrow |\lambda |=1[/itex]. Thus if [itex]\lambda \in \mathbb{R}[/itex] as stated, then [itex]\lambda = 1[/itex] or [itex]-1[/itex].
     
  Aug 16, 2011 #14

    I like Serena

    Homework Helper

    Good! :smile:

    (Although in a proper proof, you should mention that you are using the property of an orthogonal matrix that it leaves the norm of a vector invariant.
    Of course, I already know in this case. :wink:)


    As an afterthought, you may want to distinguish the vector norm ||*|| from the absolute value |*| (when applied to lambda) here.
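    Concretely, with that distinction the chain would read something like
    [tex]\|x\| = \|Ax\| = \|\lambda x\| = |\lambda|\,\|x\| \;\Rightarrow\; |\lambda| = 1,[/tex]
    using [itex]\|x\| \neq 0[/itex] since an eigenvector is nonzero.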
     
  Aug 18, 2011 #15

    fluidistic

    Gold Member

    Thanks once again. I've been so busy I can't even believe it's been 3 (oh, 4 as of this minute) days since I last wrote in this thread.
    Yeah, on my draft I've redone the exercise and made use of the norm of lambda*x and the modulus of lambda; I think I did it well.
    Thanks for pointing this out though. :)
     