Linear algebra, orthogonal matrix proof

Summary
The discussion revolves around proving two propositions about the eigenvalues of an n x n real orthogonal matrix A. The first states that any real eigenvalue \lambda of A must be 1 or -1, which is demonstrated using the fact that orthogonal matrices preserve vector lengths. The second states that if \lambda is a complex eigenvalue of A, then its conjugate is also an eigenvalue. Along the way, the participants work through what eigenvectors are, why complex eigenvalues force complex eigenvectors, and how the defining properties of orthogonal matrices enter each proof.
fluidistic

Homework Statement


Demonstrate that the following propositions hold if A is an n x n real orthogonal matrix:
1) If \lambda is a real eigenvalue of A, then \lambda = 1 or \lambda = -1.
2) If \lambda is a complex eigenvalue of A, then the conjugate of \lambda is also an eigenvalue of A.


Homework Equations


For part 1) I used the fact that A orthogonal implies A^{-1}=A^T. Also that \det A = \frac{1}{\det A^{-1}}, and that \det A = \det A^T. I didn't demonstrate the latter two relations; I just assumed them to be true.


The Attempt at a Solution


I've done part 1); I'm just too lazy to write down the detailed proof. I made use of the relevant equations I've written.
However for 2) I'm totally stuck even setting up the problem. I started by writing \lambda = a+ib and its conjugate \overline \lambda = a-ib, but this led me nowhere and I got stuck immediately.
I think I read on Wikipedia yesterday that the eigenvalues of an orthogonal matrix all have modulus 1. If I remember well, A is diagonalizable, and put under this form I should "see that all eigenvalues of A have modulus 1".
If this is the simpler approach, please let me know. I could demonstrate that A is diagonalizable first and then try to go further.
In the demonstration in 1), at one point I reach the fact that \lambda A^T = I = \lambda A. So clearly A is... diagonal (this is not necessarily true, so this looks like a mistake... damn it.)

My proof for 1):
I must show that Ax=\lambda x \Rightarrow \lambda = \pm 1 \forall x \in \mathbb{R}^n.
I multiply both sides by the inverse of A: A^{-1}Ax= A ^{-1} \lambda x \Rightarrow x= \lambda A^{-1}x.
\Rightarrow \lambda A^{-1}=I \Rightarrow \det A^{-1}=\frac{1}{\lambda }. But we also have that \lambda A^T=I because A is orthogonal. Since A is a square matrix, \det A =\det A^T too. Thus we have that \det A =\det A^T \Rightarrow \det (\lambda A ^T )=1 \Rightarrow \det (\lambda A )=1 \Rightarrow \det A = \frac{1}{\lambda}.
\Rightarrow \det A = \det A^{-1}. But for any invertible matrix A, \det A = \frac{1}{\det A ^{-1}}.
So that if a = \det A, then a= \frac{1}{a} \Rightarrow a^2=1 \Rightarrow a = \pm 1. And since \lambda = \frac{1}{\det A}, I reach \lambda = \pm 1.
Any tip/help will be appreciated. Thanks.
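(As a numeric sanity check of both propositions, not a proof, here is a minimal sketch assuming numpy is available; the QR factorization of a random matrix yields an orthogonal Q:)

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))  # Q is orthogonal: Q.T @ Q = I

w = np.linalg.eigvals(Q)
print(np.abs(w))           # every modulus is 1, so any real eigenvalue is +1 or -1
print(np.sort_complex(w))  # the complex eigenvalues appear in conjugate pairs
```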
 
Hi fluidistic! :smile:

Let's start with your proof for 1).

You infer x= \lambda A^{-1}x \Rightarrow \lambda A^{-1} = I.
I'm afraid this is not generally true.
The implication would be that A is a diagonal matrix, which it obviously doesn't have to be.

As a hint for 1): what is |Ax|?

As a hint for 2): suppose Av = \lambda v, what is the conjugate of (Av)?
 
x= \lambda A^{-1}x \Rightarrow \lambda A^{-1}=I.

\lambda A^{-1} mapping all eigenvectors of A to themselves does not imply \lambda A^{-1} = I.
 
Hi and thanks to both of you guys.
I like Serena said:
Hi fluidistic! :smile:

Let's start with your proof for 1).

You infer x= \lambda A^{-1}x \Rightarrow \lambda A^{-1} = I.
I'm afraid this is not generally true.
The implication would be that A is a diagonal matrix, which it obviously doesn't have to be.
Oh I see. I really wasn't aware of this, so I'm quite surprised. So basically I have a vector x in R^n, and it equals a matrix times that same vector x. If this matrix isn't necessarily the identity, can you give me an example of such a matrix? Say a 2x2 matrix. I just tried to find such an example myself and failed (probably due to some algebra mistake; I don't see what I did wrong).

As a hint for 1): what is |Ax|?

As a hint for 2): suppose Av = \lambda v, what is the conjugate of (Av)?
For 1), is it |\lambda x|?
For 2), here I'm confused. A is a real matrix; I think it means that all its entries are real. I realize that it can have complex eigenvalues though. Also, I thought that x was in R^n. It seems like x can be in C^n? Because otherwise I don't see how to reach \lambda x with \lambda being complex valued.
I think I need to take a nap; I'm extremely tired, so I feel I'm not thinking as clearly as I should right now, unlike when I started the problem.
Thanks for any further push.
 
fluidistic said:
Hi and thanks to both of you guys.

Oh I see. I really wasn't aware of this, so I'm quite surprised. So basically I have a vector x in R^n, and it equals a matrix times that same vector x. If this matrix isn't necessarily the identity, can you give me an example of such a matrix? Say a 2x2 matrix. I just tried to find such an example myself and failed (probably due to some algebra mistake; I don't see what I did wrong).

Try \begin{pmatrix}0 & 1 \\ 1 & 0\end{pmatrix}.


fluidistic said:
For 1), is it |\lambda x|?

Noooo, I didn't say x was an eigenvector. :rolleyes:
Try to use the properties of an orthogonal matrix.


fluidistic said:
For 2), here I'm confused. A is a real matrix; I think it means that all its entries are real. I realize that it can have complex eigenvalues though. Also, I thought that x was in R^n. It seems like x can be in C^n? Because otherwise I don't see how to reach \lambda x with \lambda being complex valued.
I think I need to take a nap; I'm extremely tired, so I feel I'm not thinking as clearly as I should right now, unlike when I started the problem.
Thanks for any further push.

Your problem doesn't say that x has to be real.
And anyway, if lambda is complex, the corresponding eigenvector has to be complex too (can you prove that?)
As you can see I avoided using x here.
I'm thinking of x as any real valued vector, and I'm thinking of v as a specific eigenvector, which may be complex valued.
 
fluidistic said:
Oh I see. I really wasn't aware of this, so I'm quite surprised. So basically I have a vector x in R^n, and it equals a matrix times that same vector x. If this matrix isn't necessarily the identity, can you give me an example of such a matrix? Say a 2x2 matrix. I just tried to find such an example myself and failed (probably due to some algebra mistake; I don't see what I did wrong).
Such a vector is called an eigenvector of the matrix. In your attempt, I noticed you said Ax=λx for all x in R^n, but that's not correct. It only holds for certain vectors, the eigenvectors of A.
For 1), is it |\lambda x|?
ILS wants you to use the definition of the norm of a vector and apply it to the vector Ax.
 
I like Serena said:
Try \begin{pmatrix}0 & 1 \\ 1 & 0\end{pmatrix}.
Well I reach \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}=\begin{bmatrix} x_2 \\ x_1 \end{bmatrix}. So unless I'm mistaken this matrix doesn't work.


Noooo, I didn't say x was an eigenvector. :rolleyes:
Try to use the properties of an orthogonal matrix.
Ah I see! If I remember well something I read somewhere (I don't remember where), orthogonal matrices preserve lengths, so that |Ax|=|x|.




Your problem doesn't say that x has to be real.
And anyway, if lambda is complex, the corresponding eigenvector has to be complex too (can you prove that?)
I think I can prove it. I was having problems because I assumed x was in R^n.
If lambda is complex while A has real entries and x also has real entries, then there's no way that products and sums of real numbers give a complex number. So x has to be complex valued. (I have in mind the picture Ax = lambda x.)
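(As a concrete illustration of this point, a minimal sketch assuming numpy: a 2x2 rotation matrix is real and orthogonal, its eigenvalues are the conjugate pair \cos\theta \pm i\sin\theta, and the corresponding eigenvectors are necessarily complex.)

```python
import numpy as np

theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # real orthogonal (a rotation)

w, v = np.linalg.eig(R)
print(w)  # 0.5 + 0.866j and 0.5 - 0.866j: a conjugate pair, each of modulus 1
print(v)  # the eigenvectors have complex entries, as argued above
```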

As you can see I avoided using x here.
I'm thinking of x as any real valued vector, and I'm thinking of v as a specific eigenvector, which may be complex valued.
You mean x can be either complex or real valued?

vela said:
Such a vector is called an eigenvector of the matrix. In your attempt, I noticed you said Ax=λx for all x in R^n, but that's not correct. It only holds for certain vectors, the eigenvectors of A.
Oh thanks for pointing this out. I had the doubt for a moment and made an error with this. I assumed for some reason that having an infinity of eigenvectors couldn't be possible, while of course it is. Only their direction matters, not their lengths.

ILS wants you to use the definition of the norm of a vector and apply it to the vector Ax.
||Ax||=\sqrt{\langle Ax,Ax \rangle} where \langle , \rangle denotes the inner product. Or do you mean a specific norm?
 
fluidistic said:
Well I reach \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}=\begin{bmatrix} x_2 \\ x_1 \end{bmatrix}. So unless I'm mistaken this matrix doesn't work.
But it will work for specific vectors:
\begin{align*}
\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix}1 \\ 1\end{pmatrix} &= (1)\begin{pmatrix} 1 \\ 1\end{pmatrix} \\
\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix}1 \\ -1\end{pmatrix} &= (-1)\begin{pmatrix} 1 \\ -1\end{pmatrix}
\end{align*}
If you demand that every vector x in Rn satisfies Ax=λx, then you're right that A is a multiple of the identity matrix.
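(A quick numeric check of these two eigenpairs, as a minimal sketch assuming numpy:)

```python
import numpy as np

# The swap matrix is not the identity, yet it fixes specific vectors.
M = np.array([[0, 1],
              [1, 0]])
print(M @ np.array([1, 1]))   # [ 1  1]: M x = x, so eigenvalue 1
print(M @ np.array([1, -1]))  # [-1  1] = (-1)*(1, -1), so eigenvalue -1
```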
Oh thanks for pointing this out. I had the doubt for a moment and made an error with this. I assumed for some reason that having an infinity of eigenvectors couldn't be possible, while of course it is. Only their direction matters, not their lengths.
I'm not sure what you're saying here. :)
||Ax||=\sqrt{\langle Ax,Ax \rangle} where \langle , \rangle denotes the inner product. Or do you mean a specific norm?
Now using the fact that A is orthogonal, you can show |Ax|=|x|, which is what I think ILS was trying to get you to see. Now suppose x is an eigenvector of A.
 
I see vela has already given the answers that you need to move on.

I'll just give a couple of additional comments. :smile:


fluidistic said:
Ah I see! If I remember well something I read somewhere (I don't remember where), orthogonal matrices preserve lengths, so that |Ax|=|x|.

Yep! That's it. :)

fluidistic said:
I think I can prove it. I was having problems because I assumed x was in R^n.
If lambda is complex while A has real entries and x also has real entries, then there's no way that products and sums of real numbers give a complex number. So x has to be complex valued. (I have in mind the picture Ax = lambda x.)

Yep!

fluidistic said:
You mean x can be either complex or real valued?

Whatever.

fluidistic said:
||Ax||=\sqrt{\langle Ax,Ax \rangle} where \langle , \rangle denotes the inner product. Or do you mean a specific norm?

Yes, I meant this one.
When not specified, this one is always meant.
 
Thanks once again guys!
Well I'm stuck at showing that ||Ax||=||x||. I know I have to use the fact that ||Ax||=\sqrt{\langle Ax,Ax \rangle} and that A is orthogonal (A^T=A^{-1}).
I'm thinking about using some properties of the inner product, but I can't find any useful ones for this case.
 
fluidistic said:
Thanks once again guys!
Well I'm stuck at showing that ||Ax||=||x||. I know I have to use the fact that ||Ax||=\sqrt{\langle Ax,Ax \rangle} and that A is orthogonal (A^T=A^{-1}).
I'm thinking about using some properties of the inner product, but I can't find any useful ones for this case.

Ah, well, I didn't really intend for you to prove it.
As far as I'm concerned it's simply a property of an orthogonal matrix.

But if you want to prove it, you can use: ||Ax||^2 = (Ax)^T(Ax).
Do you know how to simplify that?
 
I like Serena said:
Ah, well, I didn't really intend for you to prove it.
As far as I'm concerned it's simply a property of an orthogonal matrix.

But if you want to prove it, you can use: ||Ax||^2 = (Ax)^T(Ax).
Do you know how to simplify that?

Yeah, I'd rather prove it; otherwise I feel like I'm assuming what I want to prove. :)
Oh, bright idea. I think I do know how to simplify it: (Ax)^T(Ax)=x^TA^TAx=x^Tx=||x||^2. Since ||v|| \geq 0 for any vector v, taking square roots gives ||Ax||=||x||. I'm going to think about how to proceed further. Will post here as soon as I get results or get stuck.
Thanks :)
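(A quick numeric check of this computation, as a minimal sketch assuming numpy; Q comes from a QR factorization, which makes it orthogonal:)

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # orthogonal: Q.T @ Q = I
x = rng.standard_normal(4)

# (Qx)^T (Qx) = x^T Q^T Q x = x^T x, so the norm is unchanged
print(np.linalg.norm(Q @ x), np.linalg.norm(x))   # equal up to rounding error
```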
 
What I get: Let x be an eigenvector associated with the eigenvalue \lambda. Ax = \lambda x \Rightarrow |Ax| = |\lambda x| = |\lambda | |x|. But |Ax| = |x|, so |\lambda | = 1. Thus if \lambda \in \mathbb{R} as stated, then \lambda = 1 or -1.
 
fluidistic said:
What I get: Let x be an eigenvector associated with the eigenvalue \lambda. Ax = \lambda x \Rightarrow |Ax| = |\lambda x| = |\lambda | |x|. But |Ax| = |x|, so |\lambda | = 1. Thus if \lambda \in \mathbb{R} as stated, then \lambda = 1 or -1.

Good! :smile:

(Although in a proper proof, you should mention that you are using the property of an orthogonal matrix that the norm of a vector is invariant.
Of course, I already know in this case. :wink:)

As an afterthought, you may want to distinguish the vector norm ||*|| from the absolute value |*| (when applied to lambda) here.
 
Thanks once again. I've been so busy I can't even believe it's been 3 (oh, 4 as of this minute) days since I last wrote in this thread.
Yeah, on my draft I've redone the exercise and made use of the norm of \lambda x and the modulus of \lambda; I think I did it well.
Thanks for pointing this out though. :)
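(For completeness, here is a minimal sketch of where ILS's hint for part 2 leads, assuming only that A has real entries; the thread leaves this last step to the reader.)

```latex
% Part 2, sketched: conjugate both sides of the eigenvalue equation.
% Since A has real entries, \overline{Av} = \overline{A}\,\overline{v} = A\overline{v}.
\[
  Av = \lambda v
  \;\Longrightarrow\;
  A\overline{v} = \overline{Av} = \overline{\lambda v} = \overline{\lambda}\,\overline{v},
\]
% so \overline{\lambda} is an eigenvalue of A, with eigenvector \overline{v} \neq 0.
```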
 
