# Linear algebra, orthogonal matrix proof

1. Aug 14, 2011

### fluidistic

1. The problem statement, all variables and given/known data
Demonstrate that the following propositions hold if A is an nxn real and orthogonal matrix:
1)If $\lambda$ is a real eigenvalue of A then $\lambda =1$ or $-1$.
2)If $\lambda$ is a complex eigenvalue of A, the conjugate of $\lambda$ is also an eigenvalue of A.

2. Relevant equations
For part 1) I used the fact that A orthogonal implies $A^{-1}=A^T$. Also that $\det A = \frac{1}{\det A^{-1}}$, and that $\det A = \det A^T$. I didn't prove the latter two relations; I just assumed them to be true.

3. The attempt at a solution
I've done part 1); I'm just too lazy to write down the detailed proof. I made use of the relevant equations I've written.
However, for 2) I'm totally stuck even setting up the problem. I started by writing $\lambda = a+ib$ and its conjugate $\overline \lambda = a-ib$, but this led me nowhere and I got stuck immediately.
I think I read on Wikipedia yesterday that the eigenvalues of an orthogonal matrix all have modulus 1. If I remember correctly, A is diagonalizable, and put in that form I should "see that all eigenvalues of A have modulus 1".
If this is the simpler approach, please let me know. I could demonstrate that A is diagonalizable first and then try to go further.
In the demonstration in 1), at one point I reach the fact that $\lambda A ^T =I=\lambda A$. So clearly A is... diagonal (this is not necessarily true so this looks like a mistake... damn it.)

My proof for 1):
I must show that $Ax=\lambda x \Rightarrow \lambda = \pm 1 \;\forall x \in \mathbb{R}^n$.
I multiply both sides by the inverse of A: $A^{-1}Ax= A ^{-1} \lambda x \Rightarrow x= \lambda A^{-1}x$.
$\Rightarrow \lambda A^{-1}=I \Rightarrow \det A^{-1}=\frac{1}{\lambda }$. But we also have that $\lambda A^T=I$ because A is orthogonal. Since A is a square matrix, $\det A =\det A^T$ too. Thus we have that $\det A =\det A^T \Rightarrow \det (\lambda A ^T )=1 \Rightarrow \det (\lambda A )=1 \Rightarrow \det A = \frac{1}{\lambda}$.
$\Rightarrow \det A = \det A^{-1}$. But for any invertible matrix A, $\det A = \frac{1}{\det A ^{-1}}$.
So that if $a = \det A$, then $a= \frac{1}{a} \Rightarrow a^2=1 \Rightarrow a = \pm 1$. And since $\lambda = \frac{1}{\det A}$, I reach $\lambda = \pm 1$.
Any tip/help will be appreciated. Thanks.

2. Aug 14, 2011

### I like Serena

Hi fluidistic!

You infer $x= \lambda A^{-1}x \Rightarrow \lambda A^{-1} = I$.
I'm afraid this is not generally true.
The implication would be that A is a diagonal matrix, which it obviously doesn't have to be.

As a hint for 1): what is |Ax|?

As a hint for 2): suppose $Av = \lambda v$, what is the conjugate of (Av)?
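
For reference, this hint unwinds in one line; a sketch, using only that A has real entries (so $\overline{A} = A$) and that conjugation distributes over products and sums:

```latex
\overline{Av} = \overline{A}\,\overline{v} = A\overline{v},
\qquad
\overline{Av} = \overline{\lambda v} = \overline{\lambda}\,\overline{v},
```

so $A\overline{v} = \overline{\lambda}\,\overline{v}$, i.e. $\overline{\lambda}$ is an eigenvalue of A with eigenvector $\overline{v}$.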

3. Aug 14, 2011

### upsidedowntop

$\lambda A^{-1}$ mapping all eigenvectors of A to themselves does not imply $\lambda A^{-1} = I.$

4. Aug 14, 2011

### fluidistic

Hi and thanks to both of you guys.
Oh I see. I really wasn't aware of this, so I'm quite surprised. So basically I have a vector x in R^n that equals a matrix times exactly the same vector x. If this matrix isn't necessarily the identity, can you give me an example of such a matrix? Say a 2x2 matrix. I just tried to find such an example myself and failed (probably due to some algebra mistake; I don't see what I did wrong).

For 1), is it $|\lambda x|$?
For 2), here I'm confused. A is a real matrix, which I take to mean that all its entries are real. I realize that it can have complex eigenvalues though. Also, I thought that x was in R^n. It seems x can be in C^n? Because otherwise I don't see how to reach $\lambda x$ with $\lambda$ complex valued.
I think I need to take a nap; I'm extremely tired, so I feel I'm not thinking as clearly as I should right now, unlike when I started the problem.
Thanks for any further push.

5. Aug 14, 2011

### I like Serena

Try $\begin{pmatrix}0 & 1 \\ 1 & 0\end{pmatrix}$.

Noooo, I didn't say x was an eigenvector.
Try to use the properties of an orthogonal matrix.

Your problem doesn't say that x has to be real.
And anyway, if lambda is complex, the corresponding eigenvector has to be complex too (can you prove that?)
As you can see I avoided using x here.
I'm thinking of x as any real valued vector, and I'm thinking of v as a specific eigenvector, which may be complex valued.

6. Aug 14, 2011

### vela

Staff Emeritus
Such a vector is called an eigenvector of the matrix. In your attempt, I noticed you said Ax=λx for all x in $\mathbb{R}^n$, but that's not correct. It only holds for certain vectors, the eigenvectors of A.
ILS wants you to use the definition of the norm of a vector and apply it to the vector Ax.

7. Aug 14, 2011

### fluidistic

Well I reach $\begin {bmatrix} x_1 \\ x_2 \end{bmatrix}=\begin {bmatrix} x_2 \\ x_1 \end{bmatrix}$. So unless I'm mistaken this matrix doesn't work.

Ah I see! If I remember correctly something I read somewhere, orthogonal matrices preserve lengths, so that |Ax|=|x|.

I think I can prove it. I was having problems because I assumed x was in R^n.
If lambda is complex while A and x both have real entries, then there's no way a multiplication/sum of real numbers gives a complex number. So x has to be complex valued. (I have the picture Ax = lambda x in mind.)

You mean x can be any vector, complex or real valued?

Oh thanks for pointing this out. I had a doubt for a moment and made an error here. For some reason I assumed that having infinitely many eigenvectors couldn't be possible, while of course it is. Only the direction of an eigenvector matters, not its length.

$||Ax||=\sqrt{\langle Ax,Ax\rangle}$ where $\langle\,,\rangle$ denotes the inner product. Or do you mean a specific norm?

8. Aug 14, 2011

### vela

Staff Emeritus
But it will work for specific vectors:
\begin{align*}
\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix}1 \\ 1\end{pmatrix} &= (1)\begin{pmatrix} 1 \\ 1\end{pmatrix} \\
\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix}1 \\ -1\end{pmatrix} &= (-1)\begin{pmatrix} 1 \\ -1\end{pmatrix}
\end{align*}
If you demand that every vector x in $\mathbb{R}^n$ satisfies Ax=λx, then you're right that A is a multiple of the identity matrix.
I'm not sure what you're saying here. :)
Now using the fact that A is orthogonal, you can show |Ax|=|x|, which is what I think ILS was trying to get you to see. Now suppose x is an eigenvector of A.
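
The two eigenpairs above can be checked numerically; a minimal sketch in plain Python (the `matvec` helper is written just for this illustration):

```python
# Check the two eigenpairs of the swap matrix A = [[0, 1], [1, 0]]
# shown above: Av = (+1)v for v = (1, 1), and Av = (-1)v for v = (1, -1).

def matvec(A, x):
    """Multiply a 2x2 matrix A by a length-2 vector x."""
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

A = [[0, 1], [1, 0]]

print(matvec(A, [1, 1]))    # Av for v = (1, 1):  equals (+1) * v
print(matvec(A, [1, -1]))   # Av for v = (1, -1): equals (-1) * v
```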

9. Aug 15, 2011

### I like Serena

I see vela has already given the answers that you need to move on.

Yep! That's it. :)

Yep!

Whatever.

Yes, I meant this one.
When not specified, this one is always meant.

10. Aug 15, 2011

### fluidistic

Thanks once again guys!
Well I'm stuck at showing that $||Ax||=||x||$. I know I have to use the fact that $||Ax||=\sqrt {<Ax,Ax>}$ and that A is orthogonal ($A^T=A^{-1}$).
I'm thinking about using some properties of the inner product but I can't find any interesting for this case.

11. Aug 15, 2011

### I like Serena

Ah, well, I didn't really intend for you to prove it.
As far as I'm concerned it's simply a property of an orthogonal matrix.

But if you want to prove it, you can use: $||Ax||^2 = (Ax)^T(Ax)$.
Do you know how to simplify that?

12. Aug 15, 2011

### fluidistic

Yeah I'd rather prove it, otherwise I feel like I'm assuming what I want to prove. :)
Oh, bright idea. I think I do know how to simplify it: $(Ax)^T(Ax)=x^TA^TAx=x^Tx=||x||^2$. Since $||v|| \geq 0$ for any vector v, we get $||Ax||=||x||$. I'm going to think about how to proceed further and will post here as soon as I get results or get stuck.
Thanks :)
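
The identity just derived can also be checked numerically; a minimal sketch using a 2x2 rotation matrix as one concrete orthogonal matrix (the angle and test vector are arbitrary choices, not from the thread):

```python
import math

def rotation(theta):
    """A 2x2 rotation matrix: orthogonal, since R^T R = I."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def matvec(A, x):
    """Multiply a 2x2 matrix A by a length-2 vector x."""
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

def norm(x):
    """Euclidean norm of a length-2 vector."""
    return math.hypot(x[0], x[1])

A = rotation(0.7)    # arbitrary angle
x = [3.0, -4.0]      # arbitrary vector with ||x|| = 5
Ax = matvec(A, x)

print(norm(x), norm(Ax))  # equal up to rounding: ||Ax|| = ||x||
```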

13. Aug 15, 2011

### fluidistic

What I get: let x be an eigenvector associated with the eigenvalue lambda. Then $Ax= \lambda x \Rightarrow ||Ax||= ||\lambda x || =|\lambda |\, || x||$, and since $||Ax||=||x||$, it follows that $|\lambda |=1$. Thus if $\lambda \in \mathbb{R}$ as stated, then $\lambda = 1$ or $-1$.
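
For part 2), the same kind of numeric sketch illustrates the claim (an assumed example, not the thread's proof): a 2x2 rotation matrix is real and orthogonal with genuinely complex eigenvalues, and those eigenvalues have modulus 1 and form a conjugate pair.

```python
import cmath
import math

theta = 0.7  # arbitrary angle
# 2x2 rotation matrix: real, orthogonal, with complex eigenvalues.
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

tr = A[0][0] + A[1][1]                       # trace
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # determinant

# The eigenvalues are the roots of lambda^2 - tr*lambda + det = 0.
disc = cmath.sqrt(tr * tr - 4 * det)
lam1 = (tr + disc) / 2
lam2 = (tr - disc) / 2

print(abs(lam1), abs(lam2))   # both moduli are 1 (up to rounding)
print(lam1, lam2)             # a conjugate pair: cos(theta) +/- i sin(theta)
```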

14. Aug 16, 2011

### I like Serena

Good!

(Although in a proper proof, you should mention that you are using the property of an orthogonal matrix that it leaves the norm of a vector invariant.
Of course, I already know that in this case.)

As an afterthought, you may want to distinguish the vector norm ||*|| from the absolute value |*| (when applied to lambda) here.

15. Aug 18, 2011

### fluidistic

Thanks once again. I've been so busy I can't even believe it's been 3 (well, 4 as of this minute) days since I last wrote in this thread.
Yeah, on my draft I've redone the exercise and made use of the norm of $\lambda x$ and the modulus of $\lambda$; I think I did it well.
Thanks for pointing this out though. :)