Show that +1 is an eigenvalue of an odd-dimensional rotation matrix.

Calabi_Yau

Homework Statement

The problem is to show that a rotation matrix R in an odd-dimensional vector space leaves unchanged the vectors of at least a one-dimensional subspace.

Homework Equations

This reduces to proving that 1 is an eigenvalue of the n×n matrix R when n is odd. I know that a rotation matrix has determinant 1 and that it is orthogonal, so its inverse equals its transpose.

The Attempt at a Solution

I have considered the condition det(R - I) = 0: there is some v ≠ 0 with (R - I)v = 0, i.e. Rv = v, exactly when R - I is singular, which is when det(R - I) = 0. What I'm having trouble with is proving that this only happens for odd-dimensional vector spaces. How do I "insert" the odd dimension into this problem? Could you give me a hint? I feel like I'm really close but I can't seem to figure it out. :S
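As a quick numerical sanity check of the statement itself (just an illustration, assuming NumPy is available; it tests a random odd-dimensional proper rotation rather than proving anything):

import numpy as np

rng = np.random.default_rng(0)
n = 5  # any odd dimension

# Random orthogonal matrix via QR; flip one column if needed so that
# det R = +1, making R a proper rotation.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
if np.linalg.det(Q) < 0:
    Q[:, 0] = -Q[:, 0]
R = Q

eigvals = np.linalg.eigvals(R)
print("det R =", np.linalg.det(R))                      # approximately 1
print("eigenvalues:", np.round(eigvals, 4))
print("+1 is an eigenvalue:", bool(np.any(np.isclose(eigvals, 1.0))))

Every run with odd n should report +1 among the eigenvalues (up to floating-point tolerance).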
 
Let M be your rotation matrix. You know it has determinant 1. How is the determinant related to the eigenvalues? And in general (never mind odd n), what are all the eigenvalues of an orthogonal matrix?
 
Do you know that the determinant of any matrix is the product of its eigenvalues? Do you know that the eigenvalues of an orthogonal matrix are all ones and negative ones? Do you know that the determinant of an orthogonal matrix is 1?

From the last two, it follows that an orthogonal matrix must have an even number of negative eigenvalues.
 
The determinant being the product of the eigenvalues only applies if the matrix is invertible, I guess.
The absolute value of the real eigenvalues of a rotation matrix is always 1. But there are orthogonal matrices with complex eigenvalues. Anyway, I already got it. I showed that the matrix R has only eigenvalues of 1 or -1; since its determinant is 1 and n is odd, its eigenvalues must be 1.
 
Calabi_Yau said:
The determinant being the product of the eigenvalues only applies if the matrix is invertible, I guess.
No, this is true in general for square matrices. Indeed, 0 is an eigenvalue of the matrix if and only if the matrix is singular.
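One way to see both facts at once is to evaluate the characteristic polynomial ##\det(A - \lambda I) = \prod_{i=1}^{n}(\lambda_i - \lambda)## at ##\lambda = 0##:
$$\det A = \prod_{i=1}^{n} \lambda_i,$$
so ##\det A = 0## exactly when some eigenvalue ##\lambda_i## is 0 (counting eigenvalues over ℂ with multiplicity).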
 
Calabi_Yau said:
I showed that the matrix R has only eigenvalues of 1 or -1; since its determinant is 1 and n is odd, its eigenvalues must be 1.
Perhaps you misspoke, but your conclusion as written isn't true. For example,
$$R = \begin{pmatrix} 1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & -1 \end{pmatrix}$$ has eigenvalues that aren't 1.
 
HallsofIvy said:
Do you know that the eigenvalues of an orthogonal matrix are all ones and negative ones?
That's not true, Halls. Consider ##\begin{pmatrix} \phantom{-}\cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix}##. The eigenvalues of this rather simple rotation matrix are ##\cos\theta \pm i \sin\theta##. These are one or negative one iff ##\sin\theta = 0##.
Calabi_Yau said:
But there are orthogonal matrices with complex eigenvalues.
That is correct. What does the characteristic polynomial look like? What does that mean regarding those complex eigenvalues?

Presumably you are working in ##\mathbb{R}^n##, which means all elements of the rotation matrix are real. If this is a rotation matrix for vectors in ##\mathbb{C}^n##, it is no longer necessarily true that one of the eigenvalues must be +1 for odd n.
Calabi_Yau said:
Anyway, I already got it. I showed that the matrix R has only eigenvalues of 1 or -1.
You cannot show the matrix R has only eigenvalues of 1 and -1 because this is not true.

Here's a counterexample: ##\begin{pmatrix} 1 & 0 & 0 \\ 0 & \phantom{-}\cos\theta & \sin\theta \\ 0 & -\sin\theta & \cos\theta \end{pmatrix}##. The eigenvalues are 1, cos(θ)+sin(θ)i, and cos(θ)-sin(θ)i.
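For the 2×2 block, the characteristic polynomial makes the complex pair explicit:
$$\det\begin{pmatrix} \cos\theta - \lambda & \sin\theta \\ -\sin\theta & \cos\theta - \lambda \end{pmatrix} = (\cos\theta - \lambda)^2 + \sin^2\theta = \lambda^2 - 2\lambda\cos\theta + 1,$$
with roots ##\lambda = \cos\theta \pm i\sin\theta##, each of modulus 1. The 3×3 counterexample above is just this block together with the fixed axis, which carries the eigenvalue 1.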
 
D H said:
What does the characteristic polynomial look like? What does that mean regarding those complex eigenvalues?

You cannot show the matrix R has only eigenvalues of 1 and -1 because this is not true.

This problem would be easier if all the eigenvalues were 1 or -1. Your suggestion to look at the characteristic polynomial is important.
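Putting the hints together, here is a sketch of the intended argument (for a real rotation matrix R with ##R^T R = I##, ##\det R = 1##, and n odd). The characteristic polynomial ##\det(R - \lambda I)## has real coefficients, so its non-real roots come in conjugate pairs ##\lambda, \bar{\lambda}##, and every eigenvalue of an orthogonal matrix has modulus 1, so each pair contributes ##\lambda\bar{\lambda} = |\lambda|^2 = 1## to the determinant. The real eigenvalues can only be ##\pm 1##. Hence
$$1 = \det R = \prod_i \lambda_i = \Big(\prod_{\text{pairs}} |\lambda|^2\Big)\Big(\prod_{\text{real }\lambda_j} \lambda_j\Big) = \prod_{\text{real }\lambda_j} \lambda_j,$$
so the real eigenvalues multiply to 1, meaning there is an even number of eigenvalues equal to -1. Since n is odd and the non-real eigenvalues come in pairs, the number of real eigenvalues is odd; an odd collection of ##\pm 1##'s with an even number of ##-1##'s must contain at least one ##+1##, which gives a nonzero v with Rv = v.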
 