# Show that +1 is an eigenvalue of an odd-dimensional rotation matrix.

1. Dec 4, 2013

### Calabi_Yau

1. The problem statement, all variables and given/known data

The problem is to show that a rotation matrix R, in an odd-dimensional vector space, leaves unchanged the vectors of at least a one-dimensional subspace.

2. Relevant equations

This reduces to proving that 1 is an eigenvalue of the $n \times n$ matrix R when n is odd. I know that a rotation matrix has determinant 1 and that it is orthogonal, so its inverse equals its transpose.

3. The attempt at a solution
I have considered det(R - I) = 0: if there is some v ≠ 0 such that (R - I)v = 0, then R - I is singular and det(R - I) = 0, so it suffices to show that R - I is singular. Now, what I'm having trouble with is proving that this only happens for odd-dimensional vector spaces. How do I "insert" the odd dimension into this problem? Could you give me a hint? I feel like I'm really close but I can't seem to figure it out. :S
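As a quick sanity check on this attempt (a sketch of mine, not part of the original post; the helper names are my own), pure Python confirms that $\det(R - I) = 0$ for a 3×3 rotation about the z-axis:

```python
import math

def det3(m):
    # Determinant of a 3x3 matrix by cofactor expansion along the first row.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def rotation_z(theta):
    # 3x3 rotation by angle theta about the z-axis.
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

theta = 0.7  # any angle works
R = rotation_z(theta)
RmI = [[R[i][j] - (1.0 if i == j else 0.0) for j in range(3)] for i in range(3)]

print(det3(R))    # ~1: R is a rotation
print(det3(RmI))  # 0: R - I is singular, so 1 is an eigenvalue of R
```

Here the third row of R - I is identically zero, so the determinant vanishes exactly; the interesting part of the problem is showing this happens for *every* rotation in odd dimension, not just this axis-aligned one.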

2. Dec 4, 2013

### brmath

Let M be your rotation matrix. You know it has determinant 1. How is the determinant related to the eigenvalues? And in general (never mind odd n), what are all the eigenvalues of an orthogonal matrix?

3. Dec 5, 2013

### HallsofIvy

Staff Emeritus
Do you know that the determinant of any matrix is the product of its eigenvalues? Do you know that the eigenvalues of an orthogonal matrix are all ones and negative ones? Do you know that the determinant of an orthogonal matrix is 1?

From the last two, it follows that an orthogonal matrix must have an even number of negative eigenvalues.

4. Dec 5, 2013

### Calabi_Yau

The determinant being the product of the eigenvalues only applies if the matrix is invertible, I guess.
The absolute value of the real eigenvalues of a rotation matrix is always 1, but there are orthogonal matrices with complex eigenvalues. But I already got it, by showing that the matrix R has only eigenvalues of 1 or -1. Since its determinant is 1 and n is odd, its eigenvalues must be 1.

5. Dec 5, 2013

### jbunniii

No, this is true in general for square matrices. Indeed, 0 is an eigenvalue of a matrix if and only if the matrix is singular.

6. Dec 5, 2013

### vela

Staff Emeritus
Perhaps you misspoke, but your conclusion as written isn't true. For example,
$$R = \begin{pmatrix} 1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & -1 \end{pmatrix}$$ has eigenvalues that aren't 1.

7. Dec 5, 2013

### D H

Staff Emeritus
That's not true, Halls. Consider $\begin{pmatrix} \phantom{-}\cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix}$. The eigenvalues of this rather simple rotation matrix are $\cos\theta \pm i \sin\theta$. These are one or negative one iff $\sin\theta = 0$.

That is correct. What does the characteristic polynomial look like? What does that mean regarding those complex eigenvalues?
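One way these hints play out (a sketch, not spelled out in the thread): since R is real, the characteristic polynomial $p(\lambda) = \det(R - \lambda I)$ has real coefficients, so its non-real roots come in conjugate pairs $\lambda, \bar{\lambda}$ with $\lambda\bar{\lambda} = |\lambda|^2 = 1$, while every real root has absolute value 1 and is therefore $\pm 1$. Writing $a$ for the number of roots equal to $+1$ and $b$ for the number equal to $-1$,
$$1 = \det R = (+1)^a \, (-1)^b \prod_{\text{pairs}} |\lambda|^2 = (-1)^b,$$
so $b$ is even. The conjugate pairs contribute an even count of roots, so $a + b$ has the same parity as $n$; with $n$ odd, $a + b$ is odd, and since $b$ is even, $a \geq 1$. Hence $+1$ is an eigenvalue.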

Presumably you are working with $\mathbb{R}^n$, which means all elements of the rotation matrix are real. If this is a rotation matrix for vectors in $\mathbb{C}^n$, it is no longer necessarily true that one of the eigenvalues must be +1 for odd n.

You cannot show the matrix R has only eigenvalues of 1 and -1 because this is not true.

Here's a counterexample: $\begin{pmatrix} 1 & 0 & 0 \\ 0 & \phantom{-}\cos\theta & \sin\theta \\ 0 & -\sin\theta & \cos\theta \end{pmatrix}$. The eigenvalues are 1, cos(θ)+sin(θ)i, and cos(θ)-sin(θ)i.
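Those three eigenvalues can be verified directly with complex arithmetic (a sketch of mine; the eigenvectors below are my own hand-checked choices, not from the thread):

```python
import math

theta = 0.7
c, s = math.cos(theta), math.sin(theta)

# D H's 3x3 counterexample: fixes the first axis, rotates the other two.
R = [[1,  0, 0],
     [0,  c, s],
     [0, -s, c]]

def matvec(M, v):
    # Matrix-vector product over the complex numbers.
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

# Claimed eigenvalues 1, cos(t)+i sin(t), cos(t)-i sin(t), paired with
# eigenvectors chosen by hand for this block structure.
pairs = [
    (1.0,            [1, 0, 0]),
    (complex(c, s),  [0, 1, 1j]),
    (complex(c, -s), [0, 1, -1j]),
]
for lam, v in pairs:
    Rv = matvec(R, v)
    err = max(abs(Rv[i] - lam * v[i]) for i in range(3))
    print(lam, err)  # err ~ 0: R v = lambda v holds for each pair
```

Note the non-real eigenvalues are a conjugate pair of modulus 1, consistent with the characteristic-polynomial argument hinted at above.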

Last edited: Dec 5, 2013
8. Dec 5, 2013

### brmath

This problem would be easier if all the eigenvalues were 1 or -1. Your suggestion to look at the characteristic polynomial is important.
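To close the loop numerically (a sketch of mine, not from the thread): build a generic rotation in SO(5) as a product of Givens plane rotations with random angles, then check that $\det(R - I) \approx 0$, i.e. that $+1$ really is an eigenvalue in odd dimension.

```python
import math
import random

N = 5  # odd dimension

def identity(n):
    return [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

def givens(n, p, q, theta):
    # Rotation by theta in the (p, q) coordinate plane; determinant +1.
    G = identity(n)
    c, s = math.cos(theta), math.sin(theta)
    G[p][p] = c; G[q][q] = c
    G[p][q] = -s; G[q][p] = s
    return G

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def det(M):
    # Determinant by Gaussian elimination with partial pivoting.
    n = len(M)
    A = [row[:] for row in M]
    d = 1.0
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        if abs(A[piv][col]) < 1e-14:
            return 0.0  # numerically singular
        if piv != col:
            A[col], A[piv] = A[piv], A[col]
            d = -d
        d *= A[col][col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for cc in range(col, n):
                A[r][cc] -= f * A[col][cc]
    return d

random.seed(0)
# A generic rotation: product of Givens rotations over every coordinate plane.
R = identity(N)
for p in range(N):
    for q in range(p + 1, N):
        R = matmul(R, givens(N, p, q, random.uniform(0, 2 * math.pi)))

RmI = [[R[i][j] - (1.0 if i == j else 0.0) for j in range(N)] for i in range(N)]
print(det(R))    # ~1: R is a rotation
print(det(RmI))  # ~0: R - I is singular, so +1 is an eigenvalue
```

Repeating this with any seed, or any odd N, gives the same conclusion, while even N generically does not, which matches the parity argument from the characteristic polynomial.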