# Question about decomposition of matrices in SL(2,R)

1. Jul 5, 2016

### mnb96

Hello,

we are given a 2×2 matrix $S$ such that $\det(S)=1$.
I would like to find a 2×2 invertible matrix $A$ such that $A S A^{-1} = R$, where $R$ is an orthogonal matrix.

Note that the problem can be alternatively reformulated as: is it possible to decompose a matrix $S \in SL(2,\mathbb{R})$ in the following way: $$S=A^{-1}R A,$$ where $R$ is orthogonal and $A$ is invertible?

Is this a well-known problem? To be honest, I don't have many ideas on how to tackle this problem, so even a suggestion that could get me on the right track would be very welcome.

2. Jul 5, 2016

### micromass

Staff Emeritus
Every orthogonal matrix can be diagonalized. Not every matrix in $SL(2,\mathbb{R})$ can be diagonalized. So what you ask is impossible.

3. Jul 5, 2016

### Staff: Mentor

You may count the number of degrees of freedom you have in either matrix.

4. Jul 5, 2016

### mnb96

Why did you mention the diagonalizability of a matrix?
I am not asking if S is similar to a diagonal matrix. I am rather asking if S is similar to an orthogonal matrix.

5. Jul 5, 2016

### Staff: Mentor

$R$ orthogonal ⇒ $R$ diagonalizable ⇒ (if $R$ similar to $S$) $S$ diagonalizable
$S = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \in SL(2,\mathbb{R})$ not diagonalizable, contradiction
⇒ $S$ not similar to $R$
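A quick numerical sketch of this argument (not part of the thread; plain NumPy): the shear matrix has the single eigenvalue 1, but its eigenspace is only one-dimensional, so it cannot be diagonalized.

```python
import numpy as np

# The shear matrix from the post: in SL(2,R), but not diagonalizable.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigenvalues = np.linalg.eigvals(S)
print(eigenvalues)  # both eigenvalues equal 1

# S - I has rank 1, so the eigenspace of the eigenvalue 1 is only
# one-dimensional: S has no basis of eigenvectors.
print(np.linalg.matrix_rank(S - np.eye(2)))  # 1
```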

6. Jul 6, 2016

### mnb96

I thought orthogonal matrices were not diagonalizable over ℝ in general.

Btw, I actually found a counterexample to your (micromass' and fresh's) statements above: consider the matrix $S = \begin{pmatrix} 1 & 2\\ -1 & -1 \end{pmatrix}$.
We can easily see that $det(S)=1$, and we can still decompose that matrix into: $$S = \underbrace{\begin{pmatrix} -1 & -1\\ 0 & 1 \end{pmatrix} }_{A^{-1}}\; \underbrace{\begin{pmatrix} 0 & -1\\ 1 & 0 \end{pmatrix}}_R\; \underbrace{\begin{pmatrix} -1 & -1\\ 0 & 1 \end{pmatrix}}_A$$

where A is invertible and R is orthogonal, as required. This proves that there exist matrices in SL(2,ℝ) that are similar to orthogonal matrices. Note also that S is not diagonalizable over ℝ.
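The decomposition can be checked numerically (a NumPy sketch, not from the original post):

```python
import numpy as np

# The counterexample: S in SL(2,R) written as A^{-1} R A with R a rotation.
S = np.array([[ 1.0,  2.0],
              [-1.0, -1.0]])
A = np.array([[-1.0, -1.0],
              [ 0.0,  1.0]])
R = np.array([[ 0.0, -1.0],
              [ 1.0,  0.0]])   # rotation by 90 degrees, hence orthogonal

print(np.isclose(np.linalg.det(S), 1.0))         # True: S is in SL(2,R)
print(np.allclose(R.T @ R, np.eye(2)))           # True: R is orthogonal
print(np.allclose(np.linalg.inv(A) @ R @ A, S))  # True: S = A^{-1} R A
```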

Last edited: Jul 6, 2016
7. Jul 6, 2016

### micromass

Staff Emeritus
They are over $\mathbb{C}$.

$S$ is diagonalizable over $\mathbb{C}$. And of course there exist matrices in $SL(2,\mathbb{R})$ similar to orthogonal matrices. The identity matrix would be an example of this. The point is that not all matrices in $SL(2,\mathbb{R})$ are similar to orthogonal matrices.

8. Jul 6, 2016

### micromass

Staff Emeritus
Furthermore, what you ask is possible exactly for those diagonalizable matrices in $SL(2,\mathbb{R})$ whose eigenvalues satisfy $|\lambda|=1$ (the eigenvalues may be complex).
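For the counterexample above this condition is easy to check numerically (a sketch, assuming NumPy):

```python
import numpy as np

# Eigenvalues of the counterexample: a conjugate pair on the unit circle.
S = np.array([[ 1.0,  2.0],
              [-1.0, -1.0]])

eigenvalues = np.linalg.eigvals(S)
print(eigenvalues)          # the pair +i, -i
print(np.abs(eigenvalues))  # both have modulus 1
```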

9. Jul 7, 2016

### mnb96

Thanks micromass,

I think I understand your remarks, but I still have a doubt.

Given a matrix $\mathrm{S} \in SL(2,\mathbb{R})$, and assuming there exists a 2×2 real matrix $A$ such that $A^{-1}SA = R$, we can deduce that $S$ must be diagonalizable (over ℂ), since every rotation matrix $R$ is. That means that we can write: $$(AC)^{-1}\,S\,(AC) = \Lambda$$ where we used the diagonalization $R=C\,\Lambda\,C^{-1}$ (note that both $C$ and $\Lambda$ are in general complex).

Now, starting from the knowledge of $S$ alone, we could perform an eigendecomposition of $S$ and we would obtain $S=Q\Lambda Q^{-1}$, where $Q=AC$. But then, how do we extract the real matrix $A$ from the complex matrix $Q$?

For instance, in the example I gave above where $S = \begin{pmatrix} 1 & 2\\ -1 & -1 \end{pmatrix}$ the eigendecomposition of S would give $S=Q \Lambda Q^{-1}$ where: $$Q= \frac{\sqrt{6}}{6}\begin{pmatrix} 2 & 2\\ -1+i & -1-i \end{pmatrix}$$
$$\Lambda= \begin{pmatrix} i & 0\\ 0 & -i \end{pmatrix}$$

How do we extract $A= \begin{pmatrix} -1 & -1\\ 0 & 1 \end{pmatrix}$ from the complex matrix Q ?
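The stated eigendecomposition itself can be verified numerically (a NumPy sketch, not from the thread):

```python
import numpy as np

# Check that S = Q Lambda Q^{-1} with the Q and Lambda quoted above.
S = np.array([[ 1.0,  2.0],
              [-1.0, -1.0]])
Q = (np.sqrt(6) / 6) * np.array([[ 2.0,         2.0       ],
                                 [-1.0 + 1.0j, -1.0 - 1.0j]])
Lam = np.diag([1.0j, -1.0j])

print(np.allclose(Q @ Lam @ np.linalg.inv(Q), S))  # True
```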

Last edited: Jul 7, 2016
10. Jul 12, 2016

### mnb96

Ok, I think this question has been almost answered.

Summarizing, given a matrix $S\in SL(2,\mathbb{R})$ we ask whether there exist an invertible matrix $A\in GL(2,\mathbb{R})$ and a rotation matrix $R\in SO(2,\mathbb{R})$ such that $A^{-1}SA=R$.

Since $R$ is diagonalizable over ℂ (i.e. $R=C\Lambda C^{-1}$), we have that:

$$S=(AC)\Lambda (C^{-1}A^{-1}) = Q\Lambda Q^{-1} \qquad\qquad\qquad \mathrm{(1)}$$
From the above equation we can deduce that $S$ must be diagonalizable and must have eigenvalues of modulus 1. These are necessary conditions. They would also be sufficient conditions if we add the requirement that the eigenvectors of $S$ must have the same real part.

Since $Q=AC$ we can find a matrix $A=QC^{-1}$, where the columns of $C$ contain the eigenvectors of $R$, for instance $C=\begin{pmatrix} 1 & 1\\ i & -i \end{pmatrix}$.
It can be verified that when the eigenvectors of $S$ have the same real part, then the matrix $A$ is real.

However, the matrix $A$ is not unique. For example, we can easily see that if we replace $A$ with $MA$, where $M$ is an invertible matrix that commutes with $S$ (i.e. $MS=SM$), Equation (1) still holds.
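The extraction $A=QC^{-1}$ can be carried out numerically as well (a NumPy sketch; note that the recovered $A$ need not equal the $A$ used in post #6, since $A$ is not unique):

```python
import numpy as np

# Recover a real A from the complex eigenvector matrix Q via A = Q C^{-1}.
S = np.array([[ 1.0,  2.0],
              [-1.0, -1.0]])
Q = (np.sqrt(6) / 6) * np.array([[ 2.0,         2.0       ],
                                 [-1.0 + 1.0j, -1.0 - 1.0j]])
C = np.array([[1.0,   1.0 ],
              [1.0j, -1.0j]])   # columns: eigenvectors of a rotation matrix

A = Q @ np.linalg.inv(C)
print(np.allclose(A.imag, 0.0))           # True: A comes out real, as claimed
A = A.real

R = np.linalg.inv(A) @ S @ A              # conjugating S yields a rotation
print(np.allclose(R.T @ R, np.eye(2)))    # True: R is orthogonal
print(np.isclose(np.linalg.det(R), 1.0))  # True: R is a rotation
```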