Undergrad Question about decomposition of matrices in SL(2,R)

Summary
The discussion centers on the decomposition of a 2×2 matrix S in SL(2,ℝ) into the form ##A^{-1}RA##, where R is an orthogonal matrix and A is invertible. It is established that not all matrices in SL(2,ℝ) are similar to orthogonal matrices, in particular those that are not diagonalizable over the reals. A counterexample to the claimed impossibility is provided with the matrix S = (1 2; -1 -1), which can be decomposed as required, demonstrating that some matrices in SL(2,ℝ) can indeed be similar to orthogonal matrices. The conversation also touches on the conditions under which such decompositions are possible, emphasizing the need for eigenvalues of modulus 1 and the alignment of the eigenvectors' real parts. The uniqueness of the matrix A is also discussed, noting that it can be multiplied on the left by any invertible matrix that commutes with S.
mnb96
Hello,

we are given a 2×2 matrix S such that det(S)=1.
I would like to find a 2×2 invertible matrix A such that ##A S A^{-1} = R##, where R is an orthogonal matrix.

Note that the problem can alternatively be reformulated as: is it possible to decompose a matrix ##S \in SL(2,\mathbb{R})## in the following way: ##S = A^{-1}RA##, where R is orthogonal and A is invertible?

Is this a well-known problem? To be honest, I don't have many ideas on how to tackle this problem, so even a suggestion that could get me on the right track would be very welcome.
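
As a quick sanity check of the setup, here is a minimal numerical sketch (assuming NumPy): any matrix built as ##A^{-1}RA## with R a rotation automatically lies in SL(2,ℝ), since similarity preserves the determinant. The particular A and angle below are arbitrary illustrative choices, not from the thread.

```python
import numpy as np

# Minimal sketch of the setup: pick an arbitrary invertible A and a
# rotation R, form S = A^{-1} R A, and confirm that det(S) = 1 and
# that conjugating S by A recovers R.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])                       # det(A) = 1, invertible
theta = 0.7                                      # arbitrary angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # orthogonal, det = +1

S = np.linalg.inv(A) @ R @ A
print(np.isclose(np.linalg.det(S), 1.0))         # True: S is in SL(2,R)
print(np.allclose(A @ S @ np.linalg.inv(A), R))  # True: A S A^{-1} = R
```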
 
Every orthogonal matrix can be diagonalized (over ##\mathbb{C}##). Not every matrix in ##SL(2,\mathbb{R})## can be diagonalized. So what you ask is impossible in general.
 
You may also count the number of degrees of freedom you have in either matrix.
 
Why did you mention the diagonalizability of a matrix?
I am not asking if S is similar to a diagonal matrix. I am rather asking if S is similar to an orthogonal matrix.
 
##R## orthogonal ⇒ ##R## diagonalizable ⇒ (if ##R## is similar to ##S##) ##S## diagonalizable
##S = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \in SL(2,\mathbb{R})## is not diagonalizable, contradiction
⇒ ##S## not similar to ##R##
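
A quick symbolic check of fresh_42's counterexample (a sketch assuming SymPy):

```python
import sympy as sp

# The shear matrix has det = 1, a single eigenvalue 1, and only a
# one-dimensional eigenspace, so it is not diagonalizable (even over C)
# and hence cannot be similar to any orthogonal matrix.
S = sp.Matrix([[1, 1],
               [0, 1]])
print(S.det())                  # 1: S lies in SL(2,R)
print(S.eigenvects())           # a single eigenvector for eigenvalue 1
print(S.is_diagonalizable())    # False
```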
 
fresh_42 said:
##R## orthogonal ⇒ ##R## diagonalizable
I thought orthogonal matrices were not diagonalizable over ℝ in general.

Btw, I actually found a counterexample of your (micromass' and fresh's) above statements: consider the matrix ##S = \begin{pmatrix} 1 & 2 \\ -1 & -1 \end{pmatrix}##.
We can easily see that det(S)=1, and we can still decompose that matrix into:

$$S = \underbrace{\begin{pmatrix} -1 & -1 \\ 0 & 1 \end{pmatrix}}_{A^{-1}}\;\underbrace{\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}}_{R}\;\underbrace{\begin{pmatrix} -1 & -1 \\ 0 & 1 \end{pmatrix}}_{A}$$

where A is invertible and R is orthogonal, as required. This proves that there exist matrices in SL(2,ℝ) that are similar to orthogonal matrices. Note also that S is not diagonalizable over ℝ.
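
The claimed decomposition is easy to check numerically (a sketch assuming NumPy):

```python
import numpy as np

# Verify the decomposition above.  Note that this particular A happens
# to be its own inverse, so A^{-1} = A here.
A = np.array([[-1.0, -1.0],
              [ 0.0,  1.0]])
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])                 # rotation by 90 degrees
S = np.linalg.inv(A) @ R @ A

print(S)                                    # [[ 1.  2.] [-1. -1.]]
print(np.isclose(np.linalg.det(S), 1.0))    # True: S is in SL(2,R)
print(np.allclose(R.T @ R, np.eye(2)))      # True: R is orthogonal
```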
 
mnb96 said:
I thought orthogonal matrices were not diagonalizable over ℝ in general.

They are over ##\mathbb{C}##.

mnb96 said:
Btw, I actually found a counterexample of your (micromass' and fresh's) above statements [...] Note also that S is not diagonalizable over ℝ.

##S## is diagonalizable over ##\mathbb{C}##. And of course there exist matrices in ##SL(2,\mathbb{R})## similar to orthogonal matrices. The identity matrix would be an example of this. The point is that not all matrices in ##SL(2,\mathbb{R})## are similar to orthogonal matrices.
 
Furthermore, what you ask is possible exactly for those matrices in ##SL(2,\mathbb{R})## that are diagonalizable and whose eigenvalues (which may be complex) have modulus ##|\lambda|=1##.
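
A rough numerical version of this test might look as follows (a sketch assuming NumPy; the function name is mine, and the rank test for diagonalizability is numerically fragile and only illustrative):

```python
import numpy as np

# Check micromass' criterion: every eigenvalue of S must have modulus 1,
# and S must be diagonalizable (the eigenvector matrix has full rank).
def similar_to_orthogonal(S, tol=1e-9):
    eigvals, eigvecs = np.linalg.eig(S)
    unit_modulus = np.all(np.abs(np.abs(eigvals) - 1.0) < tol)
    diagonalizable = np.linalg.matrix_rank(eigvecs, tol=tol) == S.shape[0]
    return unit_modulus and diagonalizable

print(similar_to_orthogonal(np.array([[1.0, 2.0], [-1.0, -1.0]])))  # True
print(similar_to_orthogonal(np.array([[1.0, 1.0], [ 0.0,  1.0]])))  # False: shear, not diagonalizable
print(similar_to_orthogonal(np.array([[2.0, 0.0], [ 0.0,  0.5]])))  # False: |lambda| != 1
```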
 
Thanks micromass,

I think I understand your remarks, but I still have a doubt.

Given a matrix ##S \in SL(2,\mathbb{R})##, and assuming there exists a 2×2 real matrix A such that ##A^{-1}SA = R##, we can deduce that S must be diagonalizable (over ℂ), since every rotation matrix R is. That means that we can write ##(AC)^{-1}\,S\,(AC) = \Lambda##, where we used the diagonalization ##R = C\,\Lambda\,C^{-1}## (note that both C and Λ are in general complex).

Now, starting from the sole knowledge of S, we could perform an eigendecomposition of S and we would obtain ##S = Q\Lambda Q^{-1}##, where ##Q = AC##. But then, how do we extract the real matrix A from the complex matrix Q?

For instance, in the example I gave above, where ##S = \begin{pmatrix} 1 & 2 \\ -1 & -1 \end{pmatrix}##, the eigendecomposition of S would give ##S = Q \Lambda Q^{-1}## where:

$$Q = \frac{\sqrt{6}}{6}\begin{pmatrix} 2 & 2 \\ -1+i & -1-i \end{pmatrix}, \qquad \Lambda = \begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix}$$

How do we extract ##A = \begin{pmatrix} -1 & -1 \\ 0 & 1 \end{pmatrix}## from the complex matrix Q?
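
For reference, here is the same eigendecomposition computed numerically (a sketch assuming NumPy). Note that eig only determines Q up to scaling and ordering of its columns, which is the crux of the question:

```python
import numpy as np

# Reproduce the eigendecomposition quoted above: numpy returns some
# complex eigenvector matrix Q with S = Q diag(lam) Q^{-1}, but Q is
# only determined up to column scaling/ordering, which is exactly why
# extracting a real A from it is not immediate.
S = np.array([[ 1.0,  2.0],
              [-1.0, -1.0]])
lam, Q = np.linalg.eig(S)
print(lam)                                                   # eigenvalues +-i (in some order)
print(np.allclose(Q @ np.diag(lam) @ np.linalg.inv(Q), S))   # True
```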
 
Ok, I think this question has been almost answered.

Summarizing, given a matrix ##S \in SL(2,\mathbb{R})##, we ask whether there exist an invertible matrix ##A \in GL(2,\mathbb{R})## and a rotation matrix ##R \in SO(2,\mathbb{R})## such that ##A^{-1}SA = R##.

Since R is diagonalizable over ℂ (i.e. ##R = C\Lambda C^{-1}##), we have that:

$$S = (AC)\,\Lambda\,(C^{-1}A^{-1}) = Q\Lambda Q^{-1} \qquad\qquad \mathrm{(1)}$$
From the above equation we can deduce that S must be diagonalizable and must have eigenvalues of modulus 1. These are necessary conditions. They would also be sufficient if we add the requirement that the eigenvectors of S have the same real part (i.e. that they form a complex-conjugate pair).

Since ##Q = AC##, we can find a matrix ##A = QC^{-1}##, where the columns of C contain the eigenvectors of R, for instance ##C = \begin{pmatrix} 1 & 1 \\ i & -i \end{pmatrix}##.
It can be verified that when the eigenvectors of S have the same real part, the matrix A is real.

However, the matrix A is not unique. For example, we can easily see that replacing A by MA, where M is any invertible matrix that commutes with S (i.e. MS = SM), leaves Equation (1) intact.
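
Putting the recipe together (a sketch assuming NumPy; it relies on the fact that, for a real matrix with a complex eigenvalue pair, eig returns the two eigenvectors as exact conjugates, which is precisely the "same real part" situation above):

```python
import numpy as np

# End-to-end sketch of the recipe: eigendecompose S, then form
# A = Q C^{-1} with C the eigenvector matrix of a rotation.  Because
# numpy returns the eigenvectors of a real S as a conjugate pair
# (v, conj(v)), A comes out real, and A^{-1} S A is orthogonal.
S = np.array([[ 1.0,  2.0],
              [-1.0, -1.0]])

lam, Q = np.linalg.eig(S)          # columns of Q are eigenvectors of S
C = np.array([[1,   1],
              [1j, -1j]])          # eigenvectors of a rotation, as above
A = Q @ np.linalg.inv(C)

print(np.allclose(A.imag, 0))      # True: A is real in the conjugate-pair case
A = A.real
R = np.linalg.inv(A) @ S @ A
print(np.allclose(R.T @ R, np.eye(2)))   # True: R is orthogonal
```

As noted above, this A need not match the A of post #5; the two differ by an invertible factor commuting with S.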
 
