Question about decomposition of matrices in SL(2,R)

  • Context: Undergrad
  • Thread starter: mnb96
  • Tags: Decomposition, Matrices

Discussion Overview

The discussion revolves around the decomposition of a 2x2 matrix \( S \) in the special linear group \( SL(2, \mathbb{R}) \) into the form \( A^{-1} R A \), where \( R \) is an orthogonal matrix and \( A \) is an invertible matrix. The participants explore the conditions under which such a decomposition is possible, touching on concepts of diagonalizability and eigenvalues.

Discussion Character

  • Debate/contested
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant questions whether it is possible to decompose \( S \) into the form \( A^{-1} R A \) where \( R \) is orthogonal, suggesting a lack of ideas on how to approach the problem.
  • Another participant asserts that since not every matrix in \( SL(2, \mathbb{R}) \) can be diagonalized, the proposed decomposition is impossible.
  • Some participants argue that the diagonalizability of \( R \) implies that if \( R \) is similar to \( S \), then \( S \) must also be diagonalizable, leading to a contradiction with certain examples of \( S \) that are not diagonalizable.
  • A counterexample is provided where a specific matrix \( S \) is shown to be decomposable into the required form, challenging the earlier claims about the impossibility of such decompositions.
  • There is a discussion about the conditions under which matrices in \( SL(2, \mathbb{R}) \) can be similar to orthogonal matrices, with some participants noting that matrices with eigenvalues of modulus 1 may satisfy the conditions for decomposition.
  • One participant raises a question about extracting the real matrix \( A \) from a complex eigendecomposition of \( S \), indicating a need for clarity on this process.
  • Another participant summarizes the discussion, stating that necessary conditions for the decomposition include diagonalizability and eigenvalues with modulus 1, while also noting that the matrix \( A \) is not unique.

Areas of Agreement / Disagreement

Participants express differing views on the possibility of decomposing matrices in \( SL(2, \mathbb{R}) \) into the form involving orthogonal matrices. Some assert that it is impossible under certain conditions, while others provide counterexamples and argue that it is indeed possible for specific cases. The discussion remains unresolved regarding the general applicability of the decomposition.

Contextual Notes

There are limitations regarding the assumptions about diagonalizability and the nature of eigenvalues that are not fully resolved. The discussion also highlights the complexity of extracting real matrices from complex eigendecompositions, which remains a point of uncertainty.

mnb96
Hello,

we are given a 2×2 matrix S such that det(S)=1.
I would like to find a 2×2 invertible matrix A such that A S A^{-1} = R, where R is an orthogonal matrix.

Note that the problem can be alternatively reformulated as: Is it possible to decompose a matrix ##S \in SL(2,\mathbb{R})## in the following way: ##S=A^{-1}R A##, where R is orthogonal and A is invertible?

Is this a well-known problem? To be honest, I don't have many ideas on how to tackle this problem, so even a suggestion that could get me on the right track would be very welcome.
 
Every orthogonal matrix can be diagonalized. Not every matrix in ##SL(2,\mathbb{R})## can be diagonalized. So what you ask is impossible.
 
You may count the numbers of degrees of freedom you have in either matrix.
 
Why did you mention the diagonalizability of a matrix?
I am not asking if S is similar to a diagonal matrix. I am rather asking if S is similar to an orthogonal matrix.
 
##R## orthogonal ⇒ ##R## diagonalizable ⇒ (if ##R## similar to ##S##) ##S## diagonalizable
##S = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \in SL(2,\mathbb{R})## not diagonalizable, contradiction
⇒ ##S## not similar to ##R##
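That the shear matrix above is not diagonalizable is easy to confirm numerically: both eigenvalues equal 1, but the eigenspace for 1 is only one-dimensional. A minimal NumPy sketch (my own check, not part of the thread):

```python
import numpy as np

# the non-diagonalizable matrix from the argument above
shear = np.array([[1., 1.], [0., 1.]])

# characteristic polynomial is (lambda - 1)^2, so both eigenvalues are 1
eigvals = np.linalg.eigvals(shear)

# the eigenspace ker(shear - I) is one-dimensional, so there is no
# basis of eigenvectors: shear is defective (not diagonalizable)
eigenspace_dim = 2 - np.linalg.matrix_rank(shear - np.eye(2))
assert np.allclose(eigvals, 1.0)
assert eigenspace_dim == 1
```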
 
fresh_42 said:
##R## orthogonal ⇒ ##R## diagonalizable
I thought orthogonal matrices were not diagonalizable over ℝ in general.

Btw, I actually found a counterexample of your (micromass' and fresh's) above statements: consider the matrix $$S = \begin{pmatrix} 1 & 2\\ -1 & -1 \end{pmatrix}.$$
We can easily see that det(S)=1, and we can still decompose that matrix into: $$S = \underbrace{\begin{pmatrix} -1 & -1\\ 0 & 1 \end{pmatrix}}_{A^{-1}}\; \underbrace{\begin{pmatrix} 0 & -1\\ 1 & 0 \end{pmatrix}}_{R}\; \underbrace{\begin{pmatrix} -1 & -1\\ 0 & 1 \end{pmatrix}}_{A}$$

where A is invertible and R is orthogonal, as required. This proves that there exist matrices in SL(2,ℝ) that are similar to orthogonal matrices. Note also that S is not diagonalizable over ℝ.
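The counterexample can be verified numerically; a quick NumPy check (my own sketch, not part of the original post):

```python
import numpy as np

# matrices from the counterexample above
S = np.array([[1., 2.], [-1., -1.]])
A = np.array([[-1., -1.], [0., 1.]])
R = np.array([[0., -1.], [1., 0.]])

assert np.isclose(np.linalg.det(S), 1.0)          # S is in SL(2, R)
assert np.allclose(R @ R.T, np.eye(2))            # R is orthogonal
assert np.allclose(np.linalg.inv(A) @ R @ A, S)   # S = A^{-1} R A
```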
 
mnb96 said:
I thought orthogonal matrices were not diagonalizable over ℝ in general.

They are over ##\mathbb{C}##.

mnb96 said:
Btw, I actually found a counterexample of your (micromass' and fresh's) above statements: [...] Note also that S is not diagonalizable over ℝ.

##S## is diagonalizable over ##\mathbb{C}##. And of course there exist matrices in ##SL(2,\mathbb{R})## similar to orthogonal matrices. The identity matrix would be an example of this. The point is that not all matrices in ##SL(2,\mathbb{R})## are similar to orthogonal matrices.
 
Furthermore, what you ask is possible exactly for those matrices in ##SL(2,\mathbb{R})## whose eigenvalues have ##|\lambda|=1## (and could be complex).
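For the counterexample matrix above, the eigenvalue condition is easy to check (a NumPy sketch; the variable names are mine):

```python
import numpy as np

S = np.array([[1., 2.], [-1., -1.]])

# characteristic polynomial is lambda^2 + 1, so the eigenvalues are +-i:
# a complex conjugate pair, each of modulus 1
eigvals = np.linalg.eigvals(S)
assert np.allclose(np.abs(eigvals), 1.0)
```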
 
Thanks micromass,

I think I understand your remarks, but I still have a doubt.

Given a matrix ##S \in SL(2,\mathbb{R})##, and assuming there exists a 2×2 real matrix A such that ##A^{-1}SA = R##, we can deduce that S must be diagonalizable (over ℂ), since every rotation matrix R is. That means that we can write: $$(AC)^{-1}\,S\,(AC) = \Lambda$$ where we used the diagonalization ##R=C\,\Lambda\,C^{-1}## (note that both C and Λ are in general complex).

Now, starting from the sole knowledge of S we could perform an eigendecomposition of S and we would obtain ##S=Q\,\Lambda\,Q^{-1}##, where ##Q=AC##. But then, how do we extract the real matrix A from the complex matrix Q?

For instance, in the example I gave above where $$S = \begin{pmatrix} 1 & 2\\ -1 & -1 \end{pmatrix}$$ the eigendecomposition of S would give ##S=Q \Lambda Q^{-1}## where: $$Q = \frac{\sqrt{6}}{6}\begin{pmatrix} 2 & 2\\ -1+i & -1-i \end{pmatrix}, \qquad \Lambda = \begin{pmatrix} i & 0\\ 0 & -i \end{pmatrix}$$

How do we extract $$A = \begin{pmatrix} -1 & -1\\ 0 & 1 \end{pmatrix}$$ from the complex matrix Q?
 
Ok, I think this question has been almost answered.

Summarizing, given a matrix ##S\in SL(2,\mathbb{R})## we ask if there exist an invertible matrix ##A\in GL(2,\mathbb{R})## and a rotation matrix ##R\in SO(2,\mathbb{R})## such that ##A^{-1}SA=R##.

Since R is diagonalizable over ℂ (i.e. ##R=C\Lambda C^{-1}##), we have that:

$$S=(AC)\Lambda (C^{-1}A^{-1}) = Q\Lambda Q^{-1} \qquad\qquad\qquad \mathrm{(1)}$$

From the above equation we can deduce that S must be diagonalizable and must have eigenvalues of modulus 1. These are necessary conditions. They would also be sufficient if we add the requirement that the eigenvectors of S must have the same real part.

Since ##Q=AC## we can find a matrix ##A=QC^{-1}##, where the columns of C contain the eigenvectors of R, for instance $$C=\begin{pmatrix} 1 & 1\\ i & -i \end{pmatrix}.$$
It can be verified that when the eigenvectors of S have the same real part, then the matrix A is real.

However, the matrix A is not unique. For example, we can easily see that by multiplying A with an invertible matrix M that commutes with S (i.e. ##MS=SM##), Equation (1) would still hold.
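The construction ##A=QC^{-1}## can be sketched numerically. With C as above and ##Q=[v \mid \bar{v}]## for a complex eigenvector v of S, the product ##QC^{-1}## works out to the real matrix with columns Re(v) and Im(v), and this form is insensitive (up to an irrelevant rotation factor) to the arbitrary phase a numerical eigensolver puts on v. A NumPy sketch (my own, assuming the eigenvalues of S are a genuinely complex conjugate pair):

```python
import numpy as np

S = np.array([[1., 2.], [-1., -1.]])

# complex eigendecomposition S = Q Lam Q^{-1}
lam, Q = np.linalg.eig(S)
v = Q[:, 0]                    # eigenvector for one of the conjugate eigenvalues

# A = Q C^{-1} with C = [[1, 1], [i, -i]] reduces (up to a rotation
# factor on the right) to the real matrix [Re v | Im v]
A = np.column_stack([v.real, v.imag])

R = np.linalg.inv(A) @ S @ A             # the conjugated matrix
assert np.allclose(R @ R.T, np.eye(2))   # R is orthogonal
assert np.isclose(np.linalg.det(R), 1.0) # in fact a rotation
```

Rescaling v, or replacing it by a different eigenvector for the same eigenvalue, only conjugates R by a further rotation, which illustrates the non-uniqueness of A noted above.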
 
