# Transforming one matrix base to another

## Homework Statement

A representation of SO(3) can be given by ##3\times 3## matrices of the following form:

$$J_1=\frac{1}{\sqrt{2}}\left(\matrix{0&1&0\\1&0&1\\ 0&1&0}\right) \ \ ; \ \ J_2=\frac{1}{\sqrt{2}}\left(\matrix{0&-i&0\\i&0&-i\\ 0&i&0}\right) \ \ ; \ \ J_3=\left(\matrix{1&0&0\\0&0&0\\ 0&0&-1}\right)$$

On the other hand, the generators of rotations of SO(3) can also be expressed as ##3\times 3## matrices with the following form:

$$K_1=\left(\matrix{0&0&0\\0&0&-i\\ 0&i&0}\right) \ \ ; \ \ K_2=\left(\matrix{0&0&i\\0&0&0\\ -i&0&0}\right) \ \ ; \ \ K_3=\left(\matrix{0&-i&0\\i&0&0\\ 0&0&0}\right)$$

Since the three-dimensional representation of SO(3) is unique up to equivalence, these matrices should be the same but "disguised". In other words, there exists a similarity matrix ##M## that allows us to transform from the basis of the ##J_i## to that of the ##K_i##. Determine the explicit form of the matrix ##M##.

Hint: ##M## must be unitary to preserve Hermiticity.
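As a sanity check, both sets satisfy the so(3) commutation relations ##[X_a,X_b]=i\epsilon_{abc}X_c##. A quick numerical sketch (in Python/numpy here, rather than the Mathematica used later in the thread):

```python
import numpy as np

s = 1 / np.sqrt(2)
J = [s * np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=complex),
     s * np.array([[0, -1j, 0], [1j, 0, -1j], [0, 1j, 0]]),
     np.array([[1, 0, 0], [0, 0, 0], [0, 0, -1]], dtype=complex)]
K = [np.array([[0, 0, 0], [0, 0, -1j], [0, 1j, 0]]),
     np.array([[0, 0, 1j], [0, 0, 0], [-1j, 0, 0]]),
     np.array([[0, -1j, 0], [1j, 0, 0], [0, 0, 0]])]

def comm(A, B):
    """Matrix commutator [A, B]."""
    return A @ B - B @ A

# both sets satisfy [X_1, X_2] = i X_3 and its cyclic permutations
for X in (J, K):
    for a, b, c in ((0, 1, 2), (1, 2, 0), (2, 0, 1)):
        assert np.allclose(comm(X[a], X[b]), 1j * X[c])
```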

## Homework Equations

Similarity transformation:

$$X'=M^{-1}XM$$

## The Attempt at a Solution

Given a basis of vectors ##v=(v_1,v_2,...,v_n)##, it is easy to transform it to another basis of vectors ##w=(w_1,w_2,...,w_n)## by finding an appropriate similarity transformation matrix ##M##. However, when I have a basis of matrices ##\{J\}##, I can't find a way to transform it to another basis of matrices ##\{K\}##.

My first attempt was to find an ##M## such that ##K_i=M^{-1}J_iM##; I tried expressing the ##J_i## as linear combinations of the ##K_i##.

However, except for ##J_2=\frac{1}{\sqrt{2}}(K_1+K_2)##, we cannot do the same for the others (e.g., there's no way to express ##J_3## in terms of the ##K_i##).

My second attempt was to find a transformation matrix for each case, and so far I have only found three different matrices that transform from ##J_i## to ##K_i## like this:

$$M_1=\left(\matrix{-\frac{1}{2}&0&\frac{1}{2}\\0&-\frac{i}{2}&0\\ \frac{1}{2}&0&\frac{1}{2}}\right) \ \ \ so \ that \ K_1=(M_1)^{-1}J_1 M_1$$

$$M_2=\left(\matrix{0&-\frac{1}{\sqrt{2}}&0\\ \frac{1}{2}&0&\frac{1}{2}\\ -\frac{1}{2}&0&\frac{1}{2}}\right) \ \ \ so \ that \ K_2=(M_2)^{-1}J_2 M_2$$

$$M_3=\left(\matrix{-i&0&i\\1&0&1\\ 0&1&0}\right) \ \ \ so\ that\ K_3=(M_3)^{-1}J_3 M_3$$

However, the statement of the problem says that the matrix ##M## should be unique, and I can't find a way to combine the above matrices (I've tried every possible permutation over the last week).

StoneTemplePython
Gold Member
I didn't follow what you are doing here. Your hint suggests that ##M## is unitary, which means, among other things, that each column has length (2-norm) 1... but the middle columns of your ##M_1## and ##M_2## each have length ##< 1##?

- - - -
All ##J_i## and ##K_i## are Hermitian, hence we know they are diagonalizable. This means ##J_i## and ##K_i## can each be written as similar to the same diagonal matrix. Hence you have an equality between them, with ##D_i## as the bridge in between. You can use this equality, with some algebraic manipulation, to get the desired similarity transform...

Thank you for your reply. I see, I didn't know that one of the conditions of unitarity was that each column has a length of 1. So indeed, ##M_1## and ##M_2## are not unitary.

From what you're saying, since all those matrices are diagonalizable, then I could write the following equalities:

$$Dj_{1}^{-1}J_1 Dj_1=Dj_{2}^{-1}J_2 Dj_2=Dj_{3}^{-1}J_3 Dj_3=Dk_{1}^{-1}K_1 Dk_1=Dk_{2}^{-1}K_2 Dk_2=Dk_{3}^{-1}K_3 Dk_3$$

StoneTemplePython
Gold Member
Thank you for your reply. I see, I didn't know that one of the conditions of unitarity was that each column has a length of 1. So indeed, ##M_1## and ##M_2## are not unitary.
Can you tell me what your working definition of unitary is? It's intimately tied in with the idea of orthonormal vectors, so I'm not understanding this comment...

From what you're saying, since all those matrices are diagonalizable, then I could write the following equalities:

$$Dj_{1}^{-1}J_1 Dj_1=Dj_{2}^{-1}J_2 Dj_2=Dj_{3}^{-1}J_3 Dj_3=Dk_{1}^{-1}K_1 Dk_1=Dk_{2}^{-1}K_2 Dk_2=Dk_{3}^{-1}K_3 Dk_3$$
I don't think I said this...

start with ##K_1## and ##J_1##. What does it mean for them to be similar to the same diagonal matrix?
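Concretely, one can check numerically that ##J_1## and ##K_1## share the spectrum ##\{-1,0,1\}##, so each is similar to the same ##D=\mathrm{diag}(-1,0,1)## (a numpy sketch; `eigvalsh` is for Hermitian matrices and returns the eigenvalues sorted ascending):

```python
import numpy as np

s = 1 / np.sqrt(2)
J1 = s * np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=complex)
K1 = np.array([[0, 0, 0], [0, 0, -1j], [0, 1j, 0]])

# Hermitian matrices: eigvalsh returns real eigenvalues, sorted ascending
ej = np.linalg.eigvalsh(J1)
ek = np.linalg.eigvalsh(K1)

# both spectra are {-1, 0, 1}, so both are similar to D = diag(-1, 0, 1)
assert np.allclose(ej, [-1, 0, 1])
assert np.allclose(ek, [-1, 0, 1])
```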

Can you tell me what your working definition of unitary is? It's intimately tied in with the idea of orthonormal vectors, so I'm not understanding this comment...
My definition of unitary matrix is any matrix ##U## whose conjugate transpose is equal to its inverse:

$$U U^{\dagger}=U^{\dagger} U=1$$.

As a consequence, U is normal, diagonalizable, and has a determinant equal to 1.

I don't think I said this...

start with ##K_1## and ##J_1##. What does it mean for them to be similar to the same diagonal matrix?
Sorry, I probably didn't understand at the beginning. For ##K_1## and ##J_1## to be similar, there must exist a single invertible matrix, say ##P##, that simultaneously diagonalizes both:

$$P^{-1}J_1 P=Dj_1 \ \ \ \ \ \ \ P^{-1}K_1 P=Dk_1$$

StoneTemplePython
Gold Member
My definition of unitary matrix is any matrix ##U## whose conjugate transpose is equal to its inverse:

$$U U^{\dagger}=U^{\dagger} U=1$$.

As a consequence, U is normal, diagonalizable, and has a determinant equal to 1.
No. The determinant has magnitude equal to 1, but its value may be anywhere on the unit circle. If you do blocked (by vector) multiplication of ##\mathbf U^* \mathbf U = \mathbf I## then it should be obvious that it implies the 2-norm of each column is 1... this is an exercise you need to do.

Sorry, I probably didn't understand at the beginning. For ##K_1## and ##J_1## to be similar, there must exist a single invertible matrix, say ##P##, that simultaneously diagonalizes both:

$$P^{-1}J_1 P=P^{-1}K_1 P=D$$
I'm not sure what this means / it seems like a non sequitur. Simultaneous diagonalizability follows, but it is a more advanced/complicated concept.
- - - -
edit:
for avoidance of doubt these are very different concepts so I should not have said one 'follows'. Sticking in the nice world of normal matrices where everything is diagonalizable,

similar matrices = they have the same eigenvalues (with same multiplicities). I would say that these are algebraic properties (though there is information related to the minimal polynomial here that I won't go into). Since the similarity transforms are unitary, they preserve 'length' and this tells you that the matrices have the same singular values.

simultaneously diagonalizable in general doesn't tell you anything about the eigenvalues and tells you a lot about the eigenvectors (or if you prefer: commutativity). These are geometric properties.

- - - -

I'll repeat:

start with ##K_1## and ##J_1##. What does it mean for them to be similar to the same diagonal matrix?

i.e. write it out via individual similarity transform, not simultaneous diagonalizability...

No. The determinant has magnitude equal to 1, but its value may be anywhere on the unit circle. If you do blocked (by vector) multiplication of ##\mathbf U^* \mathbf U = \mathbf I## then it should be obvious that it implies the 2-norm of each column is 1... this is an exercise you need to do.
You're right, I forgot that the determinant could be a complex number with magnitude equal to 1. Moreover, I see now where the 2-norm condition comes from; I'll work through the proof to convince myself.

I'm not sure what this means / it seems like a non sequitur. Simultaneous diagonalizability follows, but it is a more advanced/complicated concept. I'll repeat:

start with ##K_1## and ##J_1##. What does it mean for them to be similar to the same diagonal matrix?

i.e. write it out via individual similarity transform, not simultaneous diagonalizability...
It seems I confused simultaneously diagonalizable matrices (which requires the matrices to commute) with similarity transformations. However, for them to be similar to the same diagonal matrix implies that there exist transformations ##P_{j_1}## and ##P_{k_1}## for, let's say, ##J_1## and ##K_1## such that:

$$Pj_{1}^{-1}J_1Pj_{1}=D_1=Pk_{1}^{-1}K_1Pk_{1}$$

Edit: In this case, I can do a bit of algebra by multiplying on both sides and get:

$$M_{1}^{-1} J_1 M_{1}=K_{1} \ \ \ \ \ \ where: \ \ M_{1}=Pj_{1}Pk_{1}^{-1}$$
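This algebra can be sketched numerically (using numpy's `eigh` as a stand-in for the thread's Mathematica workflow: it sorts eigenvalues ascending and returns orthonormal eigenvectors, so both matrices are brought to the same diagonal matrix):

```python
import numpy as np

s = 1 / np.sqrt(2)
J1 = s * np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=complex)
K1 = np.array([[0, 0, 0], [0, 0, -1j], [0, 1j, 0]])

# eigh sorts eigenvalues ascending and returns orthonormal eigenvectors,
# so both matrices are brought to the SAME diagonal D = diag(-1, 0, 1)
_, Pj1 = np.linalg.eigh(J1)
_, Pk1 = np.linalg.eigh(K1)

M1 = Pj1 @ np.linalg.inv(Pk1)                        # M_1 = P_{j_1} P_{k_1}^{-1}
assert np.allclose(np.linalg.inv(M1) @ J1 @ M1, K1)  # M_1^{-1} J_1 M_1 = K_1
assert np.allclose(M1.conj().T @ M1, np.eye(3))      # and M_1 is unitary
```

Because the eigenvectors returned by `eigh` are orthonormal, ##P_{k_1}^{-1}=P_{k_1}^\dagger## and the product of the two unitary factors is automatically unitary.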

StoneTemplePython
Gold Member
Edit: In this case, I can do a bit of algebra by multiplying on both sides and get:

$$M_{1}^{-1} J_1 M_{1}=K_{1} \ \ \ \ \ \ where: \ \ M_{1}=Pj_{1}Pk_{1}^{-1}$$
this is probably right but very hard to read... writing it with actual subscripts as

##M_{1}=P_{j_1}P_{k_1}^{-1}##

is a lot easier to read.

If it were me I wouldn't use ##P## for both... it seems too much like they have the same eigenvectors. I'd use ##U## for one of them, so

##M_{1}=U_{j_1}P_{k_1}^{-1} =U_{j_1}P_{k_1}^{*}##

Indeed, I could improve my notation a bit. On the other hand, assuming that's right, it is just a matter of repeating the process for the rest of the matrices, since they should all admit similarity transformations built with the same idea.

Actually, by doing that I computed exactly the same matrices ##M_1, M_2, M_3## as the ones in the original post (just verified in Mathematica).

$$M_{1}^{-1} J_1 M_{1}=K_{1} \ \ \ \ ; \ \ \ \ \ \ \ \ M_1=U_{j1}P_{k1}^{-1}$$

$$M_{2}^{-1} J_2 M_{2}=K_{2}\ \ \ \ ; \ \ \ \ \ \ \ \ M_2=U_{j2}P_{k2}^{-1}$$

$$M_{3}^{-1} J_3 M_{3}=K_{3}\ \ \ \ ; \ \ \ \ \ \ \ \ M_3=U_{j3}P_{k3}^{-1}$$

Maybe I could now find a way to construct a single, unitary matrix ##M## that transforms all the matrices ##J_i## into ##K_i##.

StoneTemplePython
Gold Member
Indeed, I could improve my notation a bit. On the other hand, assuming that's right, it is just a matter of repeating the process for the rest of the matrices, since they should all admit similarity transformations built with the same idea.

Actually, by doing that I got exactly the same matrices ##M_1, M_2, M_3## as the ones in the original post.
The ##M_1##, ##M_2## and ##M_3## in your original post are not unitary. But the product of two unitary matrices is a unitary matrix. This is a contradiction which means you are doing something wrong.

Sometimes repeating the process and cleaning up notation is the only way to flush out the mistakes.

The ##M_1##, ##M_2## and ##M_3## in your original post are not unitary. But the product of two unitary matrices is a unitary matrix. This is a contradiction which means you are doing something wrong.

Sometimes repeating the process and cleaning up notation is the only way to flush out the mistakes.
Thank you. Then my problem is basically in how I'm computing the matrices ##M_i##, so I'll work again from scratch to find any potential mistakes I'm missing. I appreciate your help today; I got a better understanding of this aspect of linear algebra.

StoneTemplePython
Gold Member
Thank you. Then my problem is basically in how I'm computing the matrices ##M_i##, so I'll work again from scratch to find any potential mistakes I'm missing. I appreciate your help today; I got a better understanding of this aspect of linear algebra.
The issue is that I can't see how you're actually coming up with these things (though my sense is you're using built-in Mathematica commands without the special handling that is needed), but I'll take a shot in the dark: try normalizing each column to have length one. So instead of

##
M_1=\left(\matrix{-\frac{1}{2}&0&\frac{1}{2}\\0&-\frac{i}{2}&0\\ \frac{1}{2}&0&\frac{1}{2}}\right) ##

do

##
M_1:=\left(\matrix{-\frac{1}{2}&0&\frac{1}{2}\\0&-\frac{i}{2}&0\\ \frac{1}{2}&0&\frac{1}{2}}\right)
\left(\begin{matrix}\sqrt{2} & 0 & 0\\0 & 2 & 0\\0 & 0 & \sqrt{2}\end{matrix}\right)
##

Regarding how I computed the diagonalizing transformation matrices for each case, I followed the standard diagonalization procedure:

1) Compute the eigenvectors of the matrix, e.g., for ##J_1##.
2) With those eigenvectors, build a matrix whose columns are the eigenvectors obtained (following our notation, this would be ##U_{j1}##). Find the inverse of this matrix.
3) The diagonal matrix is given by: ##D_1=U_{j1}^{-1} J_1 U_{j1}##

I used Mathematica to find the eigenvectors and to compute the inverse. The rest I constructed manually.

What I may be missing is the proper normalization of the eigenvectors before building the transformation matrix. I'll try what you suggested and normalize each column to have length one for the two matrices that need it.

StoneTemplePython
Gold Member
Regarding how I computed the diagonalizing transformation matrices for each case, I followed the standard diagonalization procedure:

1) Compute the eigenvectors of the matrix, e.g., for ##J_1##.
2) With those eigenvectors, build a matrix whose columns are the eigenvectors obtained (following our notation, this would be ##U_{j1}##). Find the inverse of this matrix.
3) The diagonal matrix is given by: ##D_1=U_{j1}^{-1} J_1 U_{j1}##

I used Mathematica to find the eigenvectors and to compute the inverse. The rest I constructed manually.

What I may be missing is the proper normalization of the eigenvectors before building the transformation matrix. I'll try what you suggested and normalize each column to have length one for the two matrices that need it.
The issue is that your eigenvector equation is, for some ##\mathbf x \neq \mathbf 0##,

##A \mathbf x = \lambda \mathbf x##
but you can re-run it with ##\frac{1}{2}\mathbf x##
so

##A \big(\frac{1}{2}\mathbf x\big) =\frac{1}{2}A \mathbf x = \frac{1}{2} \lambda \mathbf x = \lambda \big(\frac{1}{2}\mathbf x\big)##

so ##\big(\frac{1}{2}\mathbf x\big)## is still an eigenvector with the same eigenvalue. But the choice of rescaling was arbitrary. In fact it is you who must choose the 'best' scaling.

Since the matrix collecting the eigenvectors (call it ##\mathbf U##) must be unitary, you don't need to 'find' or 'compute' the inverse. If you did it right, the inverse is simply the conjugate transpose of ##\mathbf U##. No work needed.
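Both points (the arbitrary scaling of eigenvectors, and the inverse coming "for free") can be seen numerically. A numpy sketch, where `eigh` already returns unit-length eigenvectors for Hermitian input:

```python
import numpy as np

s = 1 / np.sqrt(2)
J1 = s * np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=complex)

w, U = np.linalg.eigh(J1)   # eigh already returns unit-length eigenvectors

# every column has 2-norm 1 ...
assert np.allclose(np.linalg.norm(U, axis=0), 1.0)
# ... so the inverse is just the conjugate transpose: no inversion needed
assert np.allclose(np.linalg.inv(U), U.conj().T)

# a rescaled eigenvector is still an eigenvector with the same eigenvalue ...
x = 0.5 * U[:, 0]
assert np.allclose(J1 @ x, w[0] * x)
# ... but a matrix of unnormalized eigenvectors is no longer unitary
V = 0.5 * U
assert not np.allclose(V.conj().T @ V, np.eye(3))
```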

After trying all day, I managed to finally compute (applying the proper scaling) the following transformation matrices to convert from ##K_i## to ##J_i##:

$$M_1=\frac{\sqrt{2}}{2}\left(\matrix{-1&0&1\\ 0&-i\sqrt{2}&0\\ 1&0&1}\right) \ \ \ so \ that \ J_1=(M_1)^{-1}K_1 M_1$$

$$M_2=\frac{\sqrt{2}}{2}\left(\matrix{0&1&-1\\ -\sqrt{2}&0&0\\ 0&1&1}\right) \ \ \ so \ that \ J_2=(M_2)^{-1}K_2 M_2$$

$$M_3=\sqrt{2}\left(\matrix{-\frac{i}{2}&\frac{1}{2}&0\\ 0&0&\frac{1}{\sqrt{2}}\\ -\frac{i}{2}&\frac{1}{2}&0}\right) \ \ \ so\ that \ J_3=(M_3)^{-1}K_3 M_3$$

Since all of these are unitary, any product of the three is also a unitary matrix (which I also tested). Moreover, the conjugate transpose is equal to the inverse.

However, even combining the matrices into one doesn't give me the single correct ##M## that transforms all the ##K_i## into ##J_i##. Therefore, I'm still stuck on how to build a single matrix to convert the basis.
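One systematic route to a single ##M## with ##M^{-1}J_iM=K_i## for all ##i## (a sketch of an alternative approach, not necessarily the one the problem intends) is to treat ##J_iM=MK_i## as a linear system in the entries of ##M##, using the column-major identity ##\operatorname{vec}(AXB)=(B^T\otimes A)\operatorname{vec}(X)##, and extract its common null space:

```python
import numpy as np

s = 1 / np.sqrt(2)
J = [s * np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=complex),
     s * np.array([[0, -1j, 0], [1j, 0, -1j], [0, 1j, 0]]),
     np.array([[1, 0, 0], [0, 0, 0], [0, 0, -1]], dtype=complex)]
K = [np.array([[0, 0, 0], [0, 0, -1j], [0, 1j, 0]]),
     np.array([[0, 0, 1j], [0, 0, 0], [-1j, 0, 0]]),
     np.array([[0, -1j, 0], [1j, 0, 0], [0, 0, 0]])]

I3 = np.eye(3)
# J_i M = M K_i  <=>  (I (x) J_i - K_i^T (x) I) vec(M) = 0   (column-major vec)
A = np.vstack([np.kron(I3, Ji) - np.kron(Ki.T, I3) for Ji, Ki in zip(J, K)])
_, sv, Vh = np.linalg.svd(A)
assert sv[-1] < 1e-12                        # the common null space is nontrivial

M = Vh[-1].conj().reshape(3, 3, order="F")   # null vector back into matrix form
# M^dagger M is proportional to the identity; rescale to make M unitary
M /= np.sqrt((M.conj().T @ M)[0, 0].real)

for Ji, Ki in zip(J, K):
    assert np.allclose(np.linalg.inv(M) @ Ji @ M, Ki)   # one M for all three
assert np.allclose(M.conj().T @ M, np.eye(3))
```

Since the two spin-1 representations are irreducible and equivalent, Schur's lemma says the space of intertwiners is one-dimensional, so ##M## is unique up to an overall phase; rescaling it as above makes it unitary, consistent with the hint.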

StoneTemplePython
Gold Member
After trying all day, I managed to finally compute (applying the proper scaling) the following transformation matrices to convert from ##K_i## to ##J_i##:

$$M_1=\frac{\sqrt{2}}{2}\left(\matrix{-1&0&1\\ 0&-i\sqrt{2}&0\\ 1&0&1}\right) \ \ \ so \ that \ J_1=(M_1)^{-1}K_1 M_1$$
Why did it take you all day? I gave you this matrix verbatim in post 12. The idea needed for ##M_2## and ##M_3## follows almost immediately.

However, even combining the matrices into one doesn't give me the single correct ##M## that transforms all the ##K_i## into ##J_i##. Therefore, I'm still stuck on how to build a single matrix to convert the basis.
As I've said before, I don't know what you are doing because you don't show your work. You need to write out the matrix algebra here to actually explore the problem.

Why did it take you all day? I gave you this matrix verbatim in post 12. The idea needed for ##M_2## and ##M_3## follows almost immediately.
I verified everything from scratch and redid my calculations to be sure nothing was missing. Regarding the verbatim matrix, I applied it directly but didn't get the correct normalization:

##
M'_1:=\left(\matrix{-\frac{1}{2}&0&\frac{1}{2}\\0&-\frac{i}{2}&0\\ \frac{1}{2}&0&\frac{1}{2}}\right)
\left(\begin{matrix}\sqrt{2} & 0 & 0\\0 & 2 & 0\\0 & 0 & \sqrt{2}\end{matrix}\right)=\left(\matrix{-\frac{1}{\sqrt{2}}&0&0\\ 0&-i\sqrt{2}&0\\ 0&0&\frac{i}{\sqrt{2}}}\right)
##

Which is still non-unitary. Therefore I kept playing with the values until I made sure my matrix was unitary (which I checked by comparing the inverse with the conjugate transpose). Moreover, I realized I was actually looking for the inverse transformation: instead of going from ##J_i## to ##K_i## I had to go from ##K_i## to ##J_i##, so I recalculated the matrices.

As I've said before, I don't know what you are doing because you don't show your work. You need to write out the matrix algebra here to actually explore the problem.
Sorry. After obtaining the matrices ##M_1, M_2, M_3## above, which are unitary, I tried taking the following products (which are all unitary):

$$P_1=M_1M_2=\left(\matrix{0&0&1\\ -i&0&0\\ 0&1&0}\right)$$

$$P_2=M_2M_3=\left(\matrix{\frac{i}{\sqrt{2}}&-\frac{1}{2}&\frac{1}{\sqrt{2}}\\ -\frac{i}{\sqrt{2}}&-\frac{1}{\sqrt{2}}&0\\ -\frac{i}{2}&\frac{1}{2}&\frac{1}{\sqrt{2}}}\right)$$

$$P_3=M_1M_3=\left(\matrix{-i&0&0\\ 0&0&i\\ 0&1&0}\right)$$

$$P_4=M_1M_2M_3=\left(\matrix{-\frac{i}{\sqrt{2}}&\frac{1}{\sqrt{2}}&0\\ \frac{1}{\sqrt{2}}&-\frac{i}{\sqrt{2}}&0\\ 0&0&1}\right)$$

Next, to verify whether one of these transforms all the matrices from one basis to the other, I applied a similarity transformation. To simplify, let's try to convert ##K_1## into ##J_1## using one of these matrices:

$$P_1 K_1 P_{1}^{-1}=\left(\matrix{0&0&\frac{1}{\sqrt{2}}\\ 0&0&-\frac{i}{\sqrt{2}}\\ \frac{1}{\sqrt{2}}&\frac{i}{\sqrt{2}}&0}\right)\neq J_1$$

$$P_2 K_2 P_{2}^{-1}=\left(\matrix{-\frac{1}{\sqrt{2}}&\frac{1}{4}(-2i-\sqrt{2})&0\\ \frac{1}{4}(2i-\sqrt{2})&0&\frac{1}{4}(-2i-\sqrt{2})\\ 0&\frac{1}{4}(2i-\sqrt{2})&\frac{1}{2}}\right)\neq J_1$$

$$P_3 K_3 P_{3}^{-1}=\left(\matrix{0&0&-\frac{i}{\sqrt{2}}\\ 0&0&\frac{i}{\sqrt{2}}\\ \frac{i}{\sqrt{2}}&-\frac{i}{\sqrt{2}}&0}\right)\neq J_1$$

$$P_4 K_1 P_{4}^{-1}=\left(\matrix{0&\frac{1}{\sqrt{2}}&\frac{1}{2}\\ \frac{1}{\sqrt{2}}&0&-\frac{i}{2}\\ \frac{1}{2}&\frac{i}{2}&0}\right)\neq J_1$$

Since none of the above satisfies even the first transformation, they aren't what I'm looking for. It is worth noting that, individually, the matrix ##M_1## does convert ##K_1## to ##J_1##:

$$M_{1}^{-1} K_1 M_{1}=\frac{1}{\sqrt{2}}\left(\matrix{0&1&0 \\ 1&0&1 \\ 0&1&0} \right)=J_1$$

But it doesn't convert, for example, ##K_2## into ##J_2##, so it still isn't the matrix I need:

$$M_{1}^{-1} K_2 M_{1}=\left(\matrix{0&0&-i \\ 0&0&0 \\ i&0&0} \right)\neq J_2$$

StoneTemplePython
Gold Member
I verified everything from scratch and redid my calculations to be sure nothing was missing. Regarding the verbatim, I actually applied it directly but I didn't get the correct normalization:

##
M'_1:=\left(\matrix{-\frac{1}{2}&0&\frac{1}{2}\\0&-\frac{i}{2}&0\\ \frac{1}{2}&0&\frac{1}{2}}\right)
\left(\begin{matrix}\sqrt{2} & 0 & 0\\0 & 2 & 0\\0 & 0 & \sqrt{2}\end{matrix}\right)=\left(\matrix{-\frac{1}{\sqrt{2}}&0&0\\ 0&-i\sqrt{2}&0\\ 0&0&\frac{i}{\sqrt{2}}}\right)
##

Which is still non-unitary. Therefore I kept playing with the values until I made sure that my matrix was unitary (which I checked by comparing the inverse and the conjugate transpose).
No. This is not how matrix multiplication works. There is something seriously wrong here in this thread.

No. This is not how matrix multiplication works. There is something seriously wrong here in this thread.
You're right, I apologize. Indeed, I don't know why I got that result in the first place; I checked again (and even verified with a matrix calculator) and actually got the following:

$$M'_1=\frac{1}{\sqrt{2}}\left(\matrix{-1&0&1\\ 0&-i\sqrt{2}&0\\ 1&0&1}\right)$$

Which is exactly the unitary matrix I obtained after performing the manual scaling.
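The corrected product and its action can be checked numerically (a numpy sketch):

```python
import numpy as np

s = 1 / np.sqrt(2)
M1 = s * np.array([[-1, 0, 1],
                   [0, -1j * np.sqrt(2), 0],
                   [1, 0, 1]])
J1 = s * np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=complex)
K1 = np.array([[0, 0, 0], [0, 0, -1j], [0, 1j, 0]])

assert np.allclose(M1.conj().T @ M1, np.eye(3))      # M_1 is unitary
assert np.allclose(np.linalg.inv(M1) @ K1 @ M1, J1)  # M_1^{-1} K_1 M_1 = J_1
```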
