Rotational invariance of cross product matrix operator

  • #1
Filip Larsen
Gold Member
TL;DR Summary
Does the rotational invariance of the vector cross product also carry over to the cross product matrix operator?
Given that the normal vector cross product is rotationally invariant, that is $$\mathbf R(a\times b) = (\mathbf R a)\times(\mathbf R b),$$ where ##a, b \in \mathbb{R}^3## are two arbitrary (column) vectors and ##\mathbf R## is a 3x3 rotation matrix, and given the cross product matrix operator defined by $$ \left[a\right]_\times = \begin{bmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0 \end{bmatrix} ,$$ such that ##a \times b = \left[a\right]_\times b##, my question is whether rotational invariance also applies to this operator, that is, whether it holds in general that $$\mathbf R \left[ a \right]_\times \stackrel{?}{=} \left[ \mathbf R a \right]_\times \mathbf R .$$ Specifically, for my current use with ##\mathbf C## being a 3x3 (positive semi-definite) matrix, and using that ##\mathbf R^{-1} = \mathbf R^T## holds for a rotation matrix, can I then conclude that $$ \left[\mathbf R a \right]_\times \mathbf C \left[ \mathbf R a \right]_\times^T = \left( \mathbf R \left[ a \right]_\times \mathbf R^T \right) \mathbf C \left(\mathbf R \left[ a \right]_\times \mathbf R^T \right)^T = \left( \mathbf R \left[ a \right]_\times \right) \left( \mathbf R^T \mathbf C \mathbf R \right) \left( \mathbf R \left[ a \right]_\times \right)^T $$ always holds, as I am inclined to believe?

My (engineering) intuition tells me that the relation with the question mark should hold: applied to a column vector it follows from ##\left( \mathbf R \left[ a \right]_\times \right) b = \mathbf R \left( a\times b \right) = \left(\mathbf R a\right) \times \left(\mathbf R b\right) = \left( \left[ \mathbf R a \right]_\times\right) \left( \mathbf R b \right) = \left( \left[ \mathbf R a \right]_\times \mathbf R \right) b##, and since an equation involving multiplication of a 3x3 matrix can be separated into 3 equations, one for each column vector of that matrix, the relation should also hold when the columns are combined back into a general 3x3 matrix. I worry, though, that such mathematical hand waving has holes in it that my engineering intuition can't see.
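A quick numerical sanity check of the matrix identity is easy to do; below is a minimal NumPy sketch (the `skew` helper is just an ad hoc implementation of ##\left[\cdot\right]_\times##, and the random rotation is built by orthogonalising a random matrix):

```python
import numpy as np

def skew(a):
    # Cross product matrix [a]_x, so that skew(a) @ b == np.cross(a, b)
    return np.array([[0.0, -a[2], a[1]],
                     [a[2], 0.0, -a[0]],
                     [-a[1], a[0], 0.0]])

rng = np.random.default_rng(0)
a = rng.normal(size=3)

# Random proper rotation: orthogonalise a random matrix and fix the determinant sign
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1.0
R = Q

print(np.allclose(R @ skew(a), skew(R @ a) @ R))  # expected: True
```

Of course a numerical check on random inputs is only evidence, not a proof.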
 
  • #2
If we had conjugation as the operation, ##a\longmapsto \mathbf{R}a\mathbf{R}^{-1}##, then invariance would follow directly from the Lie group ##SO(3)## acting on its Lie algebra.

If we only have a rotation once on the left, I think the quickest way is simply to check it. The situation is symmetric in all coordinates, so it is sufficient to check a rotation by a given angle around the z-axis. Thus you get either a proof or a counterexample: ##\mathbf{R}_z(\varphi)([a]_{\times} b) \stackrel{?}{=} [\mathbf{R}_z(\varphi)a]_\times (\mathbf{R}_z(\varphi)b) .##
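If one prefers to let a computer do the bookkeeping, here is a minimal SymPy sketch of that single-axis check (`skew` and `Rz` are just ad hoc helper names):

```python
import sympy as sp

a1, a2, a3, b1, b2, b3, phi = sp.symbols('a1 a2 a3 b1 b2 b3 phi', real=True)

def skew(v):
    # [v]_x such that skew(v) * w equals the cross product v x w
    return sp.Matrix([[0, -v[2], v[1]],
                      [v[2], 0, -v[0]],
                      [-v[1], v[0], 0]])

Rz = sp.Matrix([[sp.cos(phi), -sp.sin(phi), 0],
                [sp.sin(phi),  sp.cos(phi), 0],
                [0,            0,           1]])

a = sp.Matrix([a1, a2, a3])
b = sp.Matrix([b1, b2, b3])

lhs = Rz * (skew(a) * b)
rhs = skew(Rz * a) * (Rz * b)
print(sp.simplify(lhs - rhs))  # expected: the zero vector
```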
 
  • #3
fresh_42 said:
Thus you get either a proof or a counterexample: ##\mathbf{R}_z(\varphi)([a]_{\times} b) \stackrel{?}{=} [\mathbf{R}_z(\varphi)a]_\times (\mathbf{R}_z(\varphi)b)##.

In my answer below I assume you mean ##\mathbf{R}_z(\varphi)([a]_{\times} b) \stackrel{?}{=} [\mathbf{R}_z(\varphi)a]_\times (\mathbf{R}_z(\varphi)b)##, with ##b## being a 3x3 matrix.

I was rather hoping not to have to write out the full equations in scalars, even if it only involves rotation around a single axis. I know I am more or less repeating my question from before, but to prove it when ##b## is a matrix, would it then be sufficient to decompose ##b## into column vectors, apply the relation known to be true when ##b## is a vector, and then assemble the columns back again to show that you end up with the same final matrix?
 
  • #4
Filip Larsen said:
In my answer below I assume you mean ##\mathbf{R}_z(\varphi)([a]_{\times} b) \stackrel{?}{=} [\mathbf{R}_z(\varphi)a]_\times (\mathbf{R}_z(\varphi)b)##, with ##b## being a 3x3 matrix.
No, ##b## remains a vector.
I was rather hoping not to have to write out the full equations in scalars, even if it only involves rotation around a single axis.
It amounts to two matrix multiplications and two matrix-times-vector multiplications. That's not too many to do. As I said, the natural operation would be a conjugation, in which case there is nothing left to prove. But in that case we would have (in Lie algebra notation) ##R[a,b]R^{-1}=RabR^{-1}-RbaR^{-1}##, i.e. the rotations in the middle cancel. If they don't, as in your case, then the symmetry is lost: ##Rab-Rba\stackrel{?}{=}RaRb-RbRa##. This means ##ab-ba=a\times b \stackrel{?}{=}aRb-bRa##. This doesn't look true, although intuition says it is. I would look for a counterexample, that is, you have to do the calculation anyway.
I know I am more or less repeating my question from before, but to prove it when ##b## is a matrix would it then be sufficient to decompose ##b## into column vectors, apply the relation known to be true when ##b## is a vector, and the assemble it back again to show you end up with the same final vector?
I don't see how ##b## is a matrix.
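For reference, the shorthand ##ab-ba=a\times b## above corresponds to the matrix identity ##[a]_\times[b]_\times - [b]_\times[a]_\times = [a\times b]_\times## in ##\mathfrak{so}(3)##. A minimal NumPy spot check of it (same ad hoc `skew` helper as in the sketch in post #1):

```python
import numpy as np

def skew(a):
    # Cross product matrix [a]_x, so that skew(a) @ b == np.cross(a, b)
    return np.array([[0.0, -a[2], a[1]],
                     [a[2], 0.0, -a[0]],
                     [-a[1], a[0], 0.0]])

rng = np.random.default_rng(1)
a, b = rng.normal(size=3), rng.normal(size=3)

commutator = skew(a) @ skew(b) - skew(b) @ skew(a)
print(np.allclose(commutator, skew(np.cross(a, b))))  # expected: True
```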
 
  • #5
fresh_42 said:
I don't see how ##b## is a matrix.

Let me go with that first as I suspect we are perhaps misunderstanding each other here.

As I tried to indicate in my first post, I know from the properties of the cross product matrix operator ##\left[\cdot\right]_\times## and from the rotational invariance of the vector cross product that $$ \begin{align}
\mathbf R \left[a\right]_\times b & = \mathbf R \left( \left[a\right]_\times b \right) \nonumber \\
& = \mathbf R(a\times b) \nonumber \\
& = (\mathbf R a)\times(\mathbf R b) \nonumber \\
& = \left[ \mathbf R a \right]_\times (\mathbf R b) \nonumber \\
& = \left( \left[ \mathbf R a \right]_\times \mathbf R \right) b \nonumber
\end{align} ,$$ where, as before, ##a, b## are vectors and ##\mathbf R## is a rotation matrix. This implies (as far as I can see) that $$\mathbf R \left[a\right]_\times = \left[ \mathbf R a \right]_\times \mathbf R$$ is a valid matrix equality. So, if you are suggesting that I prove ##\mathbf{R}_z(\varphi)([a]_{\times} b) \stackrel{?}{=} [\mathbf{R}_z(\varphi)a]_\times (\mathbf{R}_z(\varphi)b)## with ##b## being a vector, then I would claim I already know this to be true.

What I am asking, to repeat, is whether there is any reason why the same shouldn't also hold, as I expect it does, when applied to a 3x3 matrix ##\mathbf B##, that is, whether $$ \mathbf{R}([a]_{\times}\mathbf B) = [\mathbf{R}a]_\times \mathbf{R}\mathbf B$$ holds, or, even more to the point regarding my intended use, whether $$ \left[\mathbf R a \right]_\times \mathbf C \left[ \mathbf R a \right]_\times^T = \left( \mathbf R \left[ a \right]_\times \right) \left( \mathbf R^T \mathbf C \mathbf R \right) \left( \mathbf R \left[ a \right]_\times \right)^T $$ holds.
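Both forms are easy to spot check numerically; here is a minimal NumPy sketch (same ad hoc `skew` helper and random-rotation construction as in post #1, with random stand-ins for ##\mathbf B## and ##\mathbf C##):

```python
import numpy as np

def skew(a):
    # Cross product matrix [a]_x, so that skew(a) @ b == np.cross(a, b)
    return np.array([[0.0, -a[2], a[1]],
                     [a[2], 0.0, -a[0]],
                     [-a[1], a[0], 0.0]])

rng = np.random.default_rng(2)
a = rng.normal(size=3)
B = rng.normal(size=(3, 3))              # arbitrary 3x3 matrix
S = rng.normal(size=(3, 3))
C = S @ S.T                              # positive semi-definite stand-in for C

Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1.0                      # make it a proper rotation
R = Q

print(np.allclose(R @ skew(a) @ B, skew(R @ a) @ R @ B))
print(np.allclose(skew(R @ a) @ C @ skew(R @ a).T,
                  (R @ skew(a)) @ (R.T @ C @ R) @ (R @ skew(a)).T))
# expected: True, True
```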
 
  • #6
How is the third equation true? I thought that was the problem. If it is true, then you can of course multiply it by any matrix you want.
 
  • #7
fresh_42 said:
How is the third equation true?

If you mean ##\mathbf R(a \times b) = (\mathbf Ra)\times(\mathbf Rb)## then that is true due to the rotational invariance of the cross product.

fresh_42 said:
If it is, then you can of course multiply it by any matrix that you want.

OK, assuming this is a genuine math-grade "of course", then I guess I was just worried about nothing :smile:
 
  • #8
Filip Larsen said:
OK, assuming this is a genuine math-grade "of course", then I guess I was just worried about nothing :smile:
Yep.
 
