How Can We Prove the Transformation of Cross Product Under Orthogonal Matrices?

SUMMARY

The discussion centers on proving how the cross product transforms under orthogonal matrices, specifically the identity (Av) × (Aw) = (det A) A(v × w) for A ∈ O(3) and vectors v, w ∈ ℝ³, which is a special case of the general identity (Av) × (Aw) = (det A)(A⁻¹)ᵀ(v × w). Two proofs are given: one works in an eigenbasis of A (and therefore assumes A is a normal matrix), reducing A to diagonal form and using properties of determinants and inverses; the other expresses the cofactor matrix of A through cross products of its columns and the Levi-Civita symbol, and holds for any invertible matrix.

PREREQUISITES
  • Understanding of orthogonal matrices (O(3))
  • Familiarity with cross products in three-dimensional space
  • Knowledge of determinants and matrix inverses
  • Basic concepts of eigenvalues and eigenvectors
NEXT STEPS
  • Study the properties of orthogonal matrices in linear algebra
  • Learn about spectral decomposition and its applications
  • Explore the relationship between determinants and matrix inverses
  • Investigate the role of pseudovectors in physics and engineering contexts
USEFUL FOR

Mathematicians, engineers, and physics students interested in linear transformations, particularly those studying the implications of orthogonal matrices on vector operations and cross products.

jostpuur
If A\in O(3) and v,w\in\mathbb{R}^3, then

(Av)\times (Aw) = (\textrm{det}\;A)\, A(v\times w)

is right, right?

Is there any simple way of proving this? It doesn't come as easily as the invariance of the dot product. Isn't this equation also important for understanding pseudovectors?
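
A quick numerical check makes me fairly confident in the formula, though it's not a proof. A minimal numpy sketch (QR factorisation is just one convenient way to generate a random orthogonal matrix, and flipping a column also exercises the det = -1 case):

```python
import numpy as np

rng = np.random.default_rng(0)

# One way to get a random orthogonal matrix: QR-factorise a random matrix.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# Optionally flip a column so that the det(Q) = -1 case is also exercised.
if rng.random() < 0.5:
    Q[:, 0] = -Q[:, 0]

v = rng.standard_normal(3)
w = rng.standard_normal(3)

lhs = np.cross(Q @ v, Q @ w)
rhs = np.linalg.det(Q) * (Q @ np.cross(v, w))

print(round(np.linalg.det(Q)))   # +1 or -1
print(np.allclose(lhs, rhs))     # True
```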
 
(Av)\times (Aw) = (\textrm{det}\;A)\, (A^{-1})^{T}(v\times w)

is an identity, I believe. So for orthogonal A, what you put should be correct, since it is a special case of the above: for A \in O(3) we have (A^{-1})^T = (A^T)^T = A.

I've spent 10 minutes trying to prove it though and I'm not getting anywhere, heh.
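
For what it's worth, a quick numerical check of the more general identity (with a random, generically invertible A rather than an orthogonal one) does agree with it. This is just a numpy sketch, not an argument:

```python
import numpy as np

rng = np.random.default_rng(1)

A = rng.standard_normal((3, 3))   # generically invertible, not orthogonal
v = rng.standard_normal(3)
w = rng.standard_normal(3)

lhs = np.cross(A @ v, A @ w)
rhs = np.linalg.det(A) * (np.linalg.inv(A).T @ np.cross(v, w))

print(np.allclose(lhs, rhs))      # True
```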
 
Very interesting comment. That reminded me of the explicit formula for A^{-1} in terms of the cofactors. I hadn't tried working with them earlier, because I wasn't thinking about inverses.
 
EDIT: I've since realized that this proof only works if A is a normal matrix, i.e.

AA^*=A^*A

or

AA^T=A^TA

if A is real.

This is because I've assumed that an eigenbasis of A exists. Of course this is true for the set of orthogonal matrices you were considering originally, as well as e.g. the symmetric and skew-symmetric matrices, so the proof is still mainly valid.

__________________

Been thinking about this a while, and I think I have a proof.

For a while I tried to prove it by using the spectral decomposition A = MDM^T, but after some fruitless algebra I ended up just resigning myself to choosing a basis. So here goes.

Please note that I study engineering though, not maths, so this may not be totally rigorous, or even correct.

Let's (hopefully without any loss of generality) choose our basis of three-dimensional Euclidean space to be the (normalised) eigenbasis of A, i.e. we're assuming that the vectors v and w can be written as sums of the normalised eigenvectors of A.

With this choice of basis, our matrix A becomes diagonal:

A = \begin{bmatrix} \lambda_1 & 0 & 0 \\ 0 & \lambda_2 & 0 \\ 0 & 0 & \lambda_3 \end{bmatrix}

where the \lambda_i are the eigenvalues of A. Now we are trying to prove:

(Av)\times (Aw) = |A|\, (A^{-1})^{T}(v\times w)

Note that:

|A| = \lambda_1\lambda_2\lambda_3

(A^{-1})^T = A^{-1} = \begin{bmatrix} \frac{1}{\lambda_1} & 0 & 0 \\ 0 & \frac{1}{\lambda_2} & 0 \\ 0 & 0 & \frac{1}{\lambda_3} \end{bmatrix}

Substituting these into the right-hand side, and doing the matrix multiplications on the left-hand side (easy, as A is diagonal), we get:

\begin{bmatrix} \lambda_1 v_1 \\ \lambda_2 v_2 \\ \lambda_3 v_3 \end{bmatrix} \times \begin{bmatrix} \lambda_1 w_1 \\ \lambda_2 w_2 \\ \lambda_3 w_3 \end{bmatrix} = \lambda_1\lambda_2\lambda_3 \begin{bmatrix} \frac{1}{\lambda_1} & 0 & 0 \\ 0 & \frac{1}{\lambda_2} & 0 \\ 0 & 0 & \frac{1}{\lambda_3} \end{bmatrix} \begin{bmatrix} (v\times w)_1 \\ (v\times w)_2 \\ (v\times w)_3 \end{bmatrix}

So:

\begin{bmatrix} \lambda_1 v_1 \\ \lambda_2 v_2 \\ \lambda_3 v_3 \end{bmatrix} \times \begin{bmatrix} \lambda_1 w_1 \\ \lambda_2 w_2 \\ \lambda_3 w_3 \end{bmatrix} = \lambda_1\lambda_2\lambda_3 \begin{bmatrix} \frac{(v\times w)_1}{\lambda_1} \\ \frac{(v\times w)_2}{\lambda_2} \\ \frac{(v\times w)_3}{\lambda_3} \end{bmatrix}

hence:

\begin{bmatrix} \lambda_1 v_1 \\ \lambda_2 v_2 \\ \lambda_3 v_3 \end{bmatrix} \times \begin{bmatrix} \lambda_1 w_1 \\ \lambda_2 w_2 \\ \lambda_3 w_3 \end{bmatrix} = \begin{bmatrix} \lambda_2\lambda_3(v\times w)_1 \\ \lambda_1\lambda_3(v\times w)_2 \\ \lambda_1\lambda_2(v\times w)_3 \end{bmatrix}

NB: I'm using (v \times w)_n to mean the n-th component of the cross product.

Then on performing these cross products we get:

\begin{bmatrix} \lambda_2 \lambda_3 (v_2 w_3 - v_3 w_2) \\ \lambda_1 \lambda_3 (v_3 w_1 - v_1 w_3) \\ \lambda_1 \lambda_2 (v_1 w_2 - v_2 w_1) \end{bmatrix} = \begin{bmatrix} \lambda_2 \lambda_3 (v_2 w_3 - v_3 w_2) \\ \lambda_1 \lambda_3 (v_3 w_1 - v_1 w_3) \\ \lambda_1 \lambda_2 (v_1 w_2 - v_2 w_1) \end{bmatrix}

Which completes the proof. Hopefully it doesn't matter that I've chosen a specific basis? I think the result should still apply generally. :)
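
As a cross-check of the component algebra, here is a small sympy sketch (assuming nonzero eigenvalues so that the diagonal A is invertible) that verifies the identity symbolically for a generic diagonal matrix:

```python
import sympy as sp

# Eigenvalues (assumed nonzero so that A is invertible) and vector components.
l1, l2, l3 = sp.symbols('lambda1 lambda2 lambda3', nonzero=True)
v1, v2, v3, w1, w2, w3 = sp.symbols('v1 v2 v3 w1 w2 w3')

A = sp.diag(l1, l2, l3)
v = sp.Matrix([v1, v2, v3])
w = sp.Matrix([w1, w2, w3])

lhs = (A * v).cross(A * w)
rhs = A.det() * (A.inv().T * v.cross(w))

print(sp.simplify(lhs - rhs))   # Matrix([[0], [0], [0]])
```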
 
I just managed to complete my proof too. You are only a couple of minutes ahead of me. :wink:

I'll start typing now! (Proof coming in my next post)
 
jostpuur said:
I just managed to complete my proof too. You are only a couple of minutes ahead of me. :wink:

I'll start typing now! (Proof coming in my next post)

Haha, brilliant. Can't wait. :-p
 
A key remark is that the inverse of a 3×3 matrix can be written in terms of cross products.

First recall the formula for the inverse matrix in terms of the cofactor matrix C:

A^{-1} = \frac{1}{\textrm{det}(A)} C^T

http://en.wikipedia.org/wiki/Cofactor_(linear_algebra)

With a finite amount of effort, it is possible to write the cofactor matrix C as follows:

C = \big( A_{*2}\times A_{*3},\;A_{*3}\times A_{*1},\; A_{*1}\times A_{*2}\big)

Here the notation

A = \big(A_{*1}, A_{*2}, A_{*3}\big)

A_{*k} = \left(\begin{array}{c} A_{1k} \\ A_{2k} \\ A_{3k} \end{array}\right)

is used, i.e. A_{*k} denotes the k-th column of A.

Then:

(Av)\times (Aw) = \epsilon_{ijk} (Av)_i (Aw)_j e_k = \epsilon_{ijk} (A_{ii'}v_{i'}) (A_{jj'} w_{j'}) e_k = \big(\epsilon_{ijk} A_{ii'} A_{jj'} e_k\big) v_{i'} w_{j'}

= (A_{*i'} \times A_{*j'}) v_{i'} w_{j'} = (A_{*1}\times A_{*2})(v_1w_2 - v_2w_1) \;+\; (A_{*2}\times A_{*3})(v_2w_3 - v_3w_2) \;+\; (A_{*3}\times A_{*1})(v_3w_1 - v_1w_3)

= \big(A_{*2}\times A_{*3},\; A_{*3}\times A_{*1},\; A_{*1}\times A_{*2}\big) \left(\begin{array}{c} (v\times w)_1 \\ (v\times w)_2 \\ (v\times w)_3 \end{array}\right) = C(v\times w) = \textrm{det}(A) (A^{-1})^T (v\times w)
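
A numerical sketch (numpy, just my own cross-check rather than part of the argument) confirms both the column cross-product formula for C and the final identity:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
v = rng.standard_normal(3)
w = rng.standard_normal(3)

# Columns A_{*1}, A_{*2}, A_{*3}.
a1, a2, a3 = A[:, 0], A[:, 1], A[:, 2]

# C built column by column as (A_{*2} x A_{*3}, A_{*3} x A_{*1}, A_{*1} x A_{*2}).
C = np.column_stack([np.cross(a2, a3), np.cross(a3, a1), np.cross(a1, a2)])

print(np.allclose(C, np.linalg.det(A) * np.linalg.inv(A).T))       # True
print(np.allclose(np.cross(A @ v, A @ w), C @ np.cross(v, w)))     # True
```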
 
Hm, yeah. I did try proving it somewhat like that at first and got bogged down in the algebra. I like it more than mine (using the epsilon symbol (tensor?) is inspired, I didn't think of that) because mine has the assumption that you _have_ an eigenbasis, which I'm not sure is entirely justified.

I can't see where your transpose has gone though. :)

EDIT: Ahh I see, well spotted :)
 