How Can We Prove the Transformation of Cross Product Under Orthogonal Matrices?


Discussion Overview

The discussion revolves around proving the transformation of the cross product under orthogonal matrices, specifically examining the equation (Av) × (Aw) = (det A) A(v × w) for vectors v and w in R³. Participants explore various approaches to establish this relationship, considering both theoretical and mathematical aspects.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant proposes the equation (Av) × (Aw) = (det A) A(v × w) and seeks a simple proof, noting its importance for understanding pseudovectors.
  • Another participant suggests an alternative identity involving the inverse of A, stating that for orthogonal A, the original equation holds as a special case.
  • A later reply mentions the explicit formula for the inverse of a matrix in terms of cofactors, indicating a shift in focus to the properties of normal matrices.
  • One participant reflects on the necessity of assuming an eigenbasis for A and discusses the implications of this assumption on the validity of their proof.
  • Another participant shares their proof, which involves choosing a specific basis and diagonalizing the matrix A, while expressing uncertainty about the general applicability of their approach.
  • A participant highlights the relationship between the inverse of a 3x3 matrix and cross products, providing a detailed derivation that connects these concepts.
  • Some participants express admiration for each other's methods, indicating a collaborative atmosphere despite differing approaches.
  • Concerns are raised about the assumptions made regarding the existence of an eigenbasis and the rigor of the proofs presented.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the proof or the assumptions required for the transformation of the cross product under orthogonal matrices. Multiple competing views and methods are presented, with some participants refining their arguments in response to others.

Contextual Notes

Limitations include the dependence on the assumption of an eigenbasis for A and the potential lack of rigor in some proofs, as noted by participants. The discussion also highlights the complexity of proving the transformation without simplifying assumptions.

jostpuur
If A\in O(3) and v,w\in\mathbb{R}^3, then

(Av)\times (Aw) = (\det A)\, A(v\times w)

is right, right?

Is there any simple way of proving this? It doesn't follow as easily as the invariance of the dot product. Isn't this equation also important for understanding pseudovectors?
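Not a proof, but a quick numerical sanity check can build confidence first. The sketch below (my own Python, not from the thread; `cross`, `matvec`, and `det` are hand-rolled helpers so no libraries are needed) tests the claimed identity for a rotation (det A = +1) and a reflection (det A = -1):

```python
import math

# Hand-rolled 3-vector / 3x3 helpers.
def cross(v, w):
    return [v[1]*w[2] - v[2]*w[1],
            v[2]*w[0] - v[0]*w[2],
            v[0]*w[1] - v[1]*w[0]]

def matvec(A, v):
    return [sum(A[i][j]*v[j] for j in range(3)) for i in range(3)]

def det(A):
    return (A[0][0]*(A[1][1]*A[2][2] - A[1][2]*A[2][1])
          - A[0][1]*(A[1][0]*A[2][2] - A[1][2]*A[2][0])
          + A[0][2]*(A[1][0]*A[2][1] - A[1][1]*A[2][0]))

c, s = math.cos(0.7), math.sin(0.7)
R = [[c, -s, 0], [s, c, 0], [0, 0, 1]]   # rotation about z: orthogonal, det = +1
F = [[1, 0, 0], [0, 1, 0], [0, 0, -1]]   # reflection in the xy-plane: det = -1
v, w = [1.0, 2.0, 3.0], [-2.0, 0.5, 1.0]

for A in (R, F):
    lhs = cross(matvec(A, v), matvec(A, w))
    rhs = [det(A)*x for x in matvec(A, cross(v, w))]
    assert all(abs(a - b) < 1e-12 for a, b in zip(lhs, rhs))
```

The reflection case is the interesting one: without the det A factor the two sides differ by a sign, which is exactly the pseudovector behaviour mentioned above.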
 
(Av)\times (Aw) = (\det A)\, (A^{-1})^{T}(v\times w)

is an identity, I believe. So for orthogonal A, what you wrote should be correct, since it is a special case of the above.

I've spent 10 minutes trying to prove it though and I'm not getting anywhere, heh.
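This more general identity can also be checked numerically before attempting a proof. In the sketch below (my own Python; the cyclic-index cofactor formula and all helper names are my assumptions, not from the thread), (A^{-1})^T is built as C / det A, with C the cofactor matrix, and the identity is tested on a deliberately non-orthogonal A:

```python
def cross(v, w):
    return [v[1]*w[2] - v[2]*w[1],
            v[2]*w[0] - v[0]*w[2],
            v[0]*w[1] - v[1]*w[0]]

def matvec(A, v):
    return [sum(A[i][j]*v[j] for j in range(3)) for i in range(3)]

def det(A):
    return (A[0][0]*(A[1][1]*A[2][2] - A[1][2]*A[2][1])
          - A[0][1]*(A[1][0]*A[2][2] - A[1][2]*A[2][0])
          + A[0][2]*(A[1][0]*A[2][1] - A[1][1]*A[2][0]))

def cofactor(A):
    # Signed cofactors via cyclic indices (mod 3); the sign comes out automatically:
    # C[i][j] = A[i+1][j+1]*A[i+2][j+2] - A[i+1][j+2]*A[i+2][j+1]
    return [[A[(i+1) % 3][(j+1) % 3]*A[(i+2) % 3][(j+2) % 3]
           - A[(i+1) % 3][(j+2) % 3]*A[(i+2) % 3][(j+1) % 3]
             for j in range(3)] for i in range(3)]

A = [[2.0, 1.0, 0.0], [0.0, 1.0, 3.0], [1.0, 0.0, 1.0]]   # invertible, not orthogonal
v, w = [1.0, 2.0, 3.0], [-2.0, 0.5, 1.0]

d = det(A)
C = cofactor(A)
inv_T = [[C[i][j]/d for j in range(3)] for i in range(3)]  # (A^{-1})^T = C / det(A)

lhs = cross(matvec(A, v), matvec(A, w))
rhs = [d*x for x in matvec(inv_T, cross(v, w))]
assert all(abs(a - b) < 1e-12 for a, b in zip(lhs, rhs))
```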
 
Very interesting comment. That reminded me of the explicit formula for A^{-1} in terms of the cofactors. I hadn't tried working with them earlier, because I wasn't thinking about inverses.
 
EDIT: I've since realized that this proof only works if A is a normal matrix, i.e.

AA^* = A^*A

or, if A is real,

AA^T = A^TA,

since I've assumed that an eigenbasis of A exists. Of course this is true for the set of orthogonal matrices you were considering originally, as well as e.g. the symmetric and skew-symmetric matrices, so the proof is still largely valid.

__________________

Been thinking about this a while, and I think I have a proof.

For a while I tried to prove it by using the spectral decomposition A = MDM^T, but after some fruitless algebra I ended up just resigning myself to choosing a basis. So here goes.

Please note that I study engineering, not maths, so this may not be totally rigorous, or even correct.

Let's (hopefully without any loss of generality) choose our basis of three-dimensional Euclidean space to be the (normalised) eigenbasis of A, i.e. we're assuming that the vectors v and w can be written as sums of the normalised eigenvectors of A.

With this choice of basis, our matrix A becomes diagonal:

A = \begin{bmatrix} \lambda_1 & 0 & 0 \\ 0 & \lambda_2 & 0 \\ 0 & 0 & \lambda_3 \end{bmatrix}

where the lambdas are the eigenvalues of A. Now we are trying to prove:

(Av)\times (Aw) = |A|\, (A^{-1})^{T}(v\times w)

Note that:

|A| = \lambda_1\lambda_2\lambda_3

(A^{-1})^T = A^{-1} = \begin{bmatrix} \frac{1}{\lambda_1} & 0 & 0 \\ 0 & \frac{1}{\lambda_2} & 0 \\ 0 & 0 & \frac{1}{\lambda_3} \end{bmatrix}

Substituting these into the right-hand side, and doing the matrix multiplications on the left (easy, since A is diagonal), we get:

\begin{bmatrix} \lambda_1 v_1 \\ \lambda_2 v_2 \\ \lambda_3 v_3 \end{bmatrix} \times \begin{bmatrix} \lambda_1 w_1 \\ \lambda_2 w_2 \\ \lambda_3 w_3 \end{bmatrix} = \lambda_1\lambda_2\lambda_3 \begin{bmatrix} \frac{1}{\lambda_1} & 0 & 0 \\ 0 & \frac{1}{\lambda_2} & 0 \\ 0 & 0 & \frac{1}{\lambda_3} \end{bmatrix} \begin{bmatrix} (v\times w)_1 \\ (v\times w)_2 \\ (v\times w)_3 \end{bmatrix}

So:

\begin{bmatrix} \lambda_1 v_1 \\ \lambda_2 v_2 \\ \lambda_3 v_3 \end{bmatrix} \times \begin{bmatrix} \lambda_1 w_1 \\ \lambda_2 w_2 \\ \lambda_3 w_3 \end{bmatrix} = \lambda_1\lambda_2\lambda_3 \begin{bmatrix} \frac{(v\times w)_1}{\lambda_1} \\ \frac{(v\times w)_2}{\lambda_2} \\ \frac{(v\times w)_3}{\lambda_3} \end{bmatrix}

hence:

\begin{bmatrix} \lambda_1 v_1 \\ \lambda_2 v_2 \\ \lambda_3 v_3 \end{bmatrix} \times \begin{bmatrix} \lambda_1 w_1 \\ \lambda_2 w_2 \\ \lambda_3 w_3 \end{bmatrix} = \begin{bmatrix} \lambda_2\lambda_3(v\times w)_1 \\ \lambda_1\lambda_3(v\times w)_2 \\ \lambda_1\lambda_2(v\times w)_3 \end{bmatrix}

NB: I'm using (v \times w)_n to mean the nth component of the cross product.

Then on performing these cross products we get:

\begin{bmatrix} \lambda_2 \lambda_3 (v_2 w_3 - v_3 w_2) \\ \lambda_1 \lambda_3 (v_3 w_1 - v_1 w_3) \\ \lambda_1 \lambda_2 (v_1 w_2 - v_2 w_1) \end{bmatrix} = \begin{bmatrix} \lambda_2 \lambda_3 (v_2 w_3 - v_3 w_2) \\ \lambda_1 \lambda_3 (v_3 w_1 - v_1 w_3) \\ \lambda_1 \lambda_2 (v_1 w_2 - v_2 w_1) \end{bmatrix}

Which completes the proof. Hopefully it doesn't matter that I've chosen a specific basis? I think the result should still apply generally. :)
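The diagonal-case computation above is easy to verify numerically too. A minimal sketch (my own Python, not from the thread; the eigenvalues in `lam` are arbitrary sample values):

```python
def cross(v, w):
    return [v[1]*w[2] - v[2]*w[1],
            v[2]*w[0] - v[0]*w[2],
            v[0]*w[1] - v[1]*w[0]]

lam = [2.0, 3.0, 5.0]                        # sample eigenvalues lambda_1..lambda_3
v, w = [1.0, -1.0, 2.0], [0.5, 4.0, -3.0]

# Left side: (Av) x (Aw) with A = diag(lam), i.e. cross product of the scaled vectors
lhs = cross([lam[i]*v[i] for i in range(3)],
            [lam[i]*w[i] for i in range(3)])

# Right side: each component of v x w picks up the product of the *other* two eigenvalues
cvw = cross(v, w)
rhs = [lam[1]*lam[2]*cvw[0], lam[0]*lam[2]*cvw[1], lam[0]*lam[1]*cvw[2]]
assert all(abs(a - b) < 1e-12 for a, b in zip(lhs, rhs))
```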
 
I just managed to complete my proof too. You are only couple of minutes ahead of me. :wink:

I'll start typing now! (Proof coming in my next post)
 
jostpuur said:
I just managed to complete my proof too. You are only couple of minutes ahead of me. :wink:

I'll start typing now! (Proof coming in my next post)

Haha, brilliant. Can't wait. :-p
 
A big remark is that the inverse matrix of a 3x3 matrix can be written in terms of cross products.

First check the formula for an inverse matrix:

A^{-1} = \frac{1}{\det(A)} C^T

http://en.wikipedia.org/wiki/Cofactor_(linear_algebra)

With a finite amount of effort, it is possible to write the cofactor matrix C as follows:

C = \big( A_{*2}\times A_{*3},\; A_{*3}\times A_{*1},\; A_{*1}\times A_{*2}\big)

Here the notation

A = \big(A_{*1}, A_{*2}, A_{*3}\big)

A_{*k} = \left(\begin{array}{c} A_{1k} \\ A_{2k} \\ A_{3k} \end{array}\right)

is used.

Then:

(Av)\times (Aw) = \epsilon_{ijk} (Av)_i (Aw)_j e_k = \epsilon_{ijk} (A_{ii'}v_{i'}) (A_{jj'} w_{j'}) e_k = \big(\epsilon_{ijk} A_{ii'} A_{jj'} e_k\big) v_{i'} w_{j'}

= (A_{*i'} \times A_{*j'}) v_{i'} w_{j'} = (A_{*1}\times A_{*2})(v_1w_2 - v_2w_1) \;+\; (A_{*2}\times A_{*3})(v_2w_3 - v_3w_2) \;+\; (A_{*3}\times A_{*1})(v_3w_1 - v_1w_3)

= \big(A_{*2}\times A_{*3},\; A_{*3}\times A_{*1},\; A_{*1}\times A_{*2}\big) \left(\begin{array}{c} (v\times w)_1 \\ (v\times w)_2 \\ (v\times w)_3 \end{array}\right) = C(v\times w) = \det(A)\, (A^{-1})^T (v\times w)
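The cofactor/cross-product relationship used in this derivation can itself be spot-checked. A sketch (my own Python; the example matrix and helper names are mine) that builds C column by column from cross products of the columns of A and confirms A C^T = det(A) I:

```python
def cross(v, w):
    return [v[1]*w[2] - v[2]*w[1],
            v[2]*w[0] - v[0]*w[2],
            v[0]*w[1] - v[1]*w[0]]

def det(A):
    return (A[0][0]*(A[1][1]*A[2][2] - A[1][2]*A[2][1])
          - A[0][1]*(A[1][0]*A[2][2] - A[1][2]*A[2][0])
          + A[0][2]*(A[1][0]*A[2][1] - A[1][1]*A[2][0]))

A = [[2.0, 1.0, 0.0], [0.0, 1.0, 3.0], [1.0, 0.0, 1.0]]   # arbitrary invertible example
col = [[A[i][k] for i in range(3)] for k in range(3)]      # columns A_{*1}, A_{*2}, A_{*3}

# Cofactor matrix assembled column by column:
# C = ( A_{*2} x A_{*3},  A_{*3} x A_{*1},  A_{*1} x A_{*2} )
Ccols = [cross(col[1], col[2]), cross(col[2], col[0]), cross(col[0], col[1])]
C = [[Ccols[k][i] for k in range(3)] for i in range(3)]

# A^{-1} = C^T / det(A)  is equivalent to  A C^T = det(A) I
d = det(A)
prod = [[sum(A[i][k]*C[j][k] for k in range(3)) for j in range(3)] for i in range(3)]
assert all(abs(prod[i][j] - (d if i == j else 0.0)) < 1e-12
           for i in range(3) for j in range(3))
```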
 
Hm, yeah. I did try proving it somewhat like that at first and got bogged down in the algebra. I like it more than mine (using the epsilon symbol (tensor?) is inspired, I didn't think of that) because mine has the assumption that you _have_ an eigenbasis, which I'm not sure is entirely justified.

I can't see where your transpose has gone though. :)

EDIT: Ahh I see, well spotted :)
 