Transformation of cross product

  • Thread starter jostpuur
  • #1
jostpuur

Main Question or Discussion Point

If [itex]A\in O(3)[/itex] and [itex]v,w\in\mathbb{R}^3[/itex], then

[tex]
(Av)\times (Aw) = (\textrm{det}\;A) A(v\times w)
[/tex]

is right, right?

Is there any simple way of proving this? It doesn't come as easily as the invariance of the dot product. Isn't this equation also important for understanding pseudovectors?
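Not a proof, but a quick numerical sanity check of the orthogonal case is easy to run (a minimal sketch in plain Python, standard library only; the rotation angle and test vectors are arbitrary choices):

```python
import math

def cross(u, v):
    # standard cross product in R^3
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def matvec(A, v):
    return [sum(A[i][k]*v[k] for k in range(3)) for i in range(3)]

def det3(A):
    # cofactor expansion along the first row
    return (A[0][0]*(A[1][1]*A[2][2] - A[1][2]*A[2][1])
          - A[0][1]*(A[1][0]*A[2][2] - A[1][2]*A[2][0])
          + A[0][2]*(A[1][0]*A[2][1] - A[1][1]*A[2][0]))

c, s = math.cos(0.7), math.sin(0.7)
R = [[c, -s, 0], [s, c, 0], [0, 0,  1]]   # rotation: det = +1
S = [[c, -s, 0], [s, c, 0], [0, 0, -1]]   # rotation composed with a reflection: det = -1

v, w = [1.0, 2.0, 3.0], [-2.0, 0.5, 1.0]
for A in (R, S):
    lhs = cross(matvec(A, v), matvec(A, w))              # (Av) x (Aw)
    rhs = [det3(A) * x for x in matvec(A, cross(v, w))]  # det(A) * A(v x w)
    assert all(abs(a - b) < 1e-12 for a, b in zip(lhs, rhs))
```

With the reflection S, dropping the det factor makes the check fail, which is exactly the pseudovector behaviour the question alludes to.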
 

Answers and Replies

  • #2
Jmf
[tex]
(Av)\times (Aw) = (\textrm{det}\;A) (A^{-1})^{T}(v\times w)
[/tex]

is an identity, I believe. So for orthogonal A, what you wrote should be correct, since then [itex](A^{-1})^T = A[/itex] and your equation is a special case of the above.

I've spent 10 minutes trying to prove it though and I'm not getting anywhere, heh.
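One way to check this general identity numerically without computing an inverse: multiplying both sides by [itex]A^T[/itex] turns it into [itex]A^T\big((Av)\times(Aw)\big) = \textrm{det}(A)\,(v\times w)[/itex], since [itex]A^T(A^{-1})^T = (A^{-1}A)^T = I[/itex]. A minimal sketch in plain Python (the matrix and vectors are arbitrary choices, with A deliberately not orthogonal):

```python
def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def matvec(A, v):
    return [sum(A[i][k]*v[k] for k in range(3)) for i in range(3)]

def transpose(A):
    return [[A[j][i] for j in range(3)] for i in range(3)]

def det3(A):
    return (A[0][0]*(A[1][1]*A[2][2] - A[1][2]*A[2][1])
          - A[0][1]*(A[1][0]*A[2][2] - A[1][2]*A[2][0])
          + A[0][2]*(A[1][0]*A[2][1] - A[1][1]*A[2][0]))

# arbitrary invertible, non-orthogonal matrix
A = [[2.0, 1.0, 0.0],
     [0.5, 3.0, 1.0],
     [1.0, 0.0, 2.0]]
v, w = [1.0, -1.0, 2.0], [0.0, 3.0, 1.0]

lhs = matvec(transpose(A), cross(matvec(A, v), matvec(A, w)))  # A^T((Av) x (Aw))
rhs = [det3(A) * x for x in cross(v, w)]                       # det(A) (v x w)
assert all(abs(a - b) < 1e-9 for a, b in zip(lhs, rhs))
```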
 
  • #3
jostpuur
Very interesting comment. That reminded me about the explicit formula for [itex]A^{-1}[/itex] in terms of the cofactors. I didn't try to work with them earlier, because I wasn't thinking about inverses.
 
  • #4
Jmf
EDIT: I've since realised that this proof only works if A is a normal matrix, i.e.

[tex]AA^*=A^*A[/tex]

or

[tex]AA^T=A^TA[/tex]

if A is real, since I've assumed that an eigenbasis of A exists. Of course this holds for the orthogonal matrices you were considering originally, as well as e.g. the symmetric and skew-symmetric matrices, so the proof is still mostly valid. (Strictly, a real normal matrix such as a generic rotation can have complex eigenvalues, so the diagonalisation below should really be read over the complex numbers.)

__________________

Been thinking about this for a while, and I think I have a proof.

For a while I tried to prove it using the spectral decomposition [tex]A = MDM^T[/tex], but after some fruitless algebra I ended up just resigning myself to choosing a basis. So here goes.

Please note that I study engineering, not maths, so this may not be totally rigorous, or even correct:


Let's (hopefully without any loss of generality) choose our basis of 3-dimensional Euclidean space to be the (normalised) eigenbasis of A, i.e. we're assuming that the vectors v and w can be written as sums of the normalised eigenvectors of A.

With this choice of basis, our matrix A becomes diagonal:

[tex]A = \begin{bmatrix} \lambda_1 & 0 & 0 \\ 0 & \lambda_2 & 0 \\ 0 & 0 & \lambda_3 \end{bmatrix}[/tex]

where the lambdas are the eigenvalues of A. Now we are trying to prove:

[tex]
(Av)\times (Aw) = |A| (A^{-1})^{T}(v\times w)
[/tex]

Note that:

[tex] |A| = \lambda_1\lambda_2\lambda_3 [/tex]

[tex] (A^{-1})^T = A^{-1} = \begin{bmatrix} \frac{1}{\lambda_1} & 0 & 0 \\ 0 & \frac{1}{\lambda_2} & 0 \\ 0 & 0 & \frac{1}{\lambda_3} \end{bmatrix}[/tex]

Substituting these into the right-hand side, and doing the matrix multiplications on the left (easy, as A is diagonal), we get:

[tex]
\begin{bmatrix} \lambda_1 v_1 \\ \lambda_2 v_2 \\ \lambda_3 v_3 \end{bmatrix} \times \begin{bmatrix} \lambda_1 w_1 \\ \lambda_2 w_2 \\ \lambda_3 w_3 \end{bmatrix} = \lambda_1\lambda_2\lambda_3 \begin{bmatrix} \frac{1}{\lambda_1} & 0 & 0 \\ 0 & \frac{1}{\lambda_2} & 0 \\ 0 & 0 & \frac{1}{\lambda_3} \end{bmatrix} \begin{bmatrix} (v\times w)_1 \\ (v\times w)_2 \\ (v\times w)_3 \end{bmatrix}
[/tex]

So:

[tex]
\begin{bmatrix} \lambda_1 v_1 \\ \lambda_2 v_2 \\ \lambda_3 v_3 \end{bmatrix} \times \begin{bmatrix} \lambda_1 w_1 \\ \lambda_2 w_2 \\ \lambda_3 w_3 \end{bmatrix} = \lambda_1\lambda_2\lambda_3 \begin{bmatrix} \frac{(v\times w)_1}{\lambda_1} \\ \frac{(v\times w)_2}{\lambda_2} \\ \frac{(v\times w)_3}{\lambda_3} \end{bmatrix}
[/tex]

hence:

[tex]
\begin{bmatrix} \lambda_1 v_1 \\ \lambda_2 v_2 \\ \lambda_3 v_3 \end{bmatrix} \times \begin{bmatrix} \lambda_1 w_1 \\ \lambda_2 w_2 \\ \lambda_3 w_3 \end{bmatrix} = \begin{bmatrix} \lambda_2\lambda_3(v\times w)_1 \\ \lambda_1\lambda_3(v\times w)_2 \\ \lambda_1\lambda_2(v\times w)_3 \end{bmatrix}
[/tex]

NB: I'm using [tex](v \times w)_n[/tex] to mean the n-th component of the cross product.

Expanding the cross products on both sides, we get:

[tex]\begin{bmatrix} \lambda_2 \lambda_3 (v_2 w_3 - v_3 w_2) \\ \lambda_1 \lambda_3 (v_3 w_1 - v_1 w_3) \\ \lambda_1 \lambda_2 (v_1 w_2 - v_2 w_1) \end{bmatrix} = \begin{bmatrix} \lambda_2 \lambda_3 (v_2 w_3 - v_3 w_2) \\ \lambda_1 \lambda_3 (v_3 w_1 - v_1 w_3) \\ \lambda_1 \lambda_2 (v_1 w_2 - v_2 w_1) \end{bmatrix}[/tex]

which completes the proof, since both sides agree component by component. Hopefully it doesn't matter that I've chosen a specific basis? I think the result should still hold generally. :)
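In this eigenbasis the claim reduces to a statement about a diagonal matrix, which is easy to check numerically (a minimal sketch in plain Python; the eigenvalues and test vectors below are arbitrary nonzero choices):

```python
def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

lam = (2.0, 3.0, 5.0)                     # eigenvalues of the diagonal A
v, w = [1.0, 2.0, 3.0], [4.0, -1.0, 0.5]
det = lam[0] * lam[1] * lam[2]            # det(A) = product of eigenvalues

# (Av) x (Aw), with A diagonal so Av is just componentwise scaling
lhs = cross([lam[i] * v[i] for i in range(3)],
            [lam[i] * w[i] for i in range(3)])

# det(A) * A^{-1} (v x w), with A^{-1} = diag(1/lam_i)
cvw = cross(v, w)
rhs = [det * cvw[i] / lam[i] for i in range(3)]

assert all(abs(a - b) < 1e-9 for a, b in zip(lhs, rhs))
```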
 
  • #5
jostpuur
I just managed to complete my proof too. You are only a couple of minutes ahead of me. :wink:

I'll start typing now! (Proof coming in my next post)
 
  • #6
Jmf
jostpuur said:
"I just managed to complete my proof too. You are only a couple of minutes ahead of me. :wink:

I'll start typing now! (Proof coming in my next post)"
Haha, brilliant. Can't wait. :tongue:
 
  • #7
jostpuur
A key observation is that the inverse of a 3×3 matrix can be written in terms of cross products.

First check the formula for an inverse matrix:

[tex]
A^{-1} = \frac{1}{\textrm{det}(A)} C^T
[/tex]

http://en.wikipedia.org/wiki/Cofactor_(linear_algebra)

With a finite amount of effort it is possible to write the cofactor matrix [itex]C[/itex] as follows:

[tex]
C = \big( A_{*2}\times A_{*3},\;A_{*3}\times A_{*1},\; A_{*1}\times A_{*2}\big)
[/tex]

Here the notation

[tex]
A = \big(A_{*1}, A_{*2}, A_{*3}\big)
[/tex]

[tex]
A_{*k} = \left(\begin{array}{c}
A_{1k} \\ A_{2k} \\ A_{3k} \\
\end{array}\right)
[/tex]

is used.

Then:

[tex]
(Av)\times (Aw) = \epsilon_{ijk} (Av)_i (Aw)_j e_k = \epsilon_{ijk} (A_{ii'}v_{i'}) (A_{jj'} w_{j'}) e_k
= \big(\epsilon_{ijk} A_{ii'} A_{jj'} e_k\big) v_{i'} w_{j'}
[/tex]

[tex]
= (A_{*i'} \times A_{*j'}) v_{i'} w_{j'} =(A_{*1}\times A_{*2})(v_1w_2 - v_2w_1) \;+\; (A_{*2}\times A_{*3})(v_2w_3 - v_3w_2) \;+\; (A_{*3}\times A_{*1})(v_3w_1 - v_1w_3)
[/tex]

[tex]
= \big(A_{*2}\times A_{*3},\; A_{*3}\times A_{*1},\; A_{*1}\times A_{*2}\big)
\left(\begin{array}{c}
(v\times w)_1 \\ (v\times w)_2 \\ (v\times w)_3 \\
\end{array}\right) = C(v\times w) = \textrm{det}(A) (A^{-1})^T (v\times w)
[/tex]
 
  • #8
Jmf
Hm, yeah. I did try proving it somewhat like that at first and got bogged down in the algebra. I like it more than mine (using the epsilon symbol, i.e. the Levi-Civita symbol, is inspired; I didn't think of that), because mine assumes that you _have_ an eigenbasis, which I'm not sure is entirely justified.

I can't see where your transpose has gone though. :)

EDIT: Ahh I see, well spotted :)
 
