# Transformation of cross product

1. Mar 7, 2010

### jostpuur

If $A\in O(3)$ and $v,w\in\mathbb{R}^3$, then

$$(Av)\times (Aw) = (\textrm{det}\;A) A(v\times w)$$

is right, right?

Is there any simple way of proving this? It doesn't come as easily as the invariance of the dot product. Isn't this equation also important for understanding pseudovectors?
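
Not a proof, but here's a quick numerical sanity check in Python (a minimal sketch, assuming numpy; the Q factor of a QR decomposition gives a random orthogonal A):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random orthogonal A: the Q factor of a QR decomposition is orthogonal.
A, _ = np.linalg.qr(rng.standard_normal((3, 3)))

v = rng.standard_normal(3)
w = rng.standard_normal(3)

lhs = np.cross(A @ v, A @ w)
rhs = np.linalg.det(A) * (A @ np.cross(v, w))

print(np.allclose(lhs, rhs))  # True, for det(A) = +1 or -1
```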

2. Mar 16, 2010

### Jmf

$$(Av)\times (Aw) = (\textrm{det}\;A) (A^{-1})^{T}(v\times w)$$

is an identity for any invertible $A$, I believe. For orthogonal $A$ we have $(A^{-1})^T = A$, so what you wrote follows as a special case of the above.

I've spent 10 minutes trying to prove it though and I'm not getting anywhere, heh.
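
The general identity does check out numerically, at least; a sketch assuming $A$ is merely invertible (a generic random matrix is, with probability 1):

```python
import numpy as np

rng = np.random.default_rng(1)

A = rng.standard_normal((3, 3))  # generic, hence invertible
v = rng.standard_normal(3)
w = rng.standard_normal(3)

lhs = np.cross(A @ v, A @ w)
rhs = np.linalg.det(A) * (np.linalg.inv(A).T @ np.cross(v, w))

print(np.allclose(lhs, rhs))  # True
```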

3. Mar 16, 2010

### jostpuur

Very interesting comment. That reminded me of the explicit formula for $A^{-1}$ in terms of cofactors. I hadn't tried working with them earlier, because I wasn't thinking about inverses.

4. Mar 16, 2010

### Jmf

EDIT: I've since realised that this proof only works if A is a normal matrix, i.e.

$$AA^*=A^*A$$

or

$$AA^T=A^TA$$

if A is real.

This is because I've assumed that an orthonormal eigenbasis of A exists. That's guaranteed for normal matrices, which includes the orthogonal matrices you were considering originally, as well as e.g. the symmetric and skew-symmetric matrices, so the proof is still mostly valid (though strictly speaking the eigenbasis of a rotation is complex, so a fully rigorous version needs a little more care).
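
To see why that complex caveat matters: a plain rotation already has no real eigenbasis at all. A small illustration (assuming numpy):

```python
import numpy as np

# Rotation by 90 degrees about the z-axis: orthogonal, hence normal,
# but two of its three eigenvalues are complex (i and -i, plus 1).
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])

print(np.linalg.eigvals(R))  # approximately [1j, -1j, 1], in some order
```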

__________________

Been thinking about this a while, and I think I have a proof.

For a while I tried to prove it by using the spectral decomposition $$A = MDM^T$$, but after some fruitless algebra I ended up just resigning myself to choosing a basis. So here goes.

Please note that I study engineering though, not maths, so this may not be totally rigorous, or even correct:

Let's (hopefully without any loss of generality) choose our basis of 3-dimensional Euclidean space to be the (normalised) eigenbasis of A, i.e. we're writing the vectors v and w as linear combinations of the normalised eigenvectors of A.

With this choice of basis, our matrix A becomes diagonal:

$$A = \begin{bmatrix} \lambda_1 & 0 & 0 \\ 0 & \lambda_2 & 0 \\ 0 & 0 & \lambda_3 \end{bmatrix}$$

where the lambdas are the eigenvalues of A. Now we are trying to prove:

$$(Av)\times (Aw) = |A| (A^{-1})^{T}(v\times w)$$

Note that:

$$|A| = \lambda_1\lambda_2\lambda_3$$

$$(A^{-1})^T = A^{-1} = \begin{bmatrix} 1\over\lambda_1 & 0 & 0 \\ 0 & 1\over\lambda_2 & 0 \\ 0 & 0 & 1\over\lambda_3 \end{bmatrix}$$

Substituting these into the right-hand side, and doing the matrix multiplications on the left-hand side (easy, as A is diagonal), we get:

$$\begin{bmatrix} \lambda_1 v_1 \\ \lambda_2 v_2 \\ \lambda_3 v_3 \end{bmatrix} \times \begin{bmatrix} \lambda_1 w_1 \\ \lambda_2 w_2 \\ \lambda_3 w_3 \end{bmatrix} = \lambda_1\lambda_2\lambda_3 \begin{bmatrix} 1 \over \lambda_1 & 0 & 0 \\ 0& 1 \over \lambda_2 & 0 \\ 0 & 0 & 1 \over \lambda_3 \end{bmatrix} \begin{bmatrix} (v\times w)_1 \\ (v\times w)_2\\ (v\times w)_3 \end{bmatrix}$$

So:

$$\begin{bmatrix} \lambda_1 v_1 \\ \lambda_2 v_2 \\ \lambda_3 v_3 \end{bmatrix} \times \begin{bmatrix} \lambda_1 w_1 \\ \lambda_2 w_2 \\ \lambda_3 w_3 \end{bmatrix} = \lambda_1\lambda_2\lambda_3 \begin{bmatrix} (v\times w)_1 \over \lambda_1 \\ (v\times w)_2 \over \lambda_2\\ (v\times w)_3 \over \lambda_3 \end{bmatrix}$$

hence:

$$\begin{bmatrix} \lambda_1 v_1 \\ \lambda_2 v_2 \\ \lambda_3 v_3 \end{bmatrix} \times \begin{bmatrix} \lambda_1 w_1 \\ \lambda_2 w_2 \\ \lambda_3 w_3 \end{bmatrix} = \begin{bmatrix} \lambda_2\lambda_3(v\times w)_1 \\ \lambda_1\lambda_3(v\times w)_2 \\ \lambda_1\lambda_2(v\times w)_3 \end{bmatrix}$$

NB: I'm using $$(v \times w)_n$$ to mean the $n$-th component of the cross product.

Then, expanding the cross product on the left, we get:

$$\begin{bmatrix} \lambda_2 \lambda_3 (v_2 w_3 - v_3 w_2) \\ \lambda_1 \lambda_3 (v_3 w_1 - v_1 w_3) \\ \lambda_1 \lambda_2 (v_1 w_2 - v_2 w_1) \end{bmatrix} = \begin{bmatrix} \lambda_2 \lambda_3 (v_2 w_3 - v_3 w_2) \\ \lambda_1 \lambda_3 (v_3 w_1 - v_1 w_3) \\ \lambda_1 \lambda_2 (v_1 w_2 - v_2 w_1) \end{bmatrix}$$

The two sides are identical, which completes the proof. Hopefully it doesn't matter that I've chosen a specific basis? I think the result should still hold generally. :)
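
In case anyone wants to check the diagonal case above numerically, here's a sketch (random eigenvalues, hence nonzero with probability 1, so A is invertible):

```python
import numpy as np

rng = np.random.default_rng(2)

lam = rng.standard_normal(3)  # the eigenvalues of A
A = np.diag(lam)
v = rng.standard_normal(3)
w = rng.standard_normal(3)

lhs = np.cross(A @ v, A @ w)
rhs = np.prod(lam) * (np.linalg.inv(A).T @ np.cross(v, w))

print(np.allclose(lhs, rhs))  # True
```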

Last edited: Mar 17, 2010
5. Mar 16, 2010

### jostpuur

I just managed to complete my proof too. You are only a couple of minutes ahead of me.

I'll start typing now! (Proof coming in my next post)

6. Mar 16, 2010

### Jmf

Haha, brilliant. Can't wait.

7. Mar 16, 2010

### jostpuur

The key observation is that the inverse of a 3x3 matrix can be written in terms of cross products.

First recall the cofactor formula for the inverse:

$$A^{-1} = \frac{1}{\textrm{det}(A)} C^T$$

http://en.wikipedia.org/wiki/Cofactor_(linear_algebra)

With a finite amount of effort it is possible to write the cofactor matrix $C$ as follows:

$$C = \big( A_{*2}\times A_{*3},\;A_{*3}\times A_{*1},\; A_{*1}\times A_{*2}\big)$$

Here the notation

$$A = \big(A_{*1}, A_{*2}, A_{*3}\big)$$

$$A_{*k} = \left(\begin{array}{c} A_{1k} \\ A_{2k} \\ A_{3k} \\ \end{array}\right)$$

is used.
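
This cross-product formula for $C$ is easy to check numerically; a minimal sketch, assuming a generic (invertible) $A$:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))

# Cofactor matrix built column by column from cross products of A's
# columns, following the formula above.
C = np.column_stack([np.cross(A[:, 1], A[:, 2]),
                     np.cross(A[:, 2], A[:, 0]),
                     np.cross(A[:, 0], A[:, 1])])

# A^{-1} = C^T / det(A) is equivalent to C = det(A) * (A^{-1})^T.
print(np.allclose(C, np.linalg.det(A) * np.linalg.inv(A).T))  # True
```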

Then:

$$(Av)\times (Aw) = \epsilon_{ijk} (Av)_i (Aw)_j e_k = \epsilon_{ijk} (A_{ii'}v_{i'}) (A_{jj'} w_{j'}) e_k = \big(\epsilon_{ijk} A_{ii'} A_{jj'} e_k\big) v_{i'} w_{j'}$$

$$= (A_{*i'} \times A_{*j'}) v_{i'} w_{j'} =(A_{*1}\times A_{*2})(v_1w_2 - v_2w_1) \;+\; (A_{*2}\times A_{*3})(v_2w_3 - v_3w_2) \;+\; (A_{*3}\times A_{*1})(v_3w_1 - v_1w_3)$$

$$= \big(A_{*2}\times A_{*3},\; A_{*3}\times A_{*1},\; A_{*1}\times A_{*2}\big) \left(\begin{array}{c} (v\times w)_1 \\ (v\times w)_2 \\ (v\times w)_3 \\ \end{array}\right) = C(v\times w) = \textrm{det}(A) (A^{-1})^T (v\times w)$$
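
For what it's worth, the whole chain is essentially equivalent (for invertible $A$) to the standard Levi-Civita determinant identity $\epsilon_{ijk} A_{ii'} A_{jj'} A_{kk'} = \textrm{det}(A)\,\epsilon_{i'j'k'}$, which can also be checked numerically (a sketch, assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3))

# Levi-Civita symbol epsilon_{ijk}: +1 on even permutations of (0,1,2),
# -1 on odd ones, 0 otherwise.
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0
    eps[i, k, j] = -1.0

# epsilon_{ijk} A_{ia} A_{jb} A_{kc} = det(A) * epsilon_{abc}
lhs = np.einsum('ijk,ia,jb,kc->abc', eps, A, A, A)
print(np.allclose(lhs, np.linalg.det(A) * eps))  # True
```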

Last edited: Mar 17, 2010
8. Mar 16, 2010

### Jmf

Hm, yeah. I did try proving it somewhat like that at first and got bogged down in the algebra. I like it more than mine (using the epsilon symbol (the Levi-Civita symbol) is inspired; I didn't think of that), because mine rests on the assumption that you _have_ an eigenbasis, which I'm not sure is entirely justified.

I can't see where your transpose has gone though. :)

EDIT: Ahh I see, well spotted :)

Last edited: Mar 16, 2010