
Transformation of cross product

  1. Mar 7, 2010 #1
    If [itex]A\in O(3)[/itex] and [itex]v,w\in\mathbb{R}^3[/itex], then

    [tex]
    (Av)\times (Aw) = (\textrm{det}\;A) A(v\times w)
    [/tex]

    is right, right?

    Is there any simple way of proving this? It doesn't come as easily as the invariance of the dot product. Isn't this equation also important for understanding pseudovectors?
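    Before proving it, the identity is easy to check numerically. A minimal sketch in Python/NumPy (the QR trick is just one convenient way to generate a random orthogonal matrix; seed and vectors are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# The Q factor of a QR decomposition of a random matrix is orthogonal.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

v = rng.standard_normal(3)
w = rng.standard_normal(3)

lhs = np.cross(Q @ v, Q @ w)
rhs = np.linalg.det(Q) * (Q @ np.cross(v, w))

print(np.allclose(lhs, rhs))  # True
```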
     
  3. Mar 16, 2010 #2

    Jmf


    [tex]
    (Av)\times (Aw) = (\textrm{det}\;A) (A^{-1})^{T}(v\times w)
    [/tex]

    is an identity, I believe. So for orthogonal A, what you put should be correct, since then [itex](A^{-1})^T = A[/itex] and your equation is a special case of the above.

    I've spent 10 minutes trying to prove it though and I'm not getting anywhere, heh.
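    It does check out numerically for generic invertible matrices, at least. A NumPy sketch (the random seed and matrix are arbitrary; a random real matrix is invertible almost surely):

```python
import numpy as np

rng = np.random.default_rng(1)

# An arbitrary 3x3 matrix and two arbitrary vectors.
A = rng.standard_normal((3, 3))
v = rng.standard_normal(3)
w = rng.standard_normal(3)

lhs = np.cross(A @ v, A @ w)
rhs = np.linalg.det(A) * (np.linalg.inv(A).T @ np.cross(v, w))

print(np.allclose(lhs, rhs))  # True
```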
     
  4. Mar 16, 2010 #3
    Very interesting comment. That reminded me about the explicit formula for [itex]A^{-1}[/itex] in terms of the cofactors. I didn't try to work with them earlier, because I wasn't thinking about inverses.
     
  5. Mar 16, 2010 #4

    Jmf


    EDIT: I've since realised that this proof only works if A is a normal matrix, i.e.

    [tex]AA^*=A^*A[/tex]

    or

    [tex]AA^T=A^TA[/tex]

    if A is real.

    since I've assumed that an eigenbasis of A exists. Of course this holds for the orthogonal matrices you were considering originally, as well as e.g. symmetric and skew-symmetric matrices, so the proof is still mostly valid.

    __________________

    Been thinking about this a while, and I think I have a proof.

    For a while I tried to prove it by using the spectral decomposition [tex]A = MDM^T[/tex], but after some fruitless algebra I ended up just resigning myself to choosing a basis. So here goes.

    Please note that I study engineering though, not maths, so this may not be totally rigorous, or even correct:


    Let's (hopefully without any loss of generality) choose our basis of three-dimensional Euclidean space to be the (normalised) eigenbasis of A, i.e. we're assuming that the vectors v and w can be written as sums of the normalised eigenvectors of A.

    With this choice of basis, our matrix A becomes diagonal:

    [tex]A = \begin{bmatrix} \lambda_1 & 0 & 0 \\ 0 & \lambda_2 & 0 \\ 0 & 0 & \lambda_3 \end{bmatrix}[/tex]

    where the lambdas are the eigenvalues of A. Now we are trying to prove:

    [tex]
    (Av)\times (Aw) = |A| (A^{-1})^{T}(v\times w)
    [/tex]

    Note that:

    [tex] |A| = \lambda_1\lambda_2\lambda_3 [/tex]

    [tex] (A^{-1})^T = A^{-1} = \begin{bmatrix} \frac{1}{\lambda_1} & 0 & 0 \\ 0 & \frac{1}{\lambda_2} & 0 \\ 0 & 0 & \frac{1}{\lambda_3} \end{bmatrix}[/tex]

    Substituting these into the right-hand side, and doing the matrix multiplications on the left-hand side (easy since A is diagonal), we get:

    [tex]
    \begin{bmatrix} \lambda_1 v_1 \\ \lambda_2 v_2 \\ \lambda_3 v_3 \end{bmatrix} \times \begin{bmatrix} \lambda_1 w_1 \\ \lambda_2 w_2 \\ \lambda_3 w_3 \end{bmatrix} = \lambda_1\lambda_2\lambda_3 \begin{bmatrix} \frac{1}{\lambda_1} & 0 & 0 \\ 0 & \frac{1}{\lambda_2} & 0 \\ 0 & 0 & \frac{1}{\lambda_3} \end{bmatrix} \begin{bmatrix} (v\times w)_1 \\ (v\times w)_2\\ (v\times w)_3 \end{bmatrix}
    [/tex]

    So:

    [tex]
    \begin{bmatrix} \lambda_1 v_1 \\ \lambda_2 v_2 \\ \lambda_3 v_3 \end{bmatrix} \times \begin{bmatrix} \lambda_1 w_1 \\ \lambda_2 w_2 \\ \lambda_3 w_3 \end{bmatrix} = \lambda_1\lambda_2\lambda_3 \begin{bmatrix} \frac{(v\times w)_1}{\lambda_1} \\ \frac{(v\times w)_2}{\lambda_2} \\ \frac{(v\times w)_3}{\lambda_3} \end{bmatrix}
    [/tex]

    hence:

    [tex]
    \begin{bmatrix} \lambda_1 v_1 \\ \lambda_2 v_2 \\ \lambda_3 v_3 \end{bmatrix} \times \begin{bmatrix} \lambda_1 w_1 \\ \lambda_2 w_2 \\ \lambda_3 w_3 \end{bmatrix} = \begin{bmatrix} \lambda_2\lambda_3(v\times w)_1 \\ \lambda_1\lambda_3(v\times w)_2 \\ \lambda_1\lambda_2(v\times w)_3 \end{bmatrix}
    [/tex]

    NB: I'm using [itex](v \times w)_n[/itex] to mean the n-th component of the cross product.

    Then on performing these cross products we get:

    [tex]\begin{bmatrix} \lambda_2 \lambda_3 (v_2 w_3 - v_3 w_2) \\ \lambda_1 \lambda_3 (v_3 w_1 - v_1 w_3) \\ \lambda_1 \lambda_2 (v_1 w_2 - v_2 w_1) \end{bmatrix} = \begin{bmatrix} \lambda_2 \lambda_3 (v_2 w_3 - v_3 w_2) \\ \lambda_1 \lambda_3 (v_3 w_1 - v_1 w_3) \\ \lambda_1 \lambda_2 (v_1 w_2 - v_2 w_1) \end{bmatrix}[/tex]

    Which completes the proof. Hopefully it doesn't matter that I've chosen a specific basis? I think the result should still apply generally. :)
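    The componentwise cancellation in the last step can also be checked numerically for a diagonal A (a sketch with arbitrarily chosen nonzero eigenvalues, since A has to be invertible here):

```python
import numpy as np

# Arbitrarily chosen nonzero eigenvalues.
lam = np.array([2.0, -3.0, 0.5])
A = np.diag(lam)

v = np.array([1.0, 2.0, 3.0])
w = np.array([-1.0, 0.5, 4.0])

lhs = np.cross(A @ v, A @ w)
# The right-hand side after cancelling each eigenvalue, as in the last step:
coeffs = np.array([lam[1] * lam[2], lam[0] * lam[2], lam[0] * lam[1]])
rhs = coeffs * np.cross(v, w)

print(np.allclose(lhs, rhs))  # True
```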
     
    Last edited: Mar 17, 2010
  6. Mar 16, 2010 #5
    I just managed to complete my proof too. You are only a couple of minutes ahead of me. :wink:

    I'll start typing now! (Proof coming in my next post)
     
  7. Mar 16, 2010 #6

    Jmf


    Haha, brilliant. Can't wait. :tongue:
     
  8. Mar 16, 2010 #7
    A key remark is that the inverse of a 3x3 matrix can be written in terms of cross products.

    First check the formula for an inverse matrix:

    [tex]
    A^{-1} = \frac{1}{\textrm{det}(A)} C^T
    [/tex]

    http://en.wikipedia.org/wiki/Cofactor_(linear_algebra)

    With a finite amount of effort, it is possible to write the cofactor matrix [itex]C[/itex] as follows:

    [tex]
    C = \big( A_{*2}\times A_{*3},\;A_{*3}\times A_{*1},\; A_{*1}\times A_{*2}\big)
    [/tex]

    Here the notation

    [tex]
    A = \big(A_{*1}, A_{*2}, A_{*3}\big)
    [/tex]

    [tex]
    A_{*k} = \left(\begin{array}{c}
    A_{1k} \\ A_{2k} \\ A_{3k} \\
    \end{array}\right)
    [/tex]

    is used.
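    This cross-product formula for the cofactor matrix is easy to verify numerically against the standard inverse formula (a NumPy sketch; the random matrix is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))

# Columns A_{*1}, A_{*2}, A_{*3}
a1, a2, a3 = A[:, 0], A[:, 1], A[:, 2]

# Cofactor matrix built column by column from cross products:
C = np.column_stack([np.cross(a2, a3), np.cross(a3, a1), np.cross(a1, a2)])

# Compare with A^{-1} = C^T / det(A), i.e. C^T = det(A) * A^{-1}:
print(np.allclose(C.T, np.linalg.det(A) * np.linalg.inv(A)))  # True
```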

    Then:

    [tex]
    (Av)\times (Aw) = \epsilon_{ijk} (Av)_i (Aw)_j e_k = \epsilon_{ijk} (A_{ii'}v_{i'}) (A_{jj'} w_{j'}) e_k
    = \big(\epsilon_{ijk} A_{ii'} A_{jj'} e_k\big) v_{i'} w_{j'}
    [/tex]

    [tex]
    = (A_{*i'} \times A_{*j'}) v_{i'} w_{j'} =(A_{*1}\times A_{*2})(v_1w_2 - v_2w_1) \;+\; (A_{*2}\times A_{*3})(v_2w_3 - v_3w_2) \;+\; (A_{*3}\times A_{*1})(v_3w_1 - v_1w_3)
    [/tex]

    [tex]
    = \big(A_{*2}\times A_{*3},\; A_{*3}\times A_{*1},\; A_{*1}\times A_{*2}\big)
    \left(\begin{array}{c}
    (v\times w)_1 \\ (v\times w)_2 \\ (v\times w)_3 \\
    \end{array}\right) = C(v\times w) = \textrm{det}(A) (A^{-1})^T (v\times w)
    [/tex]
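    The whole chain can be checked numerically in one go (a NumPy sketch; the random A, v, w are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
v = rng.standard_normal(3)
w = rng.standard_normal(3)

# Cofactor matrix from column cross products, as above.
a1, a2, a3 = A[:, 0], A[:, 1], A[:, 2]
C = np.column_stack([np.cross(a2, a3), np.cross(a3, a1), np.cross(a1, a2)])

lhs = np.cross(A @ v, A @ w)
print(np.allclose(lhs, C @ np.cross(v, w)))  # True
print(np.allclose(lhs, np.linalg.det(A) * (np.linalg.inv(A).T @ np.cross(v, w))))  # True
```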
     
    Last edited: Mar 17, 2010
  9. Mar 16, 2010 #8

    Jmf


    Hm, yeah. I did try proving it somewhat like that at first and got bogged down in the algebra. I like it more than mine (using the epsilon (Levi-Civita) symbol is inspired; I didn't think of that), because mine assumes that you _have_ an eigenbasis, which I'm not sure is entirely justified.

    I can't see where your transpose has gone though. :)

    EDIT: Ahh I see, well spotted :)
     
    Last edited: Mar 16, 2010