Tensor product

  1. Apr 17, 2004 #1
    Is it possible to take the tensor product of two contravariant vectors?
    Is it possible to take the tensor product of two covariant vectors?
    Last edited: Apr 17, 2004
  2. Apr 17, 2004 #2
    Yes, and yes.

    - Warren
  3. Apr 22, 2004 #3
    Ok, I think it goes this way:
    If you want to take the tensor product of two covariant vectors A and B, with their components represented by two row vectors, then you form A^T × B, where × denotes the matrix product and ^T denotes the transpose, and the resulting matrix is the tensor product.
    Similarly, to take the tensor product of two contravariant vectors C and D, you form C × D^T.
    Is this correct?
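    As a quick numerical check of the recipe above, here is a minimal NumPy sketch (the component values are illustrative):

```python
import numpy as np

# Two covariant vectors, with components given as row vectors.
A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])

# A^T x B: a column vector times a row vector gives a 3x3 matrix.
tensor_AB = A.reshape(-1, 1) @ B.reshape(1, -1)

# NumPy's built-in outer product produces the same matrix.
assert np.allclose(tensor_AB, np.outer(A, B))
print(tensor_AB)
```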
  4. May 30, 2004 #4

    Tom Mattson

    As long as the column vector is to the left of the row vector, you'll get it right. Another way to think about it is to let the indices in the product

    x_j x_k

    represent the addresses in the matrix representation. In other words, the component x_2 x_3 is the matrix element in the 2nd row, 3rd column.
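    For instance, a small NumPy sketch of this addressing rule (the vector x is illustrative; note that NumPy indexes from 0, so the entry in the 2nd row, 3rd column is M[1, 2]):

```python
import numpy as np

x = np.array([10.0, 20.0, 30.0, 40.0])   # illustrative components x_1, ..., x_4

M = np.outer(x, x)                        # M[j, k] holds the product x_j * x_k

# The 2nd row, 3rd column entry is x_2 * x_3 (0-based indices 1 and 2).
assert M[1, 2] == x[1] * x[2]
print(M[1, 2])   # 20.0 * 30.0 = 600.0
```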
  5. Jun 30, 2004 #5
    I have just finished a math tensor course; it's really a lot of fun once you get going!
  6. Aug 10, 2004 #6
    I presume that when you say covariant vectors, you mean them in the classical, backwards terminology.

    So a covariant vector is a linear function L on the tangent space, with values which are numbers, and likewise M is another such function.

    The tensor product of these two is a bilinear function L⊗M acting on pairs of tangent vectors in the only sensible way, i.e. (L⊗M)(v, w) = L(v)M(w), a product of numbers.

    Now if e_1, ..., e_r is a basis for tangent vectors and h_1, ..., h_r is the dual basis for covariant vectors, where h_j has value zero at every e_i except that h_j(e_j) = 1,

    then the basic tensors h_j⊗h_k form a basis for all the second order tensors.

    Hence we must be able to write out L⊗M in terms of these basic tensors.

    I.e. there must be a doubly indexed family C_jk of numbers such that

    L⊗M = Σ_jk C_jk h_j⊗h_k.

    Now to learn what these numbers C_jk are, note that C_jk is the value of the right hand side on the contravariant tensor e_j⊗e_k. So to compute C_jk we just apply the left hand side to this same tensor.

    I.e. we should have C_jk = (L⊗M)(e_j⊗e_k) = L(e_j)M(e_k).

    Hence if the covariant vector L was represented by (..., a_j, ...), and M was represented by (..., b_k, ...), i.e. if L(e_j) = a_j and M(e_k) = b_k, then L⊗M is represented by the matrix product

    (a_j b_k) = (a_j)^T (b_k), which seems to agree with what was said above.

    Now my argument for the conceptual approach is that although I did not know how to multiply tensors when I started this post, I still seem to have got it right because I knew what a tensor actually is.

    Or at least if I got it wrong, you can follow what I did and see where I went wrong.
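    Here is a minimal NumPy sketch of that computation (the components a and b and the test vectors v and w are illustrative, everything written in the standard basis so that L(e_j) = a_j and M(e_k) = b_k):

```python
import numpy as np

# Components of the covariant vectors L and M in the dual basis h_1, ..., h_r.
a = np.array([1.0, 2.0, 3.0])   # L(e_j) = a_j
b = np.array([4.0, 5.0, 6.0])   # M(e_k) = b_k

# Coefficient matrix C_jk = L(e_j) M(e_k) = a_j b_k, i.e. column times row.
C = np.outer(a, b)

# Two arbitrary tangent vectors, expanded in the basis e_1, ..., e_r.
v = np.array([1.0, 0.0, 2.0])
w = np.array([0.0, 3.0, 1.0])

# (L tensor M)(v, w) computed directly as the product L(v) M(w) ...
direct = (a @ v) * (b @ w)

# ... and via the coefficient matrix: sum over j, k of C_jk v_j w_k.
via_matrix = v @ C @ w

assert np.isclose(direct, via_matrix)
print(direct, via_matrix)   # both are 147.0
```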