Tensor Product of Covariant and Contravariant Vectors


Discussion Overview

The discussion revolves around the tensor product of covariant and contravariant vectors, exploring the mathematical operations involved and the conceptual understanding of tensors. Participants examine the definitions and representations of these vectors and their products, with a focus on the implications of their arrangements in matrix form.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants affirm that the tensor product can be performed on both contravariant and covariant vectors.
  • One participant describes computing the tensor product of covariant vectors in matrix notation, suggesting that for row vectors A and B the product is represented as A^T × B.
  • Another participant reiterates the same method for contravariant vectors, proposing the matrix product C × D^T for their tensor product.
  • A participant emphasizes the importance of the arrangement of vectors, stating that the column vector must be on the left of the row vector to obtain the correct result.
  • One participant provides a more detailed conceptual framework, defining covariant vectors as linear functions on the tangent space and explaining the bilinear nature of their tensor product.
  • This participant also discusses the relationship between the tensor product and the basis for tangent and covariant vectors, introducing the notation for the coefficients in the tensor representation.
  • They conclude that the representation of the tensor product aligns with the previously mentioned matrix product formulation, suggesting consistency in their understanding.
  • Another participant shares their recent experience with a math tensor course, expressing enthusiasm for the topic.

Areas of Agreement / Disagreement

Participants generally agree on the feasibility of performing tensor products on both types of vectors, but there are varying interpretations and methods presented. The discussion includes multiple viewpoints on the correct approach and representation, indicating that the topic remains somewhat contested and unresolved.

Contextual Notes

Some participants' arguments depend on specific definitions of covariant and contravariant vectors, and the discussion does not fully resolve the mathematical steps involved in the tensor product operations.

meteor
Is it possible to take the tensor product of two contravariant vectors?
Is it possible to take the tensor product of two covariant vectors?
 
Yes, and yes.

- Warren
 
Ok, I think it goes this way:
If you want to take the tensor product of two covariant vectors A and B, with their components represented by two row vectors, then you compute A^T × B, where × denotes the matrix product and T denotes the transpose; the resulting matrix is the tensor product.
Similarly, to take the tensor product of two contravariant vectors C and D (represented as column vectors), you compute C × D^T.
Is this correct?
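For what it's worth, this recipe is easy to try numerically. Here is a minimal NumPy sketch (the component values are made up for illustration, not from the original posts):

```python
import numpy as np

# Two covariant vectors with components stored as row vectors (1 x 3).
A = np.array([[1.0, 2.0, 3.0]])
B = np.array([[4.0, 5.0, 6.0]])

# A^T x B: transpose A into a column, then matrix-multiply by the row B.
T = A.T @ B                    # shape (3, 3); T[j, k] = A[0, j] * B[0, k]

# Same recipe for two contravariant (column) vectors C and D: C x D^T.
C = np.array([[1.0], [2.0]])
D = np.array([[3.0], [4.0]])
S = C @ D.T                    # shape (2, 2); S[j, k] = C[j, 0] * D[k, 0]
```

In both cases the column vector stands to the left of the row vector, so the matrix product produces the full grid of component products.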
 
meteor said:
Ok, I think it goes this way:
If you want to take the tensor product of two covariant vectors A and B, with their components represented by two row vectors, then you compute A^T × B, where × denotes the matrix product and T denotes the transpose; the resulting matrix is the tensor product.
Similarly, to take the tensor product of two contravariant vectors C and D (represented as column vectors), you compute C × D^T.
Is this correct?

As long as the column vector is to the left of the row vector, you'll get it right. Another way to think about it is to let the indices in the product

x_μ x_ν

represent the addresses in the matrix representation. In other words, the component x_2 x_3 is the matrix element in the 2nd row, 3rd column.
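A quick NumPy check of this index-as-address picture (the values are illustrative):

```python
import numpy as np

x = np.array([10.0, 20.0, 30.0])
y = np.array([1.0, 2.0, 3.0])

# Outer product: M[mu, nu] = x[mu] * y[nu], so the index pair of each
# component is its "address" in the matrix representation.
M = np.outer(x, y)

# The 1-based "2nd row, 3rd column" element is M[1, 2] in 0-based indexing.
print(M[1, 2])                 # x_2 * y_3 = 20.0 * 3.0 = 60.0
```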
 
I have just finished a math tensor course, it's really a lot of fun once you get going!
 
I presume when you say covariant vectors you mean in the classical backwards terminology.

So a covariant vector is a linear function L on the tangent space, with values which are numbers, and M is another such guy.

The tensor product of these two guys is a bilinear function L⊗M acting on pairs of tangent vectors in the only sensible way, i.e. (L⊗M)(v,w) = L(v)M(w), a product of numbers.

Now if e_1,...,e_r is a basis for tangent vectors and h_1,...,h_r is the dual basis for covariant vectors, where h_j has value zero at all e_i except h_j(e_j) = 1,

then the basic tensors h_j⊗h_k are a basis for all the second order tensors.

Hence we must be able to write out L⊗M in terms of these basic tensors.

I.e. there must be a doubly indexed family C_jk of numbers such that

L⊗M = Σ_jk C_jk h_j⊗h_k.

Now to learn what these numbers C_jk are, note that C_jk is the value of the right hand side on the contravariant tensor e_j⊗e_k. So to compute C_jk we just apply the left hand side to this same tensor.

I.e. we should have C_jk = (L⊗M)(e_j⊗e_k) = L(e_j)M(e_k).

Hence if the covariant vector L was represented by (...a_j...), and M was represented by (...b_k...), i.e. if L(e_j) = a_j and M(e_k) = b_k, then L⊗M is represented by the matrix product

(a_j b_k) = (a_j)^T (b_k), which seems to agree with what was said above.
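As a numerical sanity check of this coefficient formula — here is a NumPy sketch with made-up component values, building the coefficients entry by entry and comparing against the matrix-product recipe from the earlier posts:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])  # L(e_j) = a_j
b = np.array([4.0, 5.0, 6.0])  # M(e_k) = b_k

# Coefficients C_jk = L(e_j) * M(e_k) = a_j * b_k, built one entry at a time.
C = np.array([[a[j] * b[k] for k in range(3)] for j in range(3)])

# The matrix product (a_j)^T (b_k): column vector times row vector.
P = a.reshape(-1, 1) @ b.reshape(1, -1)

print(np.allclose(C, P))       # True
```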

Now my argument for the conceptual approach is that although I did not know how to multiply tensors when I started this post, I still seem to have got it right because I knew what a tensor actually is.

Or at least if I got it wrong, you can follow what I did and see where I went wrong.
 
