Is Second rank tensor always tensor product of two vectors?

  • #1
Suppose a second rank tensor ##T_{ij}## is given. Can we always express it as the tensor product of two vectors, i.e., ##T_{ij}=A_{i}B_{j}## ? If so, then I have a few more questions:
1. Are those two vectors ##A_i## and ##B_j## unique?
2. How can we find ##A_i## and ##B_j##?
3. As ##A_i## and ##B_j## have ##3+3 = 6## components in total (say, in three dimensions), it turns out that we would need only ##6## quantities to represent the ##9## components of the tensor ##T_{ij}##. Is that correct?
 

Answers and Replies

  • #2
Suppose a second rank tensor ##T_{ij}## is given. Can we always express it as the tensor product of two vectors, i.e., ##T_{ij}=A_{i}B_{j}## ?
No; in general it is a sum of such products, not a single one.
If so, then I have a few more questions:
1. Are those two vectors ##A_i## and ##B_j## unique?
No. Even for a single dyad ##A \otimes B## you always have ##A \otimes B = (c A) \otimes \frac{1}{c}B## for any scalar ##c \neq 0##.
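This rescaling freedom is easy to check numerically. A quick NumPy sketch (my illustration, not part of the original reply):

```python
import numpy as np

# Any nonzero scalar c can be shuffled between the two factors of a dyad
# without changing the resulting rank-1 tensor A (outer) B.
A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])
c = 2.5

print(np.allclose(np.outer(A, B), np.outer(c * A, B / c)))  # True
```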
2. How can we find ##A_i## and ##B_j##?
##T_{ij}## is basically an arbitrary matrix, while ##A_i \otimes B_j## is a matrix of rank ##1##. So write your matrix as a sum of rank-##1## matrices and you have a representation.
3. As ##A_i## and ##B_j## have ##3+3 = 6## components in total (say, in three dimensions), it turns out that we would need only ##6## quantities to represent the ##9## components of the tensor ##T_{ij}##. Is that correct?
No. See the rank explanation above.
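To make the rank argument concrete, here is a small NumPy sketch (my addition, not from the thread): a generic ##3 \times 3## matrix has matrix rank ##3##, so no single dyad can reproduce it, but the singular value decomposition does write it as a sum of three rank-##1## outer products.

```python
import numpy as np

# A generic second-rank tensor in 3 dimensions (any 3x3 matrix will do).
T = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 10.]])

# If T were a single outer product A_i B_j, its matrix rank would be 1.
print(np.linalg.matrix_rank(T))  # 3, so no single dyad suffices

# The SVD expresses T as a sum of rank-1 (dyadic) terms:
#   T = sum_k  s_k * u_k (outer) v_k
U, s, Vt = np.linalg.svd(T)
T_rebuilt = sum(s[k] * np.outer(U[:, k], Vt[k, :]) for k in range(3))
print(np.allclose(T, T_rebuilt))  # True
```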
 
  • #3
haushofer
3. As ##A_i## and ##B_j## have ##3+3 = 6## components in total (say, in three dimensions), it turns out that we would need only ##6## quantities to represent the ##9## components of the tensor ##T_{ij}##. Is that correct?
To add to Fresh's answer: this last remark should already make you suspicious. An arbitrary (!) second-rank tensor in three dimensions has ##3 \times 3 = 9## components, but two vectors have only ##2 \times 3 = 6## components. What you can do is decompose a second-rank tensor ##T_{ij}## as

$$T_{ij} = T_{[ij]} + T_{(ij)} $$

where ##[ij]## stands for antisymmetrization and ##(ij)## for symmetrization. Both parts transform independently under coordinate transformations. The antisymmetric part has ##3## independent components, whereas the symmetric part has ##6##. You can go even further in this decomposition, because the trace of the tensor components also does not change under a coordinate transformation: the symmetric part splits into a pure-trace piece (##1## component) and a traceless symmetric piece (##5## components), giving ##3 + 1 + 5 = 9## in total.
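The symmetric/antisymmetric split is easy to verify numerically; a minimal NumPy sketch (my addition, not from the thread):

```python
import numpy as np

T = np.arange(9, dtype=float).reshape(3, 3)  # a generic 3x3 tensor

T_sym  = 0.5 * (T + T.T)   # symmetric part T_(ij): 6 independent components
T_anti = 0.5 * (T - T.T)   # antisymmetric part T_[ij]: 3 independent components

print(np.allclose(T, T_sym + T_anti))  # True: the two parts recompose T
print(np.trace(T_anti))                # 0.0: the trace lives entirely in T_sym
```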
 
