# Is Second rank tensor always tensor product of two vectors?

Suppose a second rank tensor ##T_{ij}## is given. Can we always express it as the tensor product of two vectors, i.e., ##T_{ij}=A_{i}B_{j}##? If so, then I have a few more questions:
1. Are those two vectors ##A_i## and ##B_j## unique?
2. How do we find ##A_i## and ##B_j##?
3. As ##A_i## and ##B_j## have ##3+3 = 6## components in total (say, in three dimensions), it turns out that we need only ##6## quantities to represent the ##9## components of the tensor ##T_{ij}##. Is that correct?

fresh_42
Mentor
> Suppose a second rank tensor ##T_{ij}## is given. Can we always express it as the tensor product of two vectors, i.e., ##T_{ij}=A_{i}B_{j}##?
No, it is a sum of such products.
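A quick numerical illustration of this point (a NumPy sketch, not part of the original posts): an outer product ##A_i B_j## always has matrix rank ##1##, so a matrix of higher rank, such as the identity, cannot be a single dyadic, but it *is* a sum of dyadics.

```python
import numpy as np

# A dyadic T_ij = A_i B_j always has matrix rank 1.
A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])
dyadic = np.outer(A, B)
print(np.linalg.matrix_rank(dyadic))   # 1

# The identity has rank 3, so it cannot equal any single outer product.
T = np.eye(3)
print(np.linalg.matrix_rank(T))        # 3

# But it is a sum of dyadics: e1⊗e1 + e2⊗e2 + e3⊗e3.
e = np.eye(3)
T_sum = sum(np.outer(e[i], e[i]) for i in range(3))
print(np.allclose(T, T_sum))           # True
```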
> If so, then I have a few more questions:
> 1. Are those two vectors ##A_i## and ##B_j## unique?
No. Even for dyadics ##A_i \otimes B_j## you always have ##A_i \otimes B_j = c \cdot A_i \otimes \frac{1}{c}B_j## for any scalar ##c \neq 0##.
> 2. How do we find ##A_i## and ##B_j##?
##T_{ij}## is basically an arbitrary matrix and ##A_i \otimes B_j## a matrix of rank ##1##. So write your matrix as a sum of rank-##1## matrices and you have a representation.
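One standard way to obtain such a sum of rank-##1## matrices is the singular value decomposition (a NumPy sketch added here for illustration, not part of the original answer): the SVD writes ##T = \sum_k s_k \, u_k \otimes v_k##, where each term is a dyadic.

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3))   # a generic second rank tensor as a 3x3 matrix

# SVD: T = U diag(s) V^T, i.e. T = sum_k s_k * (u_k ⊗ v_k),
# a sum of at most 3 rank-1 dyadics.
U, s, Vt = np.linalg.svd(T)
T_rebuilt = sum(s[k] * np.outer(U[:, k], Vt[k, :]) for k in range(3))
print(np.allclose(T, T_rebuilt))   # True
```

Note this also makes the non-uniqueness concrete: rescaling ##u_k \mapsto c\,u_k##, ##v_k \mapsto \frac{1}{c}v_k## leaves each term unchanged.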
> 3. As ##A_i## and ##B_j## have ##3+3 = 6## components in total (say, in three dimensions), it turns out that we need only ##6## quantities to represent the ##9## components of the tensor ##T_{ij}##. Is that correct?
No. See the rank explanation above.

haushofer
To add to Fresh's answer: this last remark should already make you suspicious. An arbitrary (!) second-rank tensor in three dimensions has $3 \times 3 = 9$ components, but two vectors together have only $2 \times 3 = 6$ components. What you can do is decompose a second-rank tensor like $T_{ij}$ as
$$T_{ij} = T_{[ij]} + T_{(ij)}$$
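Here $T_{[ij]} = \frac{1}{2}(T_{ij} - T_{ji})$ is the antisymmetric part (3 independent components) and $T_{(ij)} = \frac{1}{2}(T_{ij} + T_{ji})$ the symmetric part (6 independent components), so the count $3 + 6 = 9$ comes out right. A numerical check of this split (a NumPy sketch, not from the original posts):

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3))

T_sym = 0.5 * (T + T.T)    # T_(ij): symmetric, 6 independent components
T_anti = 0.5 * (T - T.T)   # T_[ij]: antisymmetric, 3 independent components

print(np.allclose(T, T_sym + T_anti))    # True: the split recovers T
print(np.allclose(T_sym, T_sym.T))       # True: symmetric part
print(np.allclose(T_anti, -T_anti.T))    # True: antisymmetric part
```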