Is a second-rank tensor always the tensor product of two vectors?

  • Thread starter arpon
Suppose a second-rank tensor ##T_{ij}## is given. Can we always express it as the tensor product of two vectors, i.e., ##T_{ij}=A_{i}B_{j}##? If so, I have a few more questions:
1. Are the two vectors ##A_i## and ##B_j## unique?
2. How can we find ##A_i## and ##B_j##?
3. Since ##A_i## and ##B_j## have ##3+3 = 6## components in total (say, in three dimensions), it seems we would need only ##6## quantities to represent the ##9## components of the tensor ##T_{ij}##. Is that correct?
 

fresh_42

Suppose a second-rank tensor ##T_{ij}## is given. Can we always express it as the tensor product of two vectors, i.e., ##T_{ij}=A_{i}B_{j}##?
No, in general it is a sum of such products.
If so, I have a few more questions:
1. Are the two vectors ##A_i## and ##B_j## unique?
No. Even for dyadics ##A \otimes B## you always have ##A \otimes B = (cA) \otimes \left(\frac{1}{c}B\right)## for any scalar ##c \neq 0##.
2. How can we find ##A_i## and ##B_j##?
##T_{ij}## is basically an arbitrary matrix, while ##A \otimes B## is a matrix of rank ##1##. So write your matrix as a sum of rank-##1## matrices and you have a representation.
3. Since ##A_i## and ##B_j## have ##3+3 = 6## components in total (say, in three dimensions), it seems we would need only ##6## quantities to represent the ##9## components of the tensor ##T_{ij}##. Is that correct?
No. See the rank explanation above.
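The "sum of rank-##1## matrices" idea can be made concrete with the singular value decomposition, which writes any matrix as ##\sum_k \sigma_k \, u_k \otimes v_k##. A small numpy sketch (the matrix entries are arbitrary, chosen only for illustration):

```python
import numpy as np

# An arbitrary 3x3 second-rank tensor (illustrative values).
T = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 10.]])

# SVD: T = sum_k s[k] * outer(u_k, v_k), a sum of rank-1 terms.
U, s, Vt = np.linalg.svd(T)
terms = [s[k] * np.outer(U[:, k], Vt[k, :]) for k in range(len(s))]

print(np.allclose(sum(terms), T))   # True: the rank-1 terms sum back to T
print(np.linalg.matrix_rank(T))     # 3: this T needs three terms, not one
```

Only a matrix of rank ##1## equals a single outer product ##A \otimes B##; a generic ##3\times 3## tensor, like the one above, needs three terms.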
 

haushofer

3. As ##A_i## and ##B_j## has ##3+3 = 6## components in total (say, in 3-dimension), it turns out that we need only ##6## quantities to represent the ##9## components of the tensor ##T_{ij}##. Is that correct?
To add to Fresh's answer: this last remark should already make you suspicious. An arbitrary (!) second-rank tensor in three dimensions has ##3 \times 3 = 9## components, but two vectors have only ##2 \times 3 = 6## components. What you can do is decompose a second-rank tensor ##T_{ij}## as

$$T_{ij} = T_{[ij]} + T_{(ij)} $$

where ##[ij]## stands for antisymmetrization and ##(ij)## for symmetrization. Both parts transform independently under coordinate transformations. The antisymmetric part has ##3## independent components, while the symmetric part has ##6##. You can go even further in this decomposition, because the trace of the tensor also does not change under a coordinate transformation, so the symmetric part splits once more into a trace piece and a traceless piece.
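This decomposition is easy to verify numerically; a small numpy sketch (the matrix entries are arbitrary, chosen only for illustration):

```python
import numpy as np

# An arbitrary 3x3 second-rank tensor (illustrative values).
T = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 10.]])

T_sym  = 0.5 * (T + T.T)   # symmetric part T_(ij): 6 independent components
T_asym = 0.5 * (T - T.T)   # antisymmetric part T_[ij]: 3 independent components

assert np.allclose(T_sym + T_asym, T)       # the two parts recover T
assert np.allclose(T_asym, -T_asym.T)       # antisymmetry check

# Going further: split the symmetric part into trace and traceless pieces.
trace_piece = (np.trace(T) / 3.0) * np.eye(3)
T_sym_traceless = T_sym - trace_piece
print(np.isclose(np.trace(T_sym_traceless), 0.0))  # True
```

The count works out to ##3 + 1 + 5 = 9## independent components for the antisymmetric, trace, and symmetric-traceless pieces together, matching the ##9## components of a general ##T_{ij}##.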
 
