Is a second-rank tensor always the tensor product of two vectors?

SUMMARY

A second-rank tensor \( T_{ij} \) cannot always be expressed as the tensor product of two vectors, \( T_{ij} = A_i B_j \); in general it is a sum of such products. Even when a single product suffices, the vectors \( A_i \) and \( B_j \) are not unique: rescaling \( A_i \) by any nonzero scalar and \( B_j \) by its reciprocal yields the same product. To find such a representation, one decomposes the tensor's matrix into a sum of rank-1 matrices. A counting argument already signals the problem: a second-rank tensor in three dimensions has 9 independent components, while two vectors supply only 6.

PREREQUISITES
  • Understanding of second rank tensors
  • Familiarity with tensor products
  • Knowledge of matrix decomposition techniques
  • Basic concepts of symmetry and antisymmetry in tensors
NEXT STEPS
  • Study tensor decomposition methods, focusing on rank-1 matrices
  • Explore the properties of symmetric and antisymmetric tensors
  • Learn about coordinate transformations and their effects on tensor components
  • Investigate the role of tensor rank in physical applications
USEFUL FOR

Mathematicians, physicists, and engineers working with tensor analysis, particularly those involved in continuum mechanics or general relativity.

arpon
Suppose a second rank tensor ##T_{ij}## is given. Can we always express it as the tensor product of two vectors, i.e., ##T_{ij}=A_{i}B_{j}## ? If so, then I have a few more questions:
1. Are those two vectors ##A_i## and ##B_j## unique?
2. How do we find ##A_i## and ##B_j##?
3. As ##A_i## and ##B_j## have ##3+3 = 6## components in total (say, in 3 dimensions), it turns out that we need only ##6## quantities to represent the ##9## components of the tensor ##T_{ij}##. Is that correct?
 
arpon said:
Suppose a second rank tensor ##T_{ij}## is given. Can we always express it as the tensor product of two vectors, i.e., ##T_{ij}=A_{i}B_{j}## ?
No, it is a sum of such products.
If so, then I have a few more questions:
1. Are those two vectors ##A_i## and ##B_j## unique?
No. Even for dyads you always have ##A \otimes B = (cA) \otimes \left(\frac{1}{c}B\right)## for any scalar ##c \neq 0##.
2. How do we find ##A_i## and ##B_j##?
##T_{ij}## is basically an arbitrary matrix, while ##A_i B_j## is a matrix of rank ##1##. So write your matrix as a sum of rank-##1## matrices and you have a representation.
3. As ##A_i## and ##B_j## have ##3+3 = 6## components in total (say, in 3 dimensions), it turns out that we need only ##6## quantities to represent the ##9## components of the tensor ##T_{ij}##. Is that correct?
No. See the rank explanation above: a generic ##3 \times 3## matrix has rank ##3##, so a single rank-##1## product cannot capture it.
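The "sum of rank-1 matrices" recipe can be made concrete with the singular value decomposition, which writes any matrix as a weighted sum of outer products. A minimal NumPy sketch (the matrix ##T## below is an arbitrary example, not one from the thread):

```python
import numpy as np

# An example 3x3 second-rank tensor (generic, so its matrix rank is 3).
T = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [4.0, 0.0, 2.0]])

# A single tensor product A_i B_j is the rank-1 matrix np.outer(A, B).
# The SVD expresses T as a sum of such rank-1 terms: T = sum_k s_k u_k v_k^T.
U, s, Vt = np.linalg.svd(T)
rank_one_terms = [s[k] * np.outer(U[:, k], Vt[k, :]) for k in range(len(s))]

# The sum of the rank-1 terms reconstructs T exactly.
assert np.allclose(sum(rank_one_terms), T)

# A generic T needs all three terms, so it is NOT a single outer product.
print(np.linalg.matrix_rank(T))  # 3 for this example
```

Only a tensor whose matrix has rank 1 is a single product ##A_i B_j##; for everything else the SVD gives the shortest such sum.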
 
arpon said:
3. As ##A_i## and ##B_j## have ##3+3 = 6## components in total (say, in 3 dimensions), it turns out that we need only ##6## quantities to represent the ##9## components of the tensor ##T_{ij}##. Is that correct?
To add to Fresh's answer: this last remark should already make you suspicious. An arbitrary (!) second-rank tensor in three dimensions has 3*3=9 components, but two vectors have only 2*3=6 components. What you can do is decompose a second-rank tensor like T_{ij} as

$$T_{ij} = T_{[ij]} + T_{(ij)} $$

where [ij] stands for antisymmetrization and (ij) stands for symmetrization. Both parts transform independently under coordinate transformations. The antisymmetric part has 3 independent components, whereas the symmetric part has 6. You can go even further in this decomposition, because the trace of the tensor also does not mix with the rest under a coordinate transformation: the symmetric part splits into its trace (1 component) and a traceless symmetric part (5 components), giving 3 + 5 + 1 = 9.
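The decomposition above is easy to verify numerically. A short NumPy sketch (again with an arbitrary example matrix) splitting a tensor into antisymmetric, traceless symmetric, and trace parts:

```python
import numpy as np

# An arbitrary example 3x3 second-rank tensor.
T = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [4.0, 0.0, 2.0]])

antisym = 0.5 * (T - T.T)                     # T_[ij]: 3 independent components
sym = 0.5 * (T + T.T)                         # T_(ij): 6 independent components
trace_part = (np.trace(T) / 3.0) * np.eye(3)  # trace piece: 1 component
sym_traceless = sym - trace_part              # traceless symmetric: 5 components

# The three pieces (3 + 5 + 1 = 9 components) reassemble the full tensor.
assert np.allclose(antisym + sym_traceless + trace_part, T)

# Sanity checks on the pieces themselves.
assert np.allclose(antisym, -antisym.T)          # antisymmetric
assert np.isclose(np.trace(sym_traceless), 0.0)  # traceless
```

Each piece transforms among itself under rotations, which is why this split is the natural starting point for irreducible decompositions in physics.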
 
