Representing conversion of (1,1) tensor to (2,0) tensor

  • Context: Undergrad
  • Thread starter: Shirish
  • Tags: Tensor
SUMMARY

The discussion focuses on converting a (1,1) tensor, represented as T_i^{~~j}, to a (2,0) tensor, represented as T_{ij}, using a non-degenerate Hermitian form and the associated linear map L: V → V*. The conversion is achieved by defining the new tensor \tilde{T}(v,w) = T(v,L(w)). The participants emphasize the importance of choosing the correct basis in the dual space, either {e^i} or {L(e_i)}, which directly affects the matrix representations of T and \tilde{T}. The discussion concludes with a need for clarity on expressing L(e_j) and L^{-1}(e^j) in terms of the respective basis vectors.

PREREQUISITES
  • Understanding of non-degenerate Hermitian forms
  • Familiarity with tensor notation and types, specifically (1,1) and (2,0) tensors
  • Knowledge of linear maps and their matrix representations
  • Proficiency in working with dual spaces and basis transformations
NEXT STEPS
  • Study the properties of non-degenerate Hermitian forms in linear algebra
  • Learn about tensor transformations and the implications of changing tensor types
  • Explore the concept of dual bases and their applications in tensor analysis
  • Investigate matrix representations of linear maps and their role in tensor conversion
USEFUL FOR

Mathematicians, physicists, and engineers working with tensor calculus, particularly those involved in differential geometry and theoretical physics.

Shirish
A non-degenerate Hermitian form ##(.|.)## on a vector space ##V## can be identified with a map ##L:V \to V^*## such that ##L(v)=\tilde{v}## and ##\tilde{v}(w) \equiv (v~|~w)##.

Suppose we want to convert a vector ##v## to a dual vector ##\tilde{v}##. In terms of matrices, we can just construct the matrix ##[L]## corresponding to the Hermitian form, and hence the map ##L##, by letting ##L_{ij} = (e_i~|~e_j)##. So

$$\tilde{v}_j = \tilde{v}(e_j) = (v~|~e_j) = \sum_iv^i(e_i~|~e_j) = \sum_i L_{ij}v^i$$

If ##[v]## and ##[\tilde{v}]## are column vectors containing components of ##v## and ##\tilde{v}##, then ##[\tilde{v}] = [L]^T[v]##.
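As a concrete sanity check, the lowering step can be sketched numerically. This is a hypothetical real 3-dimensional example, so the Hermitian form reduces to a symmetric bilinear form and the matrix ##[L]## is just a symmetric non-degenerate matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical symmetric non-degenerate form: L_ij = (e_i | e_j).
A = rng.standard_normal((3, 3))
L = A + A.T + 4.0 * np.eye(3)   # symmetric; diagonal shift keeps it non-degenerate

v = rng.standard_normal(3)      # components v^i in the basis {e_i}

# Lower the index: v~_j = sum_i L_ij v^i, i.e. [v~] = [L]^T [v]
v_tilde = L.T @ v

# Defining property: v~(e_j) = (v | e_j) = sum_i v^i (e_i | e_j)
for j in range(3):
    assert np.isclose(v_tilde[j], sum(v[i] * L[i, j] for i in range(3)))
```

In this real symmetric case ##[L]^T = [L]##, so the transpose is invisible; it is kept only to match the formula above.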

Now I'm trying to apply this whole treatment to the conversion of a ##(1,1)## tensor to a ##(2,0)## tensor. In component representation, the former can be written as ##T_i^{~~j}## and the latter as ##T_{ij}##. From the book I'm reading:

> If we have a non-degenerate bilinear form on ##V##, then we may change the type of ##T## by precomposing with the map ##L## or ##L^{-1}##. If ##T## is of type ##(1,1)## with components ##T_i^{~~j}##, for instance, then we may turn it into a tensor ##\tilde{T}## of type ##(2,0)## by defining ##\tilde{T}(v,w) = T(v,L(w))##.

Given the basis ##\{e_i\}## of ##V##, we have two choices of bases in the dual space: ##\{e^i\}## where ##e^i(e_j) = \delta^i_j##, or ##\{L(e_i)\}## - the latter being the metric dual basis that depends on the choice of the non-degenerate Hermitian form. What is the appropriate choice of basis in this case? I need to confirm this because the matrix representations of ##T## and ##\tilde{T}## would depend on it.

How do I come up with a matrix representation of the conversion from ##T## to ##\tilde{T}##, as was done in the above example? ##\tilde{T}(v,w) = T(v,L(w)) \implies T(v,w) = \tilde{T}(v,L^{-1}(w))##. Given that we've decided on the dual basis, then

$$\tilde{T}_{ij} = \tilde{T}(e_i,e_j) = T(e_i,L(e_j))$$

$$T_i^{~~j} = T(e_i,e^j) = \tilde{T}(e_i,L^{-1}(e^j))$$

I'm assuming that I'll have to express ##L(e_j)## as a linear combination of dual basis vectors ##e^k##'s, and ##L^{-1}(e^j)## as a linear combination of basis vectors ##e_k##'s, but I'm at a loss on how to do that. That's primarily because in the example I gave above, I was able to express vector/covector components in terms of the other's components, but there's no indication on how to do that with vectors/covectors themselves. Any help would be appreciated.
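One guess, by analogy with the vector case: since ##L(e_j)(e_k) = (e_j~|~e_k) = L_{jk}##, perhaps ##L(e_j) = \sum_k L_{jk}e^k##, which would give ##\tilde{T}_{ij} = \sum_k T_i^{~~k}L_{jk}##, i.e. ##[\tilde{T}] = [T][L]^T##. A minimal numerical sketch of that guess (again the hypothetical real symmetric case, with ##T## viewed as a function of one vector and one covector):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical symmetric non-degenerate form, L_ij = (e_i | e_j).
A = rng.standard_normal((3, 3))
L = A + A.T + 4.0 * np.eye(3)
T = rng.standard_normal((3, 3))  # T[i, j] holds the (1,1) components T_i^j

# Guess: [T~] = [T] [L]^T, i.e. T~_ij = sum_k T_i^k L_jk
T_tilde = T @ L.T

def T_eval(v, f):
    # T(v, f) = sum_{i,j} T_i^j v^i f_j  (vector v, covector f, components in {e_i}/{e^j})
    return v @ T @ f

# Direct check of the definition T~(e_i, e_j) = T(e_i, L(e_j)),
# where L(e_j) is taken to have components L_jk in the dual basis {e^k}.
for i in range(3):
    for j in range(3):
        Le_j = L[j, :]
        assert np.isclose(T_tilde[i, j], T_eval(np.eye(3)[i], Le_j))
```

The assertions pass, consistent with the guessed expansion, but I'd still like confirmation of which dual basis this implicitly commits to.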
 
Shirish said:
A non-degenerate Hermitian form ##(.|.)## on a vector space ##V## can be identified with a map ##L:V \to V^*## such that ##L(v)=\tilde{v}## and ##\tilde{v}(w) \equiv (v~|~w)##. ...
If I understood you correctly, then you can express in terms of components and then use multivibrator to find components. I
 
WWGD said:
If I understood you correctly, then you can express in terms of components and then use multivibrator to find components. I
Your reply got cut off, I think. Maybe a bug but I can't see your whole answer.
 
