Forums
Mathematics
Linear and Abstract Algebra
Representing conversion of (1,1) tensor to (2,0) tensor
[QUOTE="Shirish, post: 6013420, member: 647247"]
A non-degenerate Hermitian form ##(\cdot~|~\cdot)## on a vector space ##V## can be identified with a map ##L:V \to V^*## defined by ##L(v)=\tilde{v}##, where ##\tilde{v}(w) \equiv (v~|~w)##. Suppose we want to convert a vector ##v## into its dual vector ##\tilde{v}##. In terms of matrices, we can construct the matrix ##[L]## corresponding to the Hermitian form, and hence to the map ##L##, by setting ##L_{ij} = (e_i~|~e_j)##. Then $$\tilde{v}_j = \tilde{v}(e_j) = (v~|~e_j) = \sum_iv^i(e_i~|~e_j) = \sum_i L_{ij}v^i.$$ So if ##[v]## and ##[\tilde{v}]## are column vectors containing the components of ##v## and ##\tilde{v}##, then ##[\tilde{v}] = [L]^T[v]##.

Now I'm trying to apply this treatment to the conversion of a ##(1,1)## tensor to a ##(2,0)## tensor. In component form, the former can be written as ##T_i^{~~j}## and the latter as ##T_{ij}##. From the book I'm reading:

> If we have a non-degenerate bilinear form on ##V##, then we may change the type of ##T## by precomposing with the map ##L## or ##L^{-1}##. If ##T## is of type ##(1,1)## with components ##T_i^{~~j}##, for instance, then we may turn it into a tensor ##\tilde{T}## of type ##(2,0)## by defining ##\tilde{T}(v,w) = T(v,L(w))##.

Given the basis ##\{e_i\}## of ##V##, we have two choices of basis in the dual space: ##\{e^i\}##, where ##e^i(e_j) = \delta^i_j##, or ##\{L(e_i)\}##, the metric dual basis, which depends on the choice of the non-degenerate Hermitian form.

[B]What is the appropriate choice of basis in this case?[/B] I need to settle this because the matrix representations of ##T## and ##\tilde{T}## depend on it.

[B]How do I come up with a matrix representation of the conversion from ##T## to ##\tilde{T}##, as was done in the example above?[/B] Note that ##\tilde{T}(v,w) = T(v,L(w)) \implies T(v,w) = \tilde{T}(v,L^{-1}(w))##.
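The covector-component formula above can be checked numerically. A minimal NumPy sketch, using a hypothetical symmetric, non-degenerate form on ##\mathbb{R}^3## (the real case, so ##L_{ij} = (e_i~|~e_j)## is just a symmetric invertible matrix):

```python
import numpy as np

# Hypothetical example: a symmetric, non-degenerate form on R^3,
# with L[i, j] = (e_i | e_j).
L = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

v = np.array([1.0, -2.0, 0.5])  # components v^i of a vector

# tilde_v_j = sum_i L_ij v^i, i.e. [tilde_v] = [L]^T [v]
tilde_v = L.T @ v

# Check against the defining property tilde_v(e_j) = (v | e_j):
for j in range(3):
    assert np.isclose(tilde_v[j], sum(v[i] * L[i, j] for i in range(3)))
```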
Given that we've decided on the dual basis, we then have $$\tilde{T}_{ij} = \tilde{T}(e_i,e_j) = T(e_i,L(e_j)),$$ $$T_i^{~~j} = T(e_i,e^j) = \tilde{T}(e_i,L^{-1}(e^j)).$$ I assume I'll have to express ##L(e_j)## as a linear combination of the dual basis vectors ##e^k##, and ##L^{-1}(e^j)## as a linear combination of the basis vectors ##e_k##, but I'm at a loss as to how to do that. That's mainly because in the example above I could express the vector/covector components in terms of each other, but there's no indication of how to do the same with the vectors/covectors themselves. Any help would be appreciated.
[/QUOTE]
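For concreteness, here is a numeric sketch of the conversion in the standard dual basis ##\{e^i\}##. It is not from the book; the only ingredient beyond the definitions above is the expansion ##L(e_j) = \sum_k L_{jk}\,e^k##, which is what one gets by evaluating ##L(e_j)## on the basis vectors, since ##L(e_j)(e_k) = (e_j~|~e_k) = L_{jk}##:

```python
import numpy as np

# Hypothetical example: same kind of symmetric, non-degenerate form
# L[j, k] = (e_j | e_k) on R^3 as above. T holds the (1,1) components,
# stored as T[i, k] = T(e_i, e^k) in the standard dual basis {e^i}.
L = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
T = np.arange(9, dtype=float).reshape(3, 3)

# Evaluating L(e_j) on e_k gives L(e_j)(e_k) = (e_j | e_k) = L_jk,
# so L(e_j) = sum_k L_jk e^k, and hence
# tilde_T_ij = T(e_i, L(e_j)) = sum_k T_i^k L_jk, i.e. [tilde_T] = [T][L]^T.
tilde_T = T @ L.T

# Sanity check: precomposing with L^{-1} instead recovers T.
assert np.allclose(tilde_T @ np.linalg.inv(L).T, T)
```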