
Linear transformations as tensor.

  1. Feb 26, 2012 #1
    I was looking at this table here: http://en.wikipedia.org/wiki/Tensor#Examples

    And I didn't understand why a (1,1) tensor is a linear transformation. I was wondering if someone could explain why this is.

    A (1,1) tensor takes a vector and a one-form to a scalar.
    But a linear transformation takes a vector to a vector.

    Thanks.
     
  3. Feb 26, 2012 #2

    dx


    A bilinear form can take a vector to a vector or a covector to a covector:

    A^u_v ζ^v = γ^u
    A^u_v ω_u = κ_v
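Not part of the thread, but the two contractions above are easy to check numerically. A minimal NumPy sketch, with a made-up 2×2 tensor A (rows = upper index u, columns = lower index v) and made-up components for ζ and ω:

```python
import numpy as np

# Hypothetical (1,1) tensor A^u_v as a matrix: row index u (upper), column index v (lower).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

zeta = np.array([1.0, 2.0])   # vector zeta^v
omega = np.array([4.0, 5.0])  # covector omega_u

gamma = A @ zeta           # gamma^u = A^u_v zeta^v : vector in, vector out
kappa = omega @ A          # kappa_v = A^u_v omega_u : covector in, covector out
scalar = omega @ A @ zeta  # full contraction A^u_v omega_u zeta^v : a scalar
```

Contracting one slot leaves one free index (a vector or covector); contracting both leaves none (a scalar), which is the (1,1)-tensor-as-multilinear-map picture from the original question.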
     
  4. Feb 26, 2012 #3
    I thought a bilinear form was the tensor product of 2 one-forms/linear functionals, so it would take two vectors to a scalar.
     
  5. Feb 26, 2012 #4

    Office_Shredder


    If your form is T(u,v), where u is a vector and v is a covector, consider the function f(u) = T(u,-). Then f(u) is a function which, when you plug in a covector, gives you a scalar. Those are exactly what vectors are (co-covectors, i.e. elements of the double dual V**, which in finite dimensions is naturally isomorphic to V).
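The currying step above can be sketched in NumPy. Everything here is hypothetical illustration: a sample matrix A is used to build a bilinear T, and f(u) = T(u,-) is realized as a Python closure; feeding it the dual basis covectors recovers the vector it "is":

```python
import numpy as np

# Hypothetical (1,1) tensor T(u, omega) = omega(A u), built from a sample matrix A.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

def T(u, omega):
    """Bilinear: a vector u and a covector omega go to a scalar."""
    return omega @ (A @ u)

def f(u):
    """Curry the vector slot: f(u) eats a covector and returns a scalar."""
    return lambda omega: T(u, omega)

u = np.array([1.0, 0.0])
# f(u) is a linear functional on covectors, i.e. an element of V**.
# In finite dimensions that is "the same as" a vector, which we can read off
# by evaluating f(u) on the dual basis covectors:
recovered = np.array([f(u)(e) for e in np.eye(2)])
```

The recovered components coincide with A @ u, so currying one slot of the bilinear form really does turn it into the linear map u ↦ A u.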
     
  6. Feb 28, 2012 #5

    mathwonk


    Assume all spaces are finite dimensional.

    By plugging in one variable at a time, you see that bilinear maps from V×W to U are the same as linear maps from V to linear maps from W to U.

    I.e. Bil(V×W, U) ≈ Lin(V, Lin(W, U)).

    Write V* for the space of linear maps from V to k, where k is the scalar field.
    For finite dimensional spaces, Lin(Lin(V,k), k) = V** ≈ V.

    Essentially by definition of tensors, V*⊗W* ≈ (V⊗W)* ≈ Bil(V×W, k).

    Thus V*⊗W ≈ Bil(V×W*, k) ≈ Lin(V, W**) ≈ Lin(V, W).

    I.e. if f⊗w is an elementary (1,1) tensor in V*⊗W,

    then it maps the vector v in V to the vector

    f(v)·w in W. A general (1,1) tensor is a sum of such elementary ones and maps v to the corresponding sum of vectors in W.
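The elementary-tensor action can be made concrete. A hedged NumPy sketch with made-up components: f⊗w acts on v as f(v)·w, and as a matrix it is the rank-one outer product of w with f, so sums of elementary tensors correspond to sums of rank-one matrices, i.e. arbitrary linear maps:

```python
import numpy as np

# Hypothetical components: a covector f in V* and a vector w in W.
f_cov = np.array([1.0, 2.0])
w = np.array([3.0, 0.0])

def elementary(v):
    """Apply the elementary tensor f⊗w as a linear map: v -> f(v)·w."""
    return (f_cov @ v) * w

# The same map written as a matrix is the outer product of w and f;
# a general linear map is a sum of such rank-one pieces.
M = np.outer(w, f_cov)
v = np.array([2.0, 1.0])
```

Applying `elementary` and multiplying by `M` give the same vector, matching the isomorphism V*⊗W ≈ Lin(V, W) above.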
     