Confusion about basis vectors and the metric tensor

  1. Apr 23, 2014 #1
    In "A Student's Guide to Vectors and Tensors" by Daniel Fleisch, I read that the covariant metric tensor gij = ei°ej (I'm leaving out the →s above the e's), where ei and ej are coordinate basis vectors and ° denotes the inner product, and similarly for the contravariant metric tensor using dual basis vectors. But I thought the definition of basis vectors required that the inner product of two distinct ones of the same type be zero, and similarly for dual basis vectors. (For example, the standard basis vectors of Rn, with the inner product being the dot product.) Where is my thinking wrong?
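    A quick numerical sketch of the definition gij = ei°ej, using an arbitrary non-orthogonal basis of R2 chosen here for illustration (it is not an example from the book): the off-diagonal entries of the metric are exactly the dot products between distinct basis vectors, so they vanish only when the basis happens to be orthogonal.

    ```python
    import numpy as np

    # Basis vectors of R^2 stored as the rows of E
    # (an illustrative non-orthogonal choice, not from the book).
    E = np.array([[2.0, 0.0],   # e_1
                  [1.0, 1.0]])  # e_2

    # Covariant metric: g[i, j] = e_i . e_j
    g = E @ E.T
    print(g)
    # [[4. 2.]
    #  [2. 2.]]
    # The off-diagonal 2 = e_1 . e_2 is nonzero because the basis
    # is not orthogonal.
    ```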
  3. Apr 23, 2014 #2
    All you need for a basis is linear independence. If you also have orthogonality -- i.e., if [itex]e_i \cdot e_j = 0[/itex] when [itex]i \ne j[/itex] -- that can be convenient, but it's not necessary.

    The dual basis is the tool we use to get this same convenience (zero dot products) with an arbitrary basis. The main relation here is:
    [tex]e_i \cdot e^j = \delta_i^j[/tex]
    where [itex]\delta_i^j[/itex] is the Kronecker delta symbol (1 if [itex]i=j[/itex]; 0 otherwise).
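    A minimal numerical sketch of this relation: if the basis vectors are stored as the rows of a matrix E, then the dual basis vectors are the rows of inv(E) transposed, since the condition e_i · e^j = δ_i^j is just E times the transpose of the dual matrix equalling the identity. (The particular basis below is an arbitrary assumption for illustration.)

    ```python
    import numpy as np

    # Basis vectors of R^2 as rows (an arbitrary non-orthogonal example).
    E = np.array([[2.0, 0.0],   # e_1
                  [1.0, 1.0]])  # e_2

    # Dual basis: rows of D must satisfy e_i . e^j = delta_i^j,
    # i.e. E @ D.T = I, hence D = inv(E).T.
    D = np.linalg.inv(E).T

    print(E @ D.T)
    # [[1. 0.]
    #  [0. 1.]]   <- the Kronecker delta
    ```

    Note that with an orthonormal basis, inv(E).T equals E itself, which is why the dual basis is invisible in the familiar Rn setting.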
  4. Apr 24, 2014 #3
    Ah, thanks. It did not occur to me that, although orthogonality implies linear independence, the converse is not true. This spurred me to try some examples, and I see that one can make a basis of linearly independent vectors {(1,1), (1,0)} that spans R2 even though they are not orthogonal, having a dot product of 1. OK, case closed. Thanks again.
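    The example basis from the post above can be checked in a couple of lines: linear independence shows up as a nonzero determinant of the matrix whose rows are the vectors, while orthogonality fails because the dot product is nonzero.

    ```python
    import numpy as np

    # The example basis: linearly independent but not orthogonal.
    e1 = np.array([1.0, 1.0])
    e2 = np.array([1.0, 0.0])

    # Nonzero determinant => the vectors are independent and span R^2.
    det = np.linalg.det(np.array([e1, e2]))
    print(det)      # -1.0

    # Nonzero dot product => they are not orthogonal.
    print(e1 @ e2)  # 1.0
    ```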