
Covariant and Contravariant

  1. Jan 1, 2015 #1
    I am reading some notes about tensors and came across the following passage, which the notes did not elaborate on. As a result I don't quite understand it.

    Here it is: "Note that we mark the covariant basis vectors with an upper index and the contravariant basis vectors with a lower index. This may sound counter-intuitive ('did we not decide to use upper indices for contravariant vectors?') but this is precisely what we mean with the 'different meaning of the indices' here: this time they label the vectors and do not denote their components."

    I can follow everything except the last sentence, and I don't know why. Can anyone enlighten me?
     
  3. Jan 1, 2015 #2

    ShayanJ

    Gold Member

    Using the Einstein summation convention, we have [itex] \vec A=A^i e_i=A^1 e_1+A^2 e_2+A^3 e_3 [/itex]. As you know, the components of this vector ([itex]A^i[/itex]) are scalars (they are functions of the spatial coordinates). So what makes [itex] \vec A [/itex] a vector? It's the basis vectors [itex] e_i [/itex]. They really are vectors, but basis vectors, which just means a distinguished set of linearly independent vectors.
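    For concreteness, here's a tiny numpy sketch of that decomposition (the basis and the numbers are made up purely for illustration):

    [code]
    import numpy as np

    # Columns of E are the basis vectors e_1, e_2, e_3 (deliberately not orthonormal)
    E = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 2.0]])

    A = np.array([3.0, 2.0, 4.0])   # the vector A, in standard coordinates

    comps = np.linalg.solve(E, A)   # the scalars A^i such that A = A^i e_i
    print(comps)                    # [1. 2. 2.]
    print(E @ comps)                # reassembles the vector: [3. 2. 4.]
    [/code]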
     
  4. Jan 1, 2015 #3
    Do you mean that the upper indices assigned to contravariant vector components, and the lower indices assigned to contravariant basis vectors, are just a means of distinguishing the two from each other? Sorry if I didn't follow you.
     
  5. Jan 1, 2015 #4

    ShayanJ

    Gold Member

    Yeah, it's basically a convention, so you shouldn't look for deeper reasons here.
    But what you should understand is that [itex] A^i [/itex] is not a vector. It's only a general term referring to one of the components of the vector [itex] \vec A [/itex], and so it's a scalar. It's just that people usually work with the components of vectors.
     
  6. Jan 1, 2015 #5
    Ok thanks a lot!
     
  7. Jan 1, 2015 #6

    stevendaryl

    Staff Emeritus
    Science Advisor

    If you've taken vector calculus, you have probably seen a 2-D vector [itex]\vec{A}[/itex] written in the form [itex]A^x \hat{x} + A^y \hat{y}[/itex]. In that notation, [itex]\hat{x}[/itex] means a "unit vector" in the x-direction, while the coefficient [itex]A^x[/itex] means the component of [itex]\vec{A}[/itex] in that direction. When you get to relativity, the notion of a "unit vector" is no longer well-defined, so the more general notion is a "basis vector". You would write an arbitrary vector [itex]\vec{A}[/itex] in the form [itex]\sum_\mu A^\mu e_\mu[/itex], where the sum ranges over all basis vectors (there are 4 in SR: 3 spatial directions and one time direction). By convention, people leave off the [itex]\sum_\mu[/itex], and it's assumed that an index appearing in both lowered and raised forms is summed over. So people would just write a vector as [itex]A^\mu e_\mu[/itex]

    Now, the components [itex]A^\mu[/itex] are different in different coordinate systems, which is why people say that the vector "transforms" when you change coordinates; but the combination [itex]A^\mu e_\mu[/itex] is actually coordinate-independent. The vector has the same value, as a vector, in every coordinate system. What that means is that if you change coordinates from [itex]x^\mu[/itex] to some new coordinates [itex]x^\alpha[/itex], the value of [itex]\vec{A}[/itex] doesn't change:

    [itex]A^\mu e_\mu = A^\alpha e_\alpha[/itex]

    The components [itex]A^\mu[/itex] change, and the basis vectors [itex]e_\mu[/itex] change, but the combination remains the same.

    We can relate the old and new components through a matrix [itex]L^\alpha_\mu[/itex]:

    [itex]A^\alpha = L^\alpha_\mu A^\mu[/itex]

    If we use this matrix to rewrite [itex]A^\alpha[/itex] in our equation relating the two vectors, we see:

    [itex]A^\mu e_\mu = L^\alpha_\mu A^\mu e_\alpha = A^\mu (L^\alpha_\mu e_\alpha)[/itex]

    Note that since this equation holds for any vector [itex]\vec{A}[/itex], it must mean that

    [itex]e_\mu = L^\alpha_\mu e_\alpha[/itex]

    or if we let [itex](L^{-1})^\mu_\alpha[/itex] be the inverse matrix, we can apply it to both sides to get:

    [itex](L^{-1})^\mu_\alpha e_\mu = e_\alpha[/itex]

    So we have the pair of transformation equations:
    1. [itex]A^\alpha = L^\alpha_\mu A^\mu[/itex]
    2. [itex]e_\alpha = (L^{-1})^\mu_\alpha e_\mu[/itex]
    The basis vectors [itex]e_\mu[/itex] transform in the opposite way from the components [itex]A^\mu[/itex], so that the combination [itex]A^\mu e_\mu[/itex] has the same value in every coordinate system.
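    For what it's worth, here's a minimal numpy sketch of this pair of transformation laws (the basis and the matrix [itex]L[/itex] are random and purely illustrative):

    [code]
    import numpy as np

    rng = np.random.default_rng(0)

    E = rng.normal(size=(4, 4))              # columns = basis vectors e_mu
    L = rng.normal(size=(4, 4))              # change-of-basis matrix L^alpha_mu
    A_old = np.array([1.0, 2.0, 3.0, 4.0])   # components A^mu

    A_new = L @ A_old                        # 1. A^alpha = L^alpha_mu A^mu
    E_new = E @ np.linalg.inv(L)             # 2. e_alpha = (L^{-1})^mu_alpha e_mu

    # The combination A^mu e_mu is the same in both bases:
    print(np.allclose(E @ A_old, E_new @ A_new))   # True
    [/code]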
     
  8. Jan 1, 2015 #7

    stevendaryl

    Staff Emeritus
    Science Advisor

    Just another point: the index on a basis vector [itex]e_\mu[/itex] indicates which basis vector, rather than which component of a vector. But since a basis vector is, after all, a vector, you can actually ask "what are the components of the basis vector [itex]e_\mu[/itex]?" The answer is pretty trivial:

    [itex](e_\mu)^\mu = 1[/itex] (In this case, [itex]\mu[/itex] is NOT summed over)

    All other components are zero. This can be summarized using the delta-notation:

    [itex](e_\mu)^\nu = \delta^\nu_\mu[/itex]
     
  9. Jan 1, 2015 #8

    bcrowell

    Staff Emeritus
    Science Advisor
    Gold Member

    I wouldn't refer to the components of a vector as scalars. I would define a scalar as something that doesn't change under a change of coordinates, i.e., a rank-0 tensor.
     
  10. Jan 1, 2015 #9

    ShayanJ

    Gold Member

    Oh... yeah, sorry. I should have made it clear that I didn't mean the strict meaning of the word.
    So... what should we call them? Just "components of a vector"?

    EDIT: But actually in the context of linear algebra, they are scalars. So we have two conflicting definitions of the word scalar.
     
    Last edited: Jan 1, 2015
  11. Jan 1, 2015 #10

    stevendaryl

    Staff Emeritus
    Science Advisor

    I guess it's a matter of taste, but I don't like that way of describing things. If [itex]\vec{A}[/itex] and [itex]\vec{B}[/itex] are vectors, then wouldn't you say that [itex]\vec{A} \cdot \vec{B}[/itex] is a scalar? But in the special case where [itex]\vec{B}[/itex] is the basis vector [itex]e_\mu[/itex], we have:

    [itex]\vec{A} \cdot e_\mu = A_\mu[/itex]

    So it is simultaneously true that [itex]A_\mu[/itex] is a scalar (it is the result of taking the scalar product of two vectors), and it is also a component of a covector.
     
  12. Jan 1, 2015 #11

    Fredrik

    Staff Emeritus
    Science Advisor
    Gold Member

    The components of a vector ##v## with respect to an ordered basis ##(e_1,\dots,e_n)## are the unique real numbers ##v^1,\dots,v^n## such that ##v=\sum_{i=1}^n v^i e_i##.

    I will elaborate a bit...

    Let ##V## be an n-dimensional vector space over ##\mathbb R##. Let ##V^*## be the set of linear functions from ##V## to ##\mathbb R##. Define addition and scalar multiplication on ##V^*## by ##(f+g)(v)=f(v)+g(v)## and ##(af)(v)=a(f(v))## for all ##v\in V## and all ##a\in\mathbb R##. These definitions turn ##V^*## into a vector space. The ##V^*## defined this way is called the dual space of ##V##.

    Let ##(e_i)_{i=1}^n## be an ordered basis for ##V##. (The notation denotes the n-tuple ##(e_1,\dots,e_n)##). It's conventional to put these indices downstairs, and to put the indices on components of vectors in ##V## upstairs. For example, if ##v\in V##, then we write ##v=v^i e_i##. I'm using the summation convention here, so the right-hand side really means ##\sum_{i=1}^n v^i e_i##.

    For each ##i\in\{1,\dots,n\}##, we define ##e^i\in V^*## by ##e^i(e_j)=\delta^i_j##. It's not hard to show that ##(e^i)_{i=1}^n## is an ordered basis for ##V^*##. The ordered basis ##(e^i)_{i=1}^n## is called the dual basis of ##(e_i)_{i=1}^n##. It's conventional to put the indices on components of vectors in ##V^*## downstairs. For example, if ##f\in V^*##, then we write ##f=f_ie^i##.
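    As an aside: if we represent the ##e_i## as the columns of a matrix, the condition ##e^i(e_j)=\delta^i_j## says that the dual basis covectors are the rows of the inverse matrix. A small numpy sketch (the basis is arbitrary, just for illustration):

    [code]
    import numpy as np

    # Columns of E are the basis vectors e_1, ..., e_n (here n = 2)
    E = np.array([[1.0, 1.0],
                  [0.0, 1.0]])

    # Represent each dual basis element e^i as a row covector acting by matrix
    # multiplication; e^i(e_j) = delta^i_j forces these rows to be the rows of E^{-1}.
    E_dual = np.linalg.inv(E)

    print(E_dual @ E)   # identity matrix, i.e. e^i(e_j) = delta^i_j
    [/code]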

    Exercise: Find an interesting way to rewrite each of the following expressions:

    a) ##e^i(v)##
    b) ##f(e_i)##
     
    Last edited: Jan 1, 2015
  13. Jan 1, 2015 #12

    bcrowell

    Staff Emeritus
    Science Advisor
    Gold Member

    I suppose this would depend on whether you use the convention that a basis vector like [itex]e_\mu[/itex] transforms, or doesn't transform. I would take the Greek index to mean that it's a concrete index rather than an abstract index, and I would then assume that it was to be kept fixed under a change of coordinates. In reality, I think this would usually be clear from context.
     
  14. Jan 1, 2015 #13

    stevendaryl

    Staff Emeritus
    Science Advisor

    The way that I think of things "transforming under coordinate changes" is this:

    Vectors are fixed things (in differential geometry, they can be identified with tangents to parametrized paths). Components of a vector are projections of the vector onto a basis (a set of 4 independent vectors). If I have 4 independent vectors [itex]\vec{A}, \vec{B}, \vec{C}, \vec{D}[/itex], and then I have another vector [itex]\vec{V}[/itex], I can, as Fredrik said, write [itex]\vec{V}[/itex] as a linear combination of my basis: [itex]\vec{V} = V^1 \vec{A} + V^2 \vec{B} + V^3 \vec{C} + V^4 \vec{D}[/itex]. [itex]V^1, ..., V^4[/itex] are just 4 real numbers that happen to express the relationship between [itex]\vec{V}[/itex] and my four basis vectors, [itex]\vec{A}, \vec{B}, \vec{C}, \vec{D}[/itex]. At this point, nothing has been said about a coordinate system. All 5 vectors, [itex]\vec{V},\vec{A}, ..., \vec{D}[/itex], have an identity that is independent of any coordinate system.

    But if I want to use a different set of vectors as my basis, say, [itex]\vec{A'}, \vec{B'}, ..., \vec{D'}[/itex], then I can also write the same vector [itex]\vec{V}[/itex] in terms of this new basis: [itex]\vec{V} = (V^1)' \vec{A'} + ... + (V^4)' \vec{D'}[/itex]. I haven't transformed [itex]\vec{V}[/itex], I've just written it as a different linear combination.
     
  15. Jan 1, 2015 #14

    ShayanJ

    Gold Member

    Here you're talking about linear algebra, and I'm wondering how the two viewpoints can be reconciled!
    Hmm... it seems to me that in linear algebra we never use different coordinates; in fact we never define such things. We just pick different sets of linearly independent vectors as bases. So... yeah, what you're talking about here takes place entirely in the tangent space of a single point. But in differential geometry, where we use different coordinates, we're doing things in a much less local manner than working only at a point.
     
  16. Jan 2, 2015 #15

    pervect

    Staff Emeritus
    Science Advisor

    In differential geometry, we have the tangent and cotangent spaces at a point - but we also usually have some additional structure, such as a connection / fibre bundle that defines a map from the tangent space at one point to the tangent space at another point, given a curve connecting the two points.
     
  17. Jan 2, 2015 #16

    Matterwave

    Science Advisor
    Gold Member

    At any point ##P## on a manifold, the tangent and cotangent spaces are simply linear vector spaces; that's why we "talk about linear algebra" even when we are talking about differential geometry. However, there are "special" sets of basis vectors ##e_i,~~i=1,...,n##, called "coordinate basis vectors", which all satisfy ##[e_i,e_j]=0,~~\forall i,j##. When we transform from one set of coordinate basis vectors to another, the components of vectors or one-forms change in the usual fashion ##A^{i'}=\frac{\partial x^{i'}}{\partial x^j}A^j##.
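    For instance, here's a sympy sketch of that transformation law for plane polar coordinates (the symbols ##A_r## and ##A_\theta## are just placeholder component names):

    [code]
    import sympy as sp

    r, theta = sp.symbols('r theta', positive=True)

    # Cartesian coordinates as functions of the polar coordinates (r, theta)
    x = r * sp.cos(theta)
    y = r * sp.sin(theta)

    # Jacobian matrix dx^{i'}/dx^j of the coordinate change
    J = sp.Matrix([x, y]).jacobian(sp.Matrix([r, theta]))

    # Components of a vector in the polar coordinate basis
    A_r, A_theta = sp.symbols('A_r A_theta')
    A_polar = sp.Matrix([A_r, A_theta])

    # A^{i'} = (dx^{i'}/dx^j) A^j gives the Cartesian components
    print(sp.simplify(J * A_polar))
    [/code]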
     
  18. Jan 2, 2015 #17

    ShayanJ

    Gold Member

    Yeah, I know. But here the question is: can we call the components of a vector scalars?
    As bcrowell said, it seems wrong, because the components change when we do a coordinate transformation and so aren't invariant under coordinate transformations, as scalars should be!
    But as stevendaryl said, [itex] \vec A \cdot \vec B [/itex] is a scalar, and if we put [itex] \vec B=\hat e_i [/itex], we get [itex] \vec A \cdot \hat e_i=A^i [/itex], and it seems the components of vectors actually are scalars.
    (Maybe they actually settled the issue but I didn't understand!!!)
     
  19. Jan 2, 2015 #18

    Matterwave

    Science Advisor
    Gold Member

    This is a matter of terminology. The number ##A^i \equiv \vec{A}\cdot\vec{e_i}## is certainly a scalar field; however, if we view ##A^i## as "the i'th component of the vector ##\vec{A}##", then it is certainly a component of a vector and not a scalar. In other words, it depends on how you want to view the quantity ##A^i##. If you view it as "the i'th component of A in THIS PARTICULAR basis", then it is a scalar; if you view it as "the i'th component of A in SOME basis", then it is not a scalar.

    Perhaps it's easier if we give a concrete example. Say we have a vector ##\vec{A}=(3,2,0)##; then ##A^1=3##. The number 3 is of course a scalar, but ##A^1##, which we use to denote what is in the first slot of ##\vec{A}=(A^1,\,\cdot\,,\,\cdot\,)##, is the component of a vector.
     
  20. Jan 2, 2015 #19

    stevendaryl

    Staff Emeritus
    Science Advisor

    Right. In a coordinate basis, we're picking the basis vectors in a way that relates to the coordinates: [itex]e_\mu[/itex] is the unique vector such that [itex](e_\mu \cdot \nabla) \Phi = \partial_\mu \Phi[/itex] for all scalar fields [itex]\Phi[/itex]. But that's just a particular (very convenient) way of picking a basis. The basis doesn't have to have anything to do with the coordinates. (Of course, to be useful, you need some continuous way to pick a basis at every point.)
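    Here's a sympy sketch checking that defining property for the polar coordinate basis vector [itex]e_\theta[/itex] (the scalar field used here is an arbitrary example):

    [code]
    import sympy as sp

    r, theta, X, Y = sp.symbols('r theta X Y')

    # Cartesian coordinates as functions of the polar coordinates
    x = r * sp.cos(theta)
    y = r * sp.sin(theta)

    # Coordinate basis vector e_theta, written in Cartesian components
    e_theta = sp.Matrix([sp.diff(x, theta), sp.diff(y, theta)])

    # A concrete scalar field Phi(X, Y), chosen arbitrarily for the check
    Phi = X**2 * Y
    grad_Phi = sp.Matrix([sp.diff(Phi, X), sp.diff(Phi, Y)])

    # (e_theta . grad) Phi should equal the partial derivative of Phi w.r.t. theta
    lhs = e_theta.dot(grad_Phi).subs({X: x, Y: y})
    rhs = sp.diff(Phi.subs({X: x, Y: y}), theta)
    print(sp.simplify(lhs - rhs))   # 0
    [/code]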
     
  21. Jan 2, 2015 #20

    stevendaryl

    Staff Emeritus
    Science Advisor

    Yeah, physics discussions (and mathematics discussions often aren't much better) sometimes run into confusion when it's not clear whether someone is talking about a tensor (or matrix, or vector, or whatever), or about a component of a tensor (with arbitrary indices).

    For example, [itex]g_{\mu \nu}[/itex] might mean the metric tensor, or they might mean a particular component of the metric tensor.

    There is a similar ambiguity when people talk about functions: Does [itex]f(x)[/itex] mean a function, or does it mean the value of the function at some point [itex]x[/itex]? Many people try to use different alphabets, or different fonts, or something to distinguish between a variable and a constant with an arbitrary value, so they might write [itex]f(x)[/itex] to mean the function and [itex]f(a)[/itex] to mean the value at point [itex]a[/itex]. But it's hard to be consistent about such conventions, and not everybody uses the same ones.

    You can disambiguate by using lambda notation (or some equivalent "binding" mechanism):

    [itex]\lambda x . f(x)[/itex] means the function, while [itex]f(x)[/itex] means its value at point [itex]x[/itex]. But it's a pain to make everything explicit that way.
     
  22. Jan 2, 2015 #21

    stevendaryl

    Staff Emeritus
    Science Advisor

    The use of lowered indices to indicate basis vectors, such as [itex]e_\mu[/itex], is a little more profound than simply keeping track of indices to apply the Einstein summation convention. It's also the case that under a change of basis, the basis vectors and the components of a single vector transform in opposite ways:

    [itex]A^\alpha = L^\alpha_\mu A^\mu[/itex]
    [itex]e_\alpha = (L^{-1})^\mu_\alpha e_\mu[/itex]

    For a covector [itex]B = B_\alpha e^\alpha[/itex], where [itex]e^\alpha[/itex] is a basis of covectors, it works out the opposite way:

    [itex]B_\alpha = (L^{-1})^\mu_\alpha B_\mu[/itex]
    [itex]e^\alpha = L^\alpha_\mu e^\mu[/itex]


    I'm not sure I know of a pithy way to see that an indexed collection of basis vectors should transform like the components of a covector, and an indexed collection of basis covectors should transform like the components of a vector. It has to work out that way in order for the Einstein summation convention to produce an object that is basis-independent, but I don't know a satisfying explanation for why it should work that way.
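    Still, as a sanity check, here's a small numpy sketch (random invertible [itex]L[/itex], arbitrary components) verifying that the contraction [itex]B_\mu A^\mu[/itex] does come out basis-independent under this pair of rules:

    [code]
    import numpy as np

    rng = np.random.default_rng(1)
    L = rng.normal(size=(4, 4))          # change-of-basis matrix L^alpha_mu

    A = np.array([1.0, 2.0, 3.0, 4.0])   # vector components A^mu
    B = np.array([0.5, -1.0, 2.0, 0.0])  # covector components B_mu

    A_new = L @ A                        # A^alpha = L^alpha_mu A^mu
    B_new = np.linalg.inv(L).T @ B       # B_alpha = (L^{-1})^mu_alpha B_mu

    print(np.isclose(B @ A, B_new @ A_new))   # True
    [/code]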
     
  23. Jan 2, 2015 #22
    Schutz has a pretty good explanation of this.
    You just have to postulate that the basis vectors transform linearly, and then you can show that the components transform via the inverse matrix in order to preserve the vector/covector.
    I guess it all comes from the fact that the vectors/covectors are geometrical objects? Not sure if everyone would find that a satisfying motivation.
     
  24. Jan 2, 2015 #23

    Fredrik

    Staff Emeritus
    Science Advisor
    Gold Member

    It's not a postulate. If ##(e_i)_{i=1}^n## and ##(e_i')_{i=1}^n## are ordered bases for ##V##, then for all ##i##, there must exist numbers ##M^j_i## such that ##e_i'=M^j_i e_j##. (In other words, we can always write the new basis vectors as linear combinations of the old.)
     
  25. Jan 2, 2015 #24

    Fredrik

    Staff Emeritus
    Science Advisor
    Gold Member

    I will elaborate on what I said in post #23 here, in order to answer the question of how things transform under a change of ordered basis. Some of this was already worked out by stevendaryl in post #6. ##V## denotes an arbitrary n-dimensional vector space over ##\mathbb R##. ##V^*## denotes its dual space. (See post #11 if that term is unfamiliar).

    Now let ##M## be the matrix such that for all ##i,j##, the component on row ##i##, column ##j## is ##M^i_j##. Recall that the definition of matrix multiplication is ##(AB)^i_j=A^i_k B^k_j##. Let ##v\in V## be arbitrary. We have
    $$v=v^j e_j=v^i{}' e_i{}' =v^i{}' M^j_i e_j,$$ and therefore ##v^j=v^i{}' M^j_i##. This implies that
    $$(M^{-1})^k_j v^j =v^i{}' (M^{-1})^k_j M^j_i =v^i{}' (M^{-1}M)^k_i =v^i{}' \delta^k_i =v^k{}'.$$ So the n-tuple of components ##(v^1,\dots,v^n)## transforms according to
    $$v^i{}'= (M^{-1})^i_j v^j.$$ The fact that the matrix that appears here is ##M^{-1}## rather than ##M## is the reason why an n-tuple of components of an element of ##V## is said to transform contravariantly. The terms "covariant" and "contravariant" should be interpreted respectively as "transforming the same way as the ordered basis" and "transforming the opposite way from the ordered basis".

    It's easy to see that the dual basis transforms contravariantly. Let ##N## be the matrix such that ##e^i{}' =N^i_j e^j##. We have
    $$\delta^i_j =e^i{}'(e_j{}')=N^i_k e^k (M^l_j e_l) = N^i_k M^l_j e^k(e_l) =N^i_k M^l_j \delta^k_l =N^i_k M^k_j =(NM)^i_j.$$ This implies that ##N=M^{-1}##. So we have
    $$e^i{}' =(M^{-1})^i_j e^j.$$ Now we can easily see that an n-tuple of components of an arbitrary ##f\in V^*## transforms covariantly. We can prove it in a way that's very similar to how we determined the transformation properties of the n-tuple of components of ##v##, but the simplest way is to use the formula ##f_i=f(e_i)##, which I left as an easy exercise in post #11.
    $$f_i{}' =f(e_i{}')=f(M^j_i e_j) =M^j_i f(e_j)= M^j_i f_j.$$ Note that what's "transforming" under a change of ordered basis in these examples are n-tuples of real numbers or n-tuples of vectors (in ##V## or ##V^*##). In the case of a tensor of type ##(k,l)##, what's transforming isn't the tensor, but its ##n^{k+l}##-tuple of components with respect to the ordered basis ##(e_i)_{i=1}^n##.
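    Here's a small numpy sketch of these transformation laws (random ##M## and random components, purely illustrative):

    [code]
    import numpy as np

    rng = np.random.default_rng(2)
    M = rng.normal(size=(3, 3))    # change of basis: e_i' = M^j_i e_j
    Minv = np.linalg.inv(M)

    v = rng.normal(size=3)         # old components v^j of some v in V
    v_new = Minv @ v               # v^i' = (M^{-1})^i_j v^j  (contravariant)

    f = rng.normal(size=3)         # old components f_j of some f in V*
    f_new = M.T @ f                # f_i' = M^j_i f_j  (covariant)

    # The pairing f(v) = f_j v^j is independent of the ordered basis:
    print(np.isclose(f @ v, f_new @ v_new))   # True
    [/code]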

    Of course, one can take the point of view that these ##n##-tuples or ##n^{k+l}##-tuples are the tensors, or rather, that the function that associates tuples with ordered bases is what should be called a tensor. I'm not a fan of that view myself. I consider it inferior and obsolete. However, there isn't anything fundamentally wrong with it. The real problem is that it's so hard to find an explanation of this view that isn't unbelievably bad.
     
    Last edited: Jan 2, 2015
  26. Jan 2, 2015 #25
    I worded that badly.
     