Is a vector itself contra/covariant or just its components?

In summary, the conversation discusses whether contra/covariance is a property of a vector itself or of its components with respect to a basis. One side argues that the dual space, consisting of linear functionals on the original vector space, is essential to the distinction. The original poster is dismissive of this idea and prefers to define reciprocal bases within the same vector space. Other posters give examples of contexts, such as quantum mechanics and general relativity, where the dual space is genuinely needed.
  • #1
Hiero
It seems most people say that a vector is either contravariant or covariant. To me it seems like contra/covariance is a property of the components of a vector (with respect to some basis) and not of the vector itself.

Any basis ##\{b_i\}## has a reciprocal basis ##\{b^i\}## and any vector can be expressed with respect to either basis, ##v = v^i b_i = v_i b^i## (sums on ##i##).

So then it seems a vector is just a vector, and it’s the components which are either covariant or contravariant (with respect to the original basis).
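To make this concrete, here’s a quick numerical check of what I mean (a minimal numpy sketch of my own, assuming the standard dot product on ##\mathbb{R}^2##):

```python
import numpy as np

# A (non-orthogonal) basis of R^2, stored as the columns of B.
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])

# Reciprocal basis: the columns of R satisfy b^i . b_j = delta^i_j,
# i.e. R^T B = I, so R = inv(B)^T. It lives in the same R^2.
R = np.linalg.inv(B).T

v = np.array([3.0, 1.0])

v_contra = R.T @ v   # contravariant components v^i = b^i . v
v_co     = B.T @ v   # covariant components     v_i = b_i . v

# The same vector v, expanded in either basis:
assert np.allclose(B @ v_contra, v)   # v = v^i b_i
assert np.allclose(R @ v_co, v)       # v = v_i b^i
```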

One argument I heard is that the covariant vector somehow lives in some other (“dual”) space. This seems silly to me because I see no reason why the reciprocal basis can’t be taken to span the exact same vector space as the original basis.

What are your thoughts? Is there a good reason why I should consider contra/covariance to be a property of the vector itself?
 
  • #2
Hiero said:
One argument I heard is that the covariant vector somehow lives in some other (“dual”) space. This seems silly to me because I see no reason why the reciprocal basis can’t be taken to span the exact same vector space as the original basis.

Vectors and covectors do occupy completely different vector spaces, so the dual basis cannot span ##V##, and vice versa. I think, formally, it's incorrect to say something like ##v = v^i b_i = v_i b^i##, because like you mentioned ##v^i b_i \in V## whilst ##v_i b^i \in V^*##. However, I suspect what's really going on is that we're using some isomorphism ##V \cong V^*## to identify each vector ##\alpha \in V## with its dual ##\beta \in V^*##.
 
  • Like
Likes PeroK
  • #3
In the finite-dimensional case there are a lot of isomorphisms between ##V## and ##V^*## and none of them is distinguished. If the space ##V## has an inner product, then there is a canonical isomorphism.
 
  • Like
Likes lurflurf and etotheipi
  • #4
@etotheipi @wrobel

But why should I view the reciprocal basis as occupying a different space?

What is wrong with saying the reciprocal basis is defined by ##b_i \cdot b^j = \delta_i^j##? What’s the point of introducing a dual space?
 
  • #5
The space ##V^*## is the space of linear functionals ##\omega: V \rightarrow \mathbb{F}## on ##V##. To find the basis ##\beta^* = \{\omega^1, \omega^2, \dots, \omega^n \}## of ##V^*## dual to the basis ##\beta = \{ v_1, v_2, \dots, v_n \}## of ##V##, we assert that ##\omega^i (v_j) = \delta^i_j##.

You can, for instance, construct an isomorphism so that for each vector ##u \in V## there is an associated functional ##\omega_u(\cdot) \in V^*## such that ##\omega_u(v) = \langle u, v \rangle## for an arbitrary ##v \in V##. That means you can now identify each dual basis vector with its counterpart in ##V##.
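In finite dimensions you can see both constructions numerically; a minimal sketch of my own, representing functionals as row vectors acting by matrix multiplication:

```python
import numpy as np

# Basis beta of V = R^2, stored as the columns of B.
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])

# Represent a functional omega: V -> R as a row vector acting by
# matrix product. The dual basis is then the rows of inv(B), since
# inv(B) @ B = I encodes omega^i(v_j) = delta^i_j.
dual = np.linalg.inv(B)
assert np.allclose(dual @ B, np.eye(2))

# The inner-product-induced map u -> omega_u with omega_u(v) = <u, v>:
def omega(u):
    return lambda v: float(np.dot(u, v))

u = np.array([1.0, 2.0])
v = np.array([0.5, -1.0])
assert np.isclose(omega(u)(v), np.dot(u, v))
```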
 
Last edited by a moderator:
  • #6
Hiero said:
@etotheipi @wrobel

But why should I view the reciprocal basis as occupying a different space?

What is wrong with saying the reciprocal basis is defined by ##b_i \cdot b^j = \delta_i^j##? What’s the point of introducing a dual space?
You can probably go a long way in physics without worrying about the dual space as being anything other than the original space in a different basis. In general, however, for infinite-dimensional vector spaces there is not necessarily an isomorphism between the two, in which case the dual space is not the same as the original vector space.
 
  • Like
Likes Hiero and etotheipi
  • #7
Thanks for your time,

I still don’t see a reason to introduce the dual space of linear functionals. I suppose I’m hard-headed, but I don’t like to entertain formalities which seem unnecessary. I will continue to disregard it in favor of my definition of reciprocal bases over the same vector space. It is not hard to show that it always exists (there is a general way to construct it) for finite dimensional spaces. Of course, infinite dimensional vector spaces are more nuanced and I should not assume (my meaning of) reciprocal bases exist in those cases. Maybe one day I will understand the purpose of dual spaces but until then I’m dismissing it.

Take care.
 
  • Sad
Likes weirdoguy and PeroK
  • #8
@Hiero I don't know very much about it either, but you'll eventually need to worry about them when it comes to QM and GR. To mention a few examples, in QM the bras ##\langle \alpha| \in \mathcal{H}^*## are linear functionals on ##\mathcal{H}##, and there is a dual correspondence via the Riesz representation theorem, i.e. ##\langle \alpha | \overset{DC}{\leftrightarrow} | \alpha \rangle##, between the bras and kets, in the same way as described above. When you write ##\langle \alpha | \beta \rangle##, that's the action of a functional on a ket which, due to how the isomorphism is constructed, equals the inner product ##\langle \alpha, \beta \rangle## where both ##\alpha, \beta \in \mathcal{H}##.
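A toy finite-dimensional version of this (my own numpy sketch, with ##\mathbb{C}^2## standing in for ##\mathcal{H}##):

```python
import numpy as np

# Kets as column vectors in C^2; the bra <alpha| is the conjugate transpose.
alpha = np.array([[1.0 + 1.0j], [2.0 + 0.0j]])
beta  = np.array([[0.5 + 0.0j], [1.0 - 1.0j]])

bra_alpha = alpha.conj().T   # the functional dual to |alpha> via Riesz

# The functional acting on the ket agrees with the inner product <alpha, beta>.
assert np.isclose((bra_alpha @ beta).item(), np.vdot(alpha, beta))
```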

I know even less about GR, but I'm aware it's pretty important to distinguish between the vector and covector arguments of a tensor, a tensor ##T## being a multilinear map ##T: \underbrace{T_p^* \times \dots}_{m \text{ times}} \times \underbrace{T_p \times \dots}_{n \text{ times}} \rightarrow \mathbb{R}##, where the tensor basis consists of the possible outer products of basis vectors from both spaces [i.e. we can think of the ##e_i## as linear functionals on ##V^*##, since there's a natural map from ##V^{**}## to ##V##!], so the tensor can be expanded as ##T = {T_{i_1, \dots, i_n}}^{j_1, \dots, j_m}\, e^{i_1} \otimes \dots \otimes e^{i_n} \otimes e_{j_1} \otimes \dots \otimes e_{j_m}##. There's a much better explanation in Thorne/Blandford.
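A toy finite-dimensional check of the multilinear-map picture (again my own sketch, not from Thorne/Blandford, using a ##(1,1)## tensor on ##\mathbb{R}^3##):

```python
import numpy as np

# A (1,1) tensor on R^3 given by its components T[j, i] = T^j_i.
T = np.arange(9.0).reshape(3, 3)

omega = np.array([1.0, 0.0, 2.0])   # covector components omega_j
v     = np.array([0.5, -1.0, 1.0])  # vector components  v^i

# Multilinear evaluation T(omega, v) = omega_j T^j_i v^i.
value = np.einsum('j,ji,i->', omega, T, v)
assert np.isclose(value, omega @ T @ v)
```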

Point is that the dual space becomes pretty important!
 
Last edited by a moderator:
  • Like
Likes Hiero
  • #9
@etotheipi

Thanks for the context. I’ll have to get familiar with such ideas soon enough. Nonetheless, just because something is standard does not mean it is necessary! General Relativity can be done without tensors and quantum mechanics can be done without Hilbert spaces (no link for that claim because there are various approaches). Of course, I do not understand these approaches, so I’m not sure of their merit, if any.

Anyway, like I said, I’m just being hard-headed. I’m not fond of the standard formalisms but I’m sure I’ll eventually come around to respect and appreciate them. Most popular ideas are popular for good reason. I just have to see why for myself before I agree o0)

Your pup looks like (one of) mine :)

Take care, my friend.
 
  • Sad
Likes weirdoguy and PeroK
  • #10
[attached image]
 
  • #11
Hiero said:
But why should I view the reciprocal basis as occupying a different space?

Because there is no canonical isomorphism between ##V## and ##V^*##, and that means that there is no way to unambiguously identify elements of those spaces. Ergo these spaces are different. There are isomorphisms that are non-canonical, i.e. basis dependent, but these are useless, as you will see soon. If you identify ##a## with ##b## and change basis, then this identification is lost. In the new basis ##a## is identified with ##c##, which in general is different from ##b##. Do you really think that's the easy way of doing things?
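A quick numeric illustration of how such a basis-dependent identification falls apart (a minimal numpy sketch of my own; the arrays hold components with respect to the indicated bases):

```python
import numpy as np

# Naive identification: pair a vector with the covector that has the
# SAME components in the current basis, then change basis.
P = np.array([[2.0, 0.0],
              [1.0, 1.0]])   # change-of-basis matrix, new basis = old @ P

c = np.array([1.0, 3.0])     # shared components in the old basis

c_vec_new = np.linalg.inv(P) @ c   # vector components transform with P^-1
c_cov_new = P.T @ c                # covector components transform with P^T

# The identification survives only if these agree, i.e. only for
# orthogonal P (P^-1 = P^T); for a generic P it is lost.
print(np.allclose(c_vec_new, c_cov_new))   # -> False
```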

Hiero said:
and any vector can be expressed with respect to either basis, ##v = v^i b_i = v_i b^i## (sums on i).

No it can't.

Hiero said:
What is wrong with saying the reciprocal basis is defined by ##b_i\cdot b^j = \delta _i^j##?

And what does that dot even mean? What does this definition mean? ##b^i## is a function, and if you want to treat ##V## as ##V^*## then ##b_i## is also a function. How does it act on general elements of ##V## or ##V^*##?

Hiero said:
This seems silly to me because I see no reason why the reciprocal basis can’t be taken to span the exact same vector space as the original basis.

Then show us how you write out ##b^i## as a linear combination of ##b_j## and vice versa.

Hiero said:
I will continue to disregard it in favor of my definition of reciprocal bases over the same vector space.

You didn't show that your definition is correct for a general vector space. You didn't even explain your definition, you just showed a string of symbols. That is not a way to do maths, sorry.

Hiero said:
Nonetheless, just because something is standard does not mean it is necessary!

Just because you don't understand something does not mean it's unnecessary. You are just making your own maths up. And PF is not a good place for personal theories.
 
Last edited:
  • Like
Likes etotheipi
  • #12
PeroK said:
You can probably go a long way in physics without worrying about the dual space as being anything other than the original space in a different basis.

Yes, but that's because when it comes to general physics, physicists use only Euclidean vector and affine spaces, which by definition (well, at least by the definitions in my books) are spaces with an inner product defined on them. And the inner product makes a hell of a lot of difference because, as @wrobel said, it gives us a canonical isomorphism between a vector space and its dual. Euclidean spaces are evil o0) They have so many algebraic and geometric structures on them that when people try to do more abstract maths they get confused, like in this example.
 
Last edited:
  • #13
weirdoguy said:
Because there is no canonical isomorphism between ##V## and ##V^*## and that means that there is no way to unambiguously identify elements of those spaces. [...]
Why so hostile?

To you ##b^i## is a function, but I’m not defining it like that. I’m defining ##b^i## as a vector in ##V## according to ##b_j \cdot b^i = \delta_j^i## (1 if ##i=j##, otherwise 0) ... why is that hard to understand? It’s simply another basis for ##V## which satisfies that property (and frankly it’s a useful basis when considering curvilinear coordinates). Yes, it always exists. If you really want an explicit construction:
$$b^k=(-1)^{k-1}(b_1 \wedge\dots \wedge b_{k-1} \wedge b_{k+1}\wedge \dots \wedge b_n)\frac{(b_n \wedge\dots\wedge b_1)}{|b_1 \wedge\dots\wedge b_n|^2}$$
But that’s beside the point; I don’t want to get into the definitions of the multiplication(s) and norm used above. (The formalism is called “Geometric Algebra” if you want to see the definitions, but it’s not conceptually necessary; I just use it for the explicit construction.)
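For what it’s worth, in ##\mathbb{R}^3## the construction reduces to the familiar cross-product formula, which is easy to check numerically (a minimal numpy sketch of my own, assuming the standard dot product):

```python
import numpy as np

# In R^3: b^1 = (b2 x b3)/V, b^2 = (b3 x b1)/V, b^3 = (b1 x b2)/V,
# with V = b1 . (b2 x b3) the (signed) volume of the cell.
b1 = np.array([1.0, 0.0, 0.0])
b2 = np.array([1.0, 1.0, 0.0])
b3 = np.array([0.0, 0.0, 2.0])
V = b1 @ np.cross(b2, b3)

R = np.array([np.cross(b2, b3), np.cross(b3, b1), np.cross(b1, b2)]) / V
B = np.array([b1, b2, b3])

assert np.allclose(R @ B.T, np.eye(3))   # b^i . b_j = delta^i_j
```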

I don’t see how using the phrase ‘reciprocal basis’ to refer to a different basis of the same space is somehow “making my own maths up.” Maybe I should call it something else, like the ‘complementary basis’ or something. Does that make you feel better?

I just don’t see any point in using the function definition of b^i, and so I won’t. Notation is just notation; I can use it however I like. Sorry if that bothers you, but that IS math; notation is context dependent.

You could’ve just explained why dual spaces are an important concept (preferably without all the implicit disrespect).
 
  • #14
Hiero said:
To you b^i is a function but I’m not defining it like that.

But you are asking about covariant and contravariant vectors, which by definition belong to different spaces. The dual space by definition is a space of functions. So in the context of your question ##b^i## is an element of ##V^*## and by definition it's a function.

Hiero said:
I’m defining ##b^i## as a vector in ##V## according to ##b_j \cdot b^i = \delta^i_j## ... why is that hard to understand?

But by writing that out you are using additional structure! You are using an inner product. There is no inner product, or any other product, in the general setting. So you can't define ##b^i## that way. Covariant vectors by definition lie in ##V^*## and contravariant ones in ##V##. You said:

Hiero said:
This seems silly to me because I see no reason why the reciprocal basis can’t be taken to span the exact same vector space as the original basis.

and you "see no reason" because you implicitly assume that you are working with vector spaces that have additional structures. This is not the case in the general context of covariant and contravariant distinction.

Hiero said:
Yes it always exists.

It does not if you don't have any additional structures, and that is the setting in which co- and contravariant vectors are defined.

Hiero said:
I don’t want to get into the definitions of multiplication(s) and norm used above.

But you have no norms and multiplications to begin with!

Hiero said:
I don’t see how using the phrase ‘reciprocal basis’ to refer to a different basis of the same space is somehow “making my own maths up.”

You missed the point. You said that talking about dual spaces is silly because you see no reason why the reciprocal basis can’t be taken to span the exact same vector space. You got answers, and yet you decided to dismiss them and stay with your approach, which is not wrong in itself but totally misses the point of this whole thread. And saying that dual spaces are silly when you don't know the whole picture is, well... a little bit arrogant?

Hiero said:
I just don’t see any point in using the function definition of b^i, and so I won’t

So to sum up: you think in terms of vector spaces with additional structure, which makes it hard to really grasp the need for dual spaces. Once you learn more linear algebra you will see that the definition of covariant and contravariant vectors relies on vector spaces without any additional structure, so your ##b^i## cannot be unambiguously identified with anything that lies in ##V##, and you will be forced to think about ##V^*## as something different from ##V##.
 
Last edited:
  • Like
Likes etotheipi
  • #15
@weirdoguy

I am aware I am using the extra structure of an inner-product space. Maybe I should have given some context: I am in an intro math-methods-for-physics course where we take the inner product for granted and never mention dual spaces (not even in the book). Therefore I found it bizarre that they insisted on referring to contra/covariance as a property of a vector itself, when it seems more natural to conceive of it as a property of components. I think the course/book does a poor job of presenting the fundamental ideas behind contra/covariance.

I appreciate you emphasizing that contra/covariance can be defined without additional structure. I still find the conception of reciprocal bases as living in the same (inner product) space to be a simpler way of handling metric-based problems (namely curvilinear coordinate systems). I will however keep in mind that we can still take a dual basis (of the dual space) even on a 'bare' vector space.

Take care.
 
  • #16
Hiero said:
I still find the conception of reciprocal bases as living in the same (inner product) space to be a simpler way of handling metric based problems (namely curvilinear coordinate systems)

Unfortunately there is no wiggle room, since the reciprocal basis spans the dual space ##V^*##. Reciprocal basis does not span ##V##, since in order to span a space the elements must be members of that space in the first place 😜

By the way, @vanhees71 recently posted a very nice explanation of some of the points discussed above here:
https://www.physicsforums.com/threads/dual-vector.994347/post-6399833
 
  • #17
etotheipi said:
Unfortunately there is no wiggle room, since the reciprocal basis spans the dual space ##V^*##. Reciprocal basis does not span ##V##, since in order to span a space the elements must be members of that space in the first place 😜
We’re just using language in different ways.

What you call the ‘reciprocal basis‘ I call the ‘dual basis.’

What I called ‘reciprocal basis’ is the (inner-product dependent) basis over V.

(In fact, I found other sources online which use the phrase ‘reciprocal basis‘ in the exact same sense as me. I think ‘dual basis’ is the more common way to refer to the basis in dual space.)

I’ll read Vanhees post later, thanks.

Take care.
 
  • #18
So long as you realize that what you're thinking about is a set of vectors in ##V## which correspond to a dual basis ##\{\omega^i \}## of ##V^*##, via the isomorphism ##V \rightarrow V^*## which maps ##u \mapsto f_u## s.t. ##f_u(v) = \langle u, v \rangle \; \forall v \in V##. You can define this so long as ##V## is a finite-dimensional inner-product space.

I would, however, disagree with your naming. AFAIK 'dual basis' and 'reciprocal basis' are synonymous, and denote a basis of ##V^*##. As such, you should not use this terminology to denote the set of vectors in ##V## which correspond to the ##\{\omega^i \}## basis of ##V^*##.
 
  • #19
etotheipi said:
So long as you realize that what you're thinking about is a set of vectors in ##V## which are isomorphic to a dual basis ##\{\omega^i \}## of ##V^*##, via. the isomorphism from ##V \rightarrow V^*## which maps ##u \rightarrow f_u## s.t. ##f_u(v) = \langle u, v \rangle \, \forall v \in V##. You can define this so long as ##V## is a finite inner-product space.
That’s one way to look at it. But that’s not the only way to look at it. The intermediate step is unnecessary if we already have an inner product. The reciprocal basis (on V) can be defined via the inner product with no reference to V* at all.

etotheipi said:
I would, however, disagree with your naming. AFAIK 'dual basis' and 'reciprocal basis' are synonymous, and denote a basis of ##V^*##.
No fair, why do you get two names and I get zero! Lol, I don’t know what’s standard, but I am just saying, I am not the only one to refer to the alternate ##V## basis as the ‘reciprocal basis.’ (I mean, it deserves some kind of name! Or maybe I’ll call it the nameless basis.)

Vanhees calls the dual basis the ‘co-basis’ so actually it has three names! Save some names for the rest of the class!
 
  • #20
Hiero said:
The reciprocal basis (on V) can be defined via the inner product with no reference to V* at all.

No! The dual/reciprocal basis spans ##V^*##, not ##V##! What you are talking about is not the dual/reciprocal basis, so please do not call it that!

Hiero said:
No fair, why do you get two names and I get zero! I don’t know what’s standard but I am just saying, I am not the only one to refer to the alternate V basis as the ‘reciprocal basis.’ (I mean it deserves some kind of name! Or maybe I‘ll call it the nameless basis.)

Vanhees calls it the co-basis so actually it has three names! Save some names for the rest of the class!

Dual/reciprocal/co-basis are all synonyms in my book!
 
  • Like
Likes weirdoguy
  • #21
etotheipi said:
No! The dual/reciprocal basis spans ##V^*##, not ##V##! What you are talking about is not the dual/reciprocal basis, so please do not call it that!
Whatever. My point still stands: the NAMELESS BASIS can be defined without reference to V* if V is an inner product space.
 
  • Sad
Likes weirdoguy
  • #22
etotheipi said:
AFAIK 'dual basis' and 'reciprocal basis' are synonymous, and denote a basis of ##V^*##. [...]
I'd also not call it "reciprocal basis", because this name is given in a completely different context in condensed matter theory, where you define a unit cell of a periodic crystal (in usual configuration space, the Wigner-Seitz cell) and then define the corresponding "allowed wave numbers" via the "unit cell" of the reciprocal lattice in momentum space (Brillouin zone).

https://en.wikipedia.org/wiki/Wigner–Seitz_cell
https://en.wikipedia.org/wiki/Brillouin_zone
 
  • Like
Likes etotheipi
  • #23
Hiero said:
Whatever. My point still stands: the NAMELESS BASIS can be defined without reference to V* if V is an inner product space.
As I tried to explain in the above quoted posting, if you have a vector space without additional extra features, in particular without a fundamental symmetric bilinear form, i.e., without a scalar product or a pseudo-scalar product, then there is no canonical isomorphism between ##V## and ##V^*##. In that case it is not very useful to identify ##V## with ##V^*##, because to do so you need to refer to a basis, and the whole beauty of the concept of vectors and dual vectors as basis-independent objects is spoiled. Note that the dual of the dual vector space is canonically isomorphic to ##V##, since it's easy to give a canonical isomorphism between ##V## and ##V^{**}##. It's a very nice exercise to think a while about how this "natural" canonical isomorphism can be constructed. It's in the sense of this isomorphism that one usually identifies ##V^{**}## with ##V##.

If you have a fundamental form, there's a canonical isomorphism between ##V## and ##V^*## and then it's useful to identify ##V## and ##V^*## by this canonical isomorphism induced by the fundamental bilinear form.
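In components, this canonical isomorphism is just index lowering and raising with the fundamental form; a minimal numpy sketch (my own illustration, with an arbitrary positive-definite ##g##):

```python
import numpy as np

# "Musical" isomorphism induced by a metric g: u -> g(u, .) lowers the
# index; the inverse metric raises it again.
g = np.array([[1.0, 0.2],
              [0.2, 2.0]])   # symmetric, positive definite

u = np.array([1.0, -1.0])            # contravariant components u^j
u_flat = g @ u                       # covariant components u_i = g_ij u^j
u_back = np.linalg.inv(g) @ u_flat   # u^i = g^ij u_j

assert np.allclose(u_back, u)
```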

In practice you rarely need these formal considerations, but it's good to have thought about them and to keep in mind that tensors (including scalars and vectors as special cases) are invariant objects, i.e., independent of any choice of basis, which is arbitrary and thus can be chosen for each problem by convenience without changing the physical meaning of the description in any way.
 
  • Like
Likes etotheipi

1. Is a vector itself contravariant or covariant?

A vector itself is neither contravariant nor covariant. These terms refer to how the components of a vector transform under a change of coordinates.

2. What does it mean for a vector to be contravariant?

A vector's components are contravariant if they transform inversely to the basis vectors under a change of basis: if the basis vectors are scaled up, the components scale down by the same factor, so that the vector itself is unchanged.

3. What does it mean for a vector to be covariant?

A vector's components are covariant if they transform in the same way as the basis vectors under a change of basis: scaling the basis vectors up scales the components up by the same factor.
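A quick numerical illustration of both behaviors (a minimal numpy sketch, assuming the standard dot product on ##\mathbb{R}^2##):

```python
import numpy as np

# Stretch every basis vector by a factor of 2 and compare components.
B_old = np.eye(2)
B_new = 2.0 * B_old

v = np.array([4.0, 6.0])

contra_old = np.linalg.inv(B_old) @ v   # v^i in the old basis
contra_new = np.linalg.inv(B_new) @ v   # v^i in the new basis
co_old = B_old.T @ v                    # v_i = b_i . v in the old basis
co_new = B_new.T @ v                    # v_i in the new basis

print(contra_new / contra_old)   # [0.5 0.5] -> opposite to the basis change
print(co_new / co_old)           # [2.  2. ] -> same as the basis change
```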

4. Can a vector be both contravariant and covariant?

No. A given set of components transforms either contravariantly or covariantly under a change of coordinates, not both. The same vector, however, can be described by either kind of components.

5. How do contravariant and covariant vectors relate to each other?

Contravariant and covariant components are related by the metric tensor, which encodes the inner products of the basis vectors. Contracting with the metric tensor converts contravariant components into covariant ones and vice versa (lowering and raising indices).
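For example, lowering an index with the Minkowski metric (a minimal sketch; the ##(-,+,+,+)## sign convention is assumed here):

```python
import numpy as np

# Minkowski metric eta = diag(-1, 1, 1, 1) converts contravariant
# components v^mu into covariant components v_mu = eta_{mu nu} v^nu.
eta = np.diag([-1.0, 1.0, 1.0, 1.0])

v_contra = np.array([2.0, 1.0, 0.0, 0.0])
v_co = eta @ v_contra

print(v_co)   # [-2.  1.  0.  0.]; only the time component flips sign
```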
