Contravariance, covariance, stress and the position vector

  • #1
Trying2Learn
TL;DR Summary
Is there an error in Wikipedia?
Good Morning!

I understand that a vector is a physical object
I understand that it is the underlying basis that determines how the components transform.

However, I encounter this:
https://en.wikipedia.org/wiki/Covariance_and_contravariance_of_vectors

The fifth paragraph has this statement
A contravariant vector or tangent vector (often abbreviated simply as vector, such as a direction vector or velocity vector) has components that contra-vary with a change of basis to compensate. That is, the matrix that transforms the vector components must be the inverse of the matrix that transforms the basis vectors. The components of vectors (as opposed to those of covectors) are said to be contravariant.

I am confused.

This seems to be suggesting that, of the various types of vectors, one vector, the direction vector (and, I assume, velocity and acceleration), KNOWS itself to be a contravariant vector.

This makes no sense and is contrary to my understanding that a vector (or, say, even the stress tensor) does NOT have ONE manifestation (the latter could be contra, co or mixed)

Am I reading too much into the wiki statement?

Because I can use the metric tensor and convert a contravariant position vector into one expressed with the dual basis and covariant components.
 
  • #2
A vector is a vector. It's an invariant and doesn't transform under coordinate transformations. It's simply the sloppy slang of physicists that calls components of a vector simply "vector". What's meant is that these are components of a vector (which always refer to an arbitrarily chosen basis of the vector space, which is also usually not mentioned explicitly), and these transform contravariantly to the basis vectors, which transform covariantly.

To see, how this works take ##\vec{V}## as a vector and ##\vec{b}_j## a basis. Then
$$\vec{V}=V^j \vec{b}_j$$
where ##V^j## are the components of the vector wrt. the basis ##\vec{b}_j##. Summation over repeated index pairs (one must be an upper and the other a lower index) is understood (Einstein summation convention).

Now consider another basis ##\vec{b}_k'##. You can express the old basis in terms of the new one,
$$\vec{b}_j={T^k}_j \vec{b}_k'.$$
Then you have
$$\vec{V}=V^{j} \vec{b}_j = V^{j} {T^k}_j \vec{b}_k' = V^{\prime k} \vec{b}_k'.$$
Since the components of a vector wrt. a basis are always uniquely defined, you have
$$V^{\prime k} = {T^k}_j V^{j}.$$
The basis transformation can of course also be written as
$$\vec{b}_k' ={U^j}_k \vec{b}_j,$$
and you must have
$$\vec{b}_l = {T^k}_l \vec{b}_k' = {T^k}_l {U^j}_k \vec{b}_j$$
and thus
$${U^j}_k {T^k}_l=\delta^j_l,$$
i.e., as matrices ##\hat{U}=\hat{T}^{-1}##.

One says that the vector components (with upper indices) transform contragrediently to the basis vectors (with lower indices) and calls the vector components contravariant and the basis vectors covariant.

The entire game can also be played with linear forms (dual vectors). Here it's the other way around: The dual basis (upper indices) transforms contravariantly and the dual-vector components (lower indices) covariantly. Again that's all tailored such that the linear form is independent of the choice of the basis and dual basis, when working with the corresponding components, as it must be.
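As a quick numerical check of the transformation rules above, here is a minimal NumPy sketch (added for illustration, not part of the original post; the matrices B and Bp are arbitrary invertible stand-ins whose columns play the role of the bases ##\vec{b}_j## and ##\vec{b}_k'##):

```python
import numpy as np

rng = np.random.default_rng(0)
B  = rng.normal(size=(3, 3))   # columns are the old basis vectors b_j (assumed invertible)
Bp = rng.normal(size=(3, 3))   # columns are the new basis vectors b'_k

# b_j = T^k_j b'_k  <=>  B = Bp @ T,  so  T = Bp^{-1} B
T = np.linalg.solve(Bp, B)
U = np.linalg.inv(T)           # b'_k = U^j_k b_j, with U = T^{-1}

v  = np.array([1.0, 2.0, 3.0]) # contravariant components V^j w.r.t. b_j
V  = B @ v                     # the invariant vector itself (in ambient coordinates)
vp = T @ v                     # V'^k = T^k_j V^j

print(np.allclose(Bp @ vp, V))         # True: same vector, new components
print(np.allclose(U @ T, np.eye(3)))   # True: U and T are inverse matrices
```

The same check works for any invertible choice of the two bases.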
 
  • Like
Likes PeroK
  • #3
vanhees71 said:
A vector is a vector. It's an invariant and doesn't transform under coordinate transformations. It's simply the sloppy slang of physicists that calls components of a vector simply "vector". What's meant is that these are components of a vector (which always refer to an arbitrarily chosen basis of the vector space, which is also usually not mentioned explicitly), and these transform contravariantly to the basis vectors, which transform covariantly.

To see, how this works take ##\vec{V}## as a vector and ##\vec{b}_j## a basis. Then
$$\vec{V}=V^j \vec{b}_j$$
where ##V^j## are the components of the vector wrt. the basis ##\vec{b}_j##. Summation over repeated index pairs (one must be an upper and the other a lower index) is understood (Einstein summation convention).

Now consider another basis ##\vec{b}_k'##. You can express the old basis in terms of the new one,
$$\vec{b}_j={T^k}_j \vec{b}_k'.$$
Then you have
$$\vec{V}=V^{j} \vec{b}_j = V^{j} {T^k}_j \vec{b}_k' = V^{\prime k} \vec{b}_k'.$$
Since the components of a vector wrt. a basis are always uniquely defined, you have
$$V^{\prime k} = {T^k}_j V^{j}.$$
The basis transformation can of course also be written as
$$\vec{b}_k' ={U^j}_k \vec{b}_j,$$
and you must have
$$\vec{b}_l = {T^k}_l \vec{b}_k' = {T^k}_l {U^j}_k \vec{b}_j$$
and thus
$${U^j}_k {T^k}_l=\delta^j_l,$$
i.e., as matrices ##\hat{U}=\hat{T}^{-1}##.

One says that the vector components (with upper indices) transform contragrediently to the basis vectors (with lower indices) and calls the vector components contravariant and the basis vectors covariant.

The entire game can also be played with linear forms (dual vectors). Here it's the other way around: The dual basis (upper indices) transforms contravariantly and the dual-vector components (lower indices) covariantly. Again that's all tailored such that the linear form is independent of the choice of the basis and dual basis, when working with the corresponding components, as it must be.
I am sorry, I was not clear: I do understand what you say: I understand the transformation rules.

I am, however, only concerned about the wiki statement, re-stated here (and just the bold part):

A contravariant vector or tangent vector (often abbreviated simply as vector, such as a direction vector or velocity vector) has components that contra-vary with a change of basis to compensate. That is, the matrix that transforms the vector components must be the inverse of the matrix that transforms the basis vectors. The components of vectors (as opposed to those of covectors) are said to be contravariant.

Is this not totally wrong?

Unless! They associate a direction vector with a vector in the tangent space?
I would have deleted the RED part since, as you say, vectors do NOT KNOW variance.
This article seems to be suggesting that a position vector's components transform contravariantly.
 
  • #4
That's nonsense. A vector doesn't transform at all under basis transformations. It's not totally wrong; only the first part of the sentence doesn't make sense. A vector is a vector and doesn't transform at all. That of course also holds for the tangent vector of a differentiable manifold at one of its points. The tangent-vector space at a fixed point of the manifold is just a vector space.

I think the Wiki article says the very same thing as I did but commits the usual sin of the physics literature of mixing up vectors with components of vectors, leading to confusion.
 
  • #5
Trying2Learn said:
This article seems to be suggesting that a position vector's components transform contravariantly.
To take a simple example: suppose we have a Cartesian basis vector ##e_1## and we do a change of basis to ##e'_1 = 2e_1##. Take a vector ##v## represented by ##v = v^1e_1##. Under the change of basis, the same vector is written ##v = v'^1e'_1##, where ##v'^1 = \frac 1 2 v^1##. I.e. the components change contravariantly.

Alternatively, if you rotate a basis vector clockwise, then the components rotate anticlockwise, as it were.
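A tiny numerical sketch of the doubling example (an added illustration, assuming NumPy; the names e1 and v1 are just the ones used here):

```python
import numpy as np

e1  = np.array([1.0, 0.0])
e1p = 2.0 * e1              # new basis vector e'_1 = 2 e_1

v1  = 3.0                   # old component: v = v^1 e_1
v1p = 0.5 * v1              # contravariant rule: v'^1 = (1/2) v^1

print(np.allclose(v1 * e1, v1p * e1p))   # True: both expressions give the same arrow
```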
 
  • #6
vanhees71 said:
That's nonsense. A vector doesn't transform at all under basis transformations. It's not totally wrong; only the first part of the sentence doesn't make sense. A vector is a vector and doesn't transform at all. That of course also holds for the tangent vector of a differentiable manifold at one of its points. The tangent-vector space at a fixed point of the manifold is just a vector space.

I think the Wiki article says the very same thing as I did but commits the usual sin of the physics literature of mixing up vectors with components of vectors, leading to confusion.
And that I understand, too. I am sorry about being a pest, but I am after ONE ISSUE

You said the FIRST clause of the sentence does not make sense, and this leads me to believe you are OK with the second clause

But this is the second clause (maybe you meant the whole thing?)

A contravariant vector (such as a direction vector)

Right there, cut down to the core: how does the author get away with exemplifying a contravariant vector (I understand your objections to calling vectors contravariant or covariant, but that is not the issue for me) as a direction vector?

That is my focus.
 
  • #7
Trying2Learn said:
And that I understand, too. I am sorry about being a pest, but I am after ONE ISSUE

You said the FIRST clause of the sentence does not make sense, and this leads me to believe you are OK with the second clause

But this is the second clause (maybe you meant the whole thing?)

A contravariant vector (such as a direction vector)

Right there, cut down to the core: how does the author get away with exemplifying a contravariant vector (I understand your objections to calling vectors contravariant or covariant, but that is not the issue for me) as a direction vector?

That is my focus.
The components of a position vector transform contravariantly, as do the components of any vector.

If the components transform covariantly, then the object is called a covector or dual vector.

PS that's in the introduction to the Wikipedia page you linked to.
 
  • #8
PeroK said:
The components of a position vector transform contravariantly, as do the components of any vector.

If the components transform covariantly, then the object is called a covector or dual vector.
OK, now this is the track, I think

Could I rephrase this (to satisfy my OCD -- I am not joking, I am compulsive about this)?

Shouldn't you have written this? (I am not trying to be petty here, I am really confused, and this might clear it up for me)

Shouldn't you have written this?

The components of a position vector transform contravariantly when expressed in a given basis.
The components of a position vector transform covariantly when expressed in the dual basis.

In other words, why are you saying this
"The components of a position vector transform contravariantly..."

What is so special about the position vector that you jump to contravariant?
The original wikipedia article is making me think there is something special about the position vector.
 
  • #9
Trying2Learn said:
The components of a position vector transform contravariantly when expressed in a given basis.
The components of a position vector transform covariantly when expressed in the dual basis.
It's more usual to regard vectors and covectors as different objects in different spaces. The dual basis is the basis for a different vector space, containing the covectors. Then you have a mapping from vectors to covectors, but you stop short of identifying a vector and a covector as different representations of the same physical thing.
 
  • #10
PeroK said:
It's more usual to regard vectors and covectors as different objects in different spaces. The dual basis is the basis for a different vector space, containing the covectors. Then you have a mapping from vectors to covectors, but you stop short of identifying a vector and a covector as different representations of the same physical thing.
I am sorry, but I am not giving up
I understand that, too.

My question is really simple...

Is this clause truthful? (Yes, or no)
A contravariant vector (such as a direction vector)

It seems to me that you CANNOT assert that a direction vector is contravariant.
 
  • #11
Trying2Learn said:
It seems to me that you CANNOT assert that a direction vector is contravariant.
Well, it most definitely is. Consider the examples in post #5. Just take a piece of paper. Draw the x-y axes and a position vector anywhere. Let's say somewhere in the first quadrant. Now rotate the axes clockwise. The effect is the same as rotating the vector anticlockwise. The components of your position vector under a clockwise rotation of the basis change as they would if you kept the same basis and rotated the position vector anticlockwise. That's a simple example of contravariance.
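Here is a short numerical version of that paper-and-pencil exercise (an added sketch, assuming NumPy; rot(a) denotes an anticlockwise rotation by angle a):

```python
import numpy as np

def rot(a):
    """Anticlockwise rotation matrix by angle a."""
    return np.array([[np.cos(a), -np.sin(a)],
                     [np.sin(a),  np.cos(a)]])

a  = 0.3
E  = np.eye(2)           # old Cartesian basis vectors as columns
Ep = rot(-a) @ E         # basis rotated clockwise by a

v  = np.array([2.0, 1.0])          # components of the arrow in the old basis
vp = np.linalg.solve(Ep, E @ v)    # components of the same arrow in the new basis

print(np.allclose(vp, rot(a) @ v))   # True: the components change as an anticlockwise rotation
```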
 
  • #12
Trying2Learn said:
I am sorry, but I am not giving up
I understand that, too.

My question is really simple...

Is this clause truthful? (Yes, or no)
A contravariant vector (such as a direction vector)

It seems to me that you CANNOT assert that a direction vector is contravariant.
As I said several times, there is no such thing as a "contravariant vector". That's a non-sensical term. Only components of a vector are "contravariant" but not vectors. The vectors are, by definition, invariant under basis transformations.
 
  • #13
vanhees71 said:
As I said several times, there is no such thing as a "contravariant vector". That's a non-sensical term. Only components of a vector are "contravariant" but not vectors. The vectors are, by definition, invariant under basis transformations.

In a post

Reference: https://www.physicsforums.com/threads/contravariant-and-covariant-vectors.735287/

This was said: (see picture)

So, then he skirts the issue by using the word "natural"

I am still confused how he asserts that a position vector is contravariant.
And now I want to know what "natural" means with regard to the gradient (and I do understand it has contra and co forms with the metric tensor, etc.). What does he mean by "natural", and can that address the issue of why so many people call the position vector contravariant (and yes, I know vectors are physical objects)?
 

Attachments: xxxx.JPG
  • #14
PeroK said:
Well, it most definitely is. Consider the examples in post #5. Just take a piece of paper. Draw the x-y axes and a position vector anywhere. Let's say somewhere in the first quadrant. Now rotate the axes clockwise. The effect is the same as rotating the vector anticlockwise. The components of your position vector under a clockwise rotation of the basis change as they would if you kept the same basis and rotated the position vector anticlockwise. That's a simple example of contravariance.
No! The vectors are invariant quantities. It's very clear from your example. Draw an arrow in a plane (that's a vector in the usual way you define a vector on an affine space) and then a Cartesian basis and determine the components of your arrow. Then draw another Cartesian basis rotated against the first. What happens to the vector (i.e., the arrow you've drawn in the beginning)? Indeed nothing! What changes are of course the components of this vector wrt. the new basis, and the transformation is given by a rotation matrix. The basis transforms covariantly, the components contravariantly.

All this also applies to more general bases than Cartesian ones in a Euclidean vector space, of course.
 
  • #15
Trying2Learn said:
In a post

Reference: https://www.physicsforums.com/threads/contravariant-and-covariant-vectors.735287/

This was said

So, then he skirts the issue by using the word "natural"

I am still confused how he asserts that a position vector is contravariant.
And now I want to know what "natural" means
As I said, that's just the sloppy slang of physicists, who don't distinguish between vectors (independent of any choice of basis) and components of vectors (dependent on the choice of basis they are related to). Once more, a position vector (or any other vector) doesn't change at all under a change of basis, but its components of course do.

The statement of "natural" must refer to components too. Indeed a gradient (or rather the components of the gradient) of a scalar field is most simply given in terms of covariant components. If you choose a coordinate basis of the tangent vectors on a manifold then derivatives with respect to the coordinates, ##q^j##, the partial derivatives ##\partial_j \Phi(q)=\frac{\partial}{\partial q^j} \Phi(q)## transform as covariant vector components do.
 
  • #16
vanhees71 said:
The vectors are invariant quantities.
Yes, but a system can be physically rotated. Then the position vector of a point in the system changes. Also, the position of a vector relative to the basis vectors is not invariant if you change the basis. That's the point.

I think I've specified vector components wherever relevant?
 
  • #17
Yes, but that's an "active transformation" and NOT a transformation of components due to a mere change of a basis. That's clear from the fact that "active transformations" are (often linear) operations on (invariant!) vectors, i.e., maps ##V \rightarrow V##, independent of any choice of basis.

One should also better distinguish these to types of transformations in the physics literature too!
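A compact way to see the difference numerically (an added sketch, assuming NumPy; rot(a) is an anticlockwise rotation):

```python
import numpy as np

def rot(a):
    return np.array([[np.cos(a), -np.sin(a)],
                     [np.sin(a),  np.cos(a)]])

a = 0.5
v = np.array([1.0, 2.0])           # components of an arrow in the original basis

v_active  = rot(a) @ v             # active rotation: a genuinely different arrow
E_new     = rot(a) @ np.eye(2)     # passive: rotate the basis vectors instead
v_passive = np.linalg.solve(E_new, v)   # same arrow, components w.r.t. the new basis

print(np.allclose(v_passive, rot(-a) @ v))  # True: components go the "opposite" way
print(np.allclose(v_active, v))             # False: the arrow itself has changed
```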
 
  • #18
vanhees71 said:
As I said, that's just the sloppy slang of physicists, who don't distinguish between vectors (independent of any choice of basis) and components of vectors (dependent on the choice of basis they are related to). Once more, a position vector (or any other vector) doesn't change at all under a change of basis, but its components of course do.

The statement of "natural" must refer to components too. Indeed a gradient (or rather the components of the gradient) of a scalar field is most simply given in terms of covariant components. If you choose a coordinate basis of the tangent vectors on a manifold then derivatives with respect to the coordinates, ##q^j##, the partial derivatives ##\partial_j \Phi(q)=\frac{\partial}{\partial q^j} \Phi(q)## transform as covariant vector components do.

So, with that, are you suggesting that when we take the gradient, it INITIALLY (using frames of reference to which we are accustomed BEFORE studying contra/co) seems that the gradient components transform covariantly...

And that, (in accordance with our first learning), position vector has components that appear to transform contravariantly.

However, in the long-run, either vector can be described in terms of components that transform contra or co, (as long as we use the correct basis or dual), equipped with the metric tensor to raise or lower the indices (change the variance)

Because it seems that the use of the word "natural" is causing me just as much grief.
 
  • #19
The latter is true only in spaces with a fundamental form (or, as a special case, a Euclidean vector space, where the fundamental form is positive definite and thus a scalar product).
 
  • #20
vanhees71 said:
The latter is true only in spaces with a fundamental form (or, as a special case, a Euclidean vector space, where the fundamental form is positive definite and thus a scalar product).
May I ask you to check my translation of what you just wrote?

With regard to the gradient, there is a natural (or fundamental) form of the operation that suggests the components transform covariantly (but we can always deploy the metric tensor and change the variance)

There is no such fundamental form that suggests the components of a position vector transform contra or co (other than by having been accustomed to having first seen it in the form where the components transform contravariantly)
 
  • #21
If you have a vector space with a fundamental form, there's a "canonical", i.e., basis-independent, bijective linear map between vectors and linear forms (i.e., an isomorphism between the vector space and its co-vector space), and thus vectors and linear forms can be identified by this isomorphism.

E.g., in Minkowski space the fundamental form (usually called "the metric" by physicists, which however is also inaccurate and sometimes confusing ;-)) has components wrt. a Lorentzian basis ##(\eta_{\mu \nu})=\mathrm{diag}(1,-1,-1,-1)##. Its inverse looks the same, ##(\eta^{\mu \nu})=\mathrm{diag}(1,-1,-1,-1)##. Then if you have contravariant components of a vector, ##V^{\mu}##, you get the corresponding covariant components by ##V_{\mu}=\eta_{\mu \nu} V^{\nu}##. The same holds for the basis ##\vec{e}_{\mu}## (covariant) and its dual basis ##\vec{e}^{\mu}## (contravariant), and the vector is
$$\vec{V} = V^{\mu} \vec{e}_{\mu} = V_{\nu} \vec{e}^{\nu}=\eta_{\mu \nu} V^{\mu} \vec{e}^{\nu}=\cdots$$
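A short numerical sketch of this index raising and lowering (added for illustration, assuming NumPy; the components V_up are arbitrary numbers):

```python
import numpy as np

eta     = np.diag([1.0, -1.0, -1.0, -1.0])  # eta_{mu nu} w.r.t. a Lorentzian basis
eta_inv = np.linalg.inv(eta)                # eta^{mu nu}; numerically the same matrix

V_up   = np.array([2.0, 1.0, 0.0, -3.0])    # contravariant components V^mu
V_down = eta @ V_up                         # covariant components V_mu = eta_{mu nu} V^nu

print(V_down)                                # the spatial components flip sign
print(np.allclose(eta_inv @ V_down, V_up))   # True: raising the index recovers V^mu
print(np.allclose(eta @ eta_inv, np.eye(4))) # True: eta^{mu nu} eta_{nu rho} = delta^mu_rho
```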
 
  • #22
I think every response so far (please forgive me, I am frustrated) has overcomplicated the issue

I just want to know WHY the terms "contravariant" and "direction vector" appear in the same clause.
A contravariant vector (such as a direction vector)

Yes, I do understand that a vector is independent of any choice of basis and remains invariant.

However, there MUST be a reason the author (of that very first wiki link) connected these two phrases.

I am now guessing it is this...
  • A differential is the ideal prototype of a contravariant tensor
  • The gradient of a potential function is the ideal prototype of a covariant tensor.

With this in mind, if we view position vectors (and their associated distances), it makes sense to FIRST view the direction vector as having contravariant components (which can be transformed using the metric tensor)

Furthermore, we view gradients as having covariant components (which can be transformed using the metric tensor), that come out naturally, when we take the gradient... as the SECOND bullet point on that original wiki page stated.

In other words, YES: it is wrong to assert a position vector is contravariant (I accept this). However, I think this misuse of language stems from the fact that we start somewhere, and position vectors resemble the differential, and that is where the corruption of the language began.

There must be a reason the author of the wiki page linked the phrases contravariant and direction vector. And, later, says this (same wiki page): Examples of covariant vectors generally appear when taking a gradient of a function.
 
  • #23
Trying2Learn said:
I think every response so far (please forgive me, I am frustrated) has overcomplicated the issue
There's only one person overcomplicating this!
Trying2Learn said:
In other words, YES: it is wrong to assert a position vector is contravariant.
This is wrong, no matter how times you shout it back at us.

I'm out!
 
  • #24
Holy semantics battle, Batman! I think when you say position vector you have already implicitly confined yourself to ##\mathbb{R}^n## (or at least embedded your manifold of interest there). Generally, a position is simply a point on a manifold, and it would not make sense to discuss a position vector in that context. Beyond this small quibble, I don't really get what you are all arguing about.
 
  • #25
Trying2Learn said:
I think every response so far (please forgive me, I am frustrated) has overcomplicated the issue

I just want to know WHY the terms "contravariant" and "direction vector" appear in the same clause.
A contravariant vector (such as a direction vector)

Yes, I do understand that a vector is independent of any choice of basis and remains invariant.

However, there MUST be a reason the author (of that very first wiki link) connected these two phrases.

I am now guessing it is this...
  • A differential is the ideal prototype of a contravariant tensor
  • The gradient of a potential function is the ideal prototype of a covariant tensor.

With this in mind, if we view position vectors (and their associated distances), it makes sense to FIRST view the direction vector as having contravariant components (which can be transformed using the metric tensor)

Furthermore, we view gradients as having covariant components (which can be transformed using the metric tensor), that come out naturally, when we take the gradient... as the SECOND bullet point on that original wiki page stated.

In other words, YES: it is wrong to assert a position vector is contravariant (I accept this). However, I think this misuse of language stems from the fact that we start somewhere, and position vectors resemble the differential, and that is where the corruption of the language began.

There must be a reason the author of the wiki page linked the phrases contravariant and direction vector. And, later, says this (same wiki page): Examples of covariant vectors generally appear when taking a gradient of a function.
If you don't read my really simple answers, I can't help you. One last try:

Vectors are invariant. They are independent of any choice of a basis.

The components of a vector ##\vec{V}## refer to the choice of a basis ##\vec{b}_j## and are labeled as ##V^j##. The vector components transform contragrediently to the basis vectors. One also says the basis vectors transform covariantly and the vector components contravariantly.

Then there are the linear forms, i.e., linear mappings from the vector space to the scalars. The linear forms over a vector space form a vector space themselves, the dual space. For a given basis ##\vec{b}_j## there is the dual basis, i.e., a basis in the dual space, called ##\vec{b}^j##, defined by ##\vec{b}^j(\vec{b}_k)=\delta_k^j##. Any linear form (or dual vector) ##L## is again invariant under basis transformations. The components of the linear form ##L_j## transform covariantly and the dual-basis vectors contravariantly.
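As a numerical sketch of this invariance (added for illustration, assuming NumPy; T, v, and l are arbitrary stand-ins for the basis change, the vector components, and the linear-form components):

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.normal(size=(3, 3))     # basis-change matrix (invertible with probability 1)
U = np.linalg.inv(T)            # U = T^{-1}

v = rng.normal(size=3)          # contravariant components V^j of a vector
l = rng.normal(size=3)          # covariant components L_j of a linear form

vp = T @ v                      # V'^k = T^k_j V^j       (contravariant rule)
lp = l @ U                      # L'_k = L_j U^j_k       (covariant rule)

print(np.allclose(l @ v, lp @ vp))   # True: the number L(V) = L_j V^j is basis independent
```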

Last but not least one should also read Wikipedia with care. It's an amazing piece of work and even better than very well established classical encyclopedias but it is not a scientific publication or textbook. To really learn math or any other scientific subject you should read good textbooks and/or original scientific publications (papers).
 

1. What is contravariance and covariance?

Contravariance and covariance describe how the components of geometric objects change under a change of basis (or coordinate system). Components that transform with the inverse of the matrix that transforms the basis vectors are called contravariant; components that transform with the same matrix as the basis vectors, such as the components of linear forms (covectors), are called covariant.

2. How are contravariance and covariance related to each other?

Contravariance and covariance are two sides of the same coin: the matrix that transforms contravariant components is the inverse of the matrix that transforms covariant ones. This pairing is exactly what makes basis-independent quantities, such as the value of a linear form acting on a vector, possible.

3. What is stress in vector analysis?

In continuum mechanics, stress is a measure of the internal forces within a material that can cause it to deform or change shape. It is represented by a second-order tensor with nine components in three-dimensional space, six of which are independent because the stress tensor is symmetric.

4. How is the position vector defined?

The position vector is a mathematical representation of the location of a point in space relative to a reference point. It is typically denoted by r and is defined as the displacement from the reference point to the point of interest.

5. What is the significance of position vector in physics?

The position vector is a fundamental concept in physics and is used to describe the position, velocity, and acceleration of objects in space. It is also used in the calculation of various physical quantities, such as work, energy, and momentum.
