Stupid question on index lowering and raising using metric

  • #1
smodak
Original Question (Please ignore this):

I knew this when I read about it the first time a while back, but I can't put two and two together anymore. :)

I understand ##\vec{a_i}\centerdot \vec{a^j}=\delta_i^j## and ##g_{ij}=\vec{a_i}\centerdot\vec{a_j}##

How does one get from there to ##\vec{a_i}=g_{ij}\vec{a^j}##?

Modified Question (Please answer this):

How can you prove that the metric tensor can be used to raise and lower indices on any tensor? How does the metric help transform between covariant and contravariant tensors? Proving this for rank one tensors should be good enough. How does the metric perform the conversion ##T_{i}=g_{ij} T^{j}##?
 

Answers and Replies

  • #2
PeterDonis
I understand ##\vec{a_i}\centerdot \vec{a^j}=\delta_i^j## and ##g_{ij}=\vec{a_i}\centerdot\vec{a_j}##
Neither of these is true in general. The first is true only for basis vectors in an orthonormal basis. The second is true for basis vectors in any coordinate basis (not necessarily orthonormal), but again only for basis vectors.

How does one get from there to ##\vec{a_i}=g_{ij}\vec{a^j}##?
You can't, because the latter statement is true for all vectors, whereas the two previous statements are not (see above).
 
  • #3
Orodruin
The first is true only for basis vectors in an orthonormal basis.
The first relation is a relation between a tangent vector basis and the corresponding covector basis. Of course you can choose to use a different covector basis, but using ##\vec a^i## to denote it would then be strongly misleading in my opinion. In other words, I believe that by ##\vec a_i## and ##\vec a^i## the OP means the basis vectors themselves.
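A minimal numerical sketch of this distinction, assuming a hypothetical non-orthonormal basis of Euclidean ##\mathbb{R}^2##: the corresponding covector (dual) basis satisfies ##\vec a_i \centerdot \vec a^j = \delta_i^j## even though ##\vec a_i \centerdot \vec a_j \neq \delta_{ij}##.
```python
import numpy as np

# Hypothetical non-orthonormal basis of Euclidean R^2 (rows are the basis vectors a_i).
a = np.array([[1.0, 0.0],
              [1.0, 2.0]])

g = a @ a.T                       # g_ij = a_i . a_j  (not the identity: basis is not orthonormal)
a_dual = np.linalg.inv(g) @ a     # corresponding covector basis a^j = g^{jk} a_k

print(a @ a.T)                    # g_ij, NOT delta_ij here
print(a @ a_dual.T)               # a_i . a^j = delta_i^j  -> identity matrix
```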
 
  • #4
smodak
The first relation is a relation between a tangent vector basis and the corresponding covector basis. Of course you can choose to use a different covector basis, but using ##\vec a^i## to denote it would then be strongly misleading in my opinion. In other words, I believe that by ##\vec a_i## and ##\vec a^i## the OP means the basis vectors themselves.
Correct. I do.
 
  • #5
smodak
You can't, because the latter statement is true for all vectors
Ok. How do you prove that it is true?
 
  • #6
Orodruin
Am I correct in assuming that this is just for a coordinate basis in curvilinear coordinates on a Euclidean space? The latter equation certainly makes no sense at all if this is not the case.
(@PeterDonis: From what I gather, he is referring to the actual basis vectors, not to the vector components. The second and third relations then only make sense if you can directly identify the tangent vector space with the cotangent vector space.)
 
  • #7
smodak
Am I correct in assuming that this is just for a coordinate basis in curvilinear coordinates on a Euclidean space? The latter equation certainly makes no sense at all if this is not the case.
(@PeterDonis: From what I gather, he is referring to the actual basis vectors, not to the vector components. The second and third relations then only make sense if you can directly identify the tangent vector space with the cotangent vector space.)
I see that my question has created a lot of confusion. It is my fault as I have not clearly posed the question. Please ignore my original question. I want to ask the question differently:

How can you prove that the metric tensor can be used to raise and lower indices on any tensor? How does the metric help transform between covariant and contravariant tensors? Proving this for rank one tensors should be good enough. How does the metric perform the conversion ##T_{i}=g_{ij} T^{j}##?
 
  • #8
Orodruin
How can you prove that the metric tensor can be used to raise and lower indices on any tensor?
What do you mean by this? The metric is a type (0,2) tensor, so if you contract it with a type (1,0) tensor, i.e., a tangent vector, you will obtain a type (0,1) tensor, i.e., a dual vector, by construction. The fact that the metric is invertible establishes a 1-to-1 correspondence between tangent and dual vectors, and so we call them by the same symbol, but with the index up or down.
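As a minimal sketch of this construction (using the Minkowski metric purely as a hypothetical example of an invertible metric): lowering an index with ##g## produces exactly the dual vector ##w \mapsto g(T,w)##, and the inverse metric takes you back.
```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])   # hypothetical (0,2) metric: Minkowski, invertible
T_up = np.array([2.0, 1.0, 0.0, 3.0])  # components T^j of a type (1,0) tensor

T_down = eta @ T_up                     # contraction: T_i = g_{ij} T^j, a type (0,1) tensor

# The resulting dual vector acts on any tangent vector w and reproduces g(T, w):
w = np.array([1.0, -1.0, 2.0, 0.5])
print(np.allclose(T_down @ w, T_up @ eta @ w))   # True

# Invertibility gives the 1-to-1 correspondence: raising recovers T^j.
print(np.linalg.inv(eta) @ T_down)               # [2. 1. 0. 3.]
```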
 
  • #9
smodak
What do you mean by this? The metric is a type (0,2) tensor, so if you contract it with a type (1,0) tensor, i.e., a tangent vector, you will obtain a type (0,1) tensor, i.e., a dual vector, by construction. The fact that the metric is invertible establishes a 1-to-1 correspondence between tangent and dual vectors, and so we call them by the same symbol, but with the index up or down.
I think I understand what you are saying. I am using the old (covariant/contravariant) terminology. Basically, what you are saying is that given an arbitrary contravariant vector ##T^j##, a covariant vector ##T_i## is defined as ##T_i = g_{ij} T^j##. Also, ##g^{ij} T_i = g^{ij} g_{ik} T^k = \delta^j_k T^k = T^j##.

I guess I was just trying to prove the definition. So, the question indeed is a stupid question. :)
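The component algebra above can be checked numerically; a short sketch with an arbitrary (hypothetical) symmetric, invertible metric:
```python
import numpy as np

# Hypothetical symmetric, non-degenerate metric components g_{ij} in some basis.
g = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.3],
              [0.0, 0.3, 3.0]])
g_inv = np.linalg.inv(g)                   # g^{ij}

print(np.allclose(g_inv @ g, np.eye(3)))   # g^{ij} g_{jk} = delta^i_k  -> True

T_up = np.array([1.0, -2.0, 0.5])          # contravariant components T^j
T_down = g @ T_up                          # T_i = g_{ij} T^j
print(np.allclose(g_inv @ T_down, T_up))   # g^{ij} T_j = T^i          -> True
```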
 
  • #10
PeterDonis
The first relation is a relation between a tangent vector basis and the corresponding covector basis.
Yes, you're right, I was being sloppy.

From what I gather, he is referring to the actual basis vectors, not to the vector components.
Yes, agreed.

(I see that the OP has actually retracted the original question, but these points are still worth clarifying for other readers of the thread, so I'm glad you raised them.)
 
  • #11
haushofer
I think I understand what you are saying. I am using the old (covariant/contravariant) terminology. Basically, what you are saying is that given an arbitrary contravariant vector ##T^j##, a covariant vector ##T_i## is defined as ##T_i = g_{ij} T^j##. Also, ##g^{ij} T_i = g^{ij} g_{ik} T^k = \delta^j_k T^k = T^j##.

I guess I was just trying to prove the definition. So, the question indeed is a stupid question. :)
Yes. The metric is by definition a mapping from a vector space to its dual.
 
  • #12
Orodruin
Yes. The metric is by definition a mapping from a vector space to its dual.
Just to be clear, this is not the only requirement on the metric; being such a mapping is the same as saying it is a (0,2) tensor. You additionally need it to be symmetric and positive definite (or merely non-degenerate in the case of a pseudo-metric).
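These conditions are straightforward to check numerically; a sketch (the matrices below are only hypothetical examples):
```python
import numpy as np

def check_metric(g):
    """Check symmetry, non-degeneracy, and positive definiteness of a matrix of components g_{ij}."""
    eigvals = np.linalg.eigvalsh(g)   # real eigenvalues (valid once symmetry holds)
    return {
        "symmetric": np.allclose(g, g.T),
        "non_degenerate": not np.any(np.isclose(eigvals, 0.0)),  # no zero eigenvalues
        "positive_definite": bool(np.all(eigvals > 0.0)),
    }

print(check_metric(np.eye(3)))                        # Euclidean metric: all True
print(check_metric(np.diag([-1.0, 1.0, 1.0, 1.0])))   # pseudo-metric: non-degenerate, not positive definite
```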
 
  • #13
haushofer
Of course.

(non-degeneracy is not even needed, but that's a technicality you only encounter in e.g. Newton-Cartan theory or theories with Carroll-symmetries, i.e. limits of theories which do have non-degenerate metrics. I guess that depends on nomenclature)
 
  • #14
vanhees71
How can you prove that the metric tensor can be used to raise and lower indices on any tensor? How does the metric help transform between covariant and contravariant tensors? Proving this for rank one tensors should be good enough. How does the metric perform the conversion ##T_{i}=g_{ij} T^{j}##?
First of all, it is important to note that tensors themselves are invariant. It is their components in a (pseudo-)metric space, as well as the basis vectors used to define them, that are co- or contravariant. In the following the Einstein summation convention is always implied.

So let ##V## be a finite-dimensional real vector space with a non-degenerate symmetric bilinear form ##g:V \times V \rightarrow \mathbb{R}## (a pseudo-scalar product). Further, let ##\boldsymbol{b}_j## be a basis of ##V##. Now for any other basis ##\boldsymbol{b}_j'## we have a transformation matrix ##{T^j}_k## such that
$$\boldsymbol{b}_k={T^j}_k \boldsymbol{b}_j'.$$
To derive how the vector components transform we write
$$\boldsymbol{x}=x^k \boldsymbol{b}_k={T^j}_k x^k \boldsymbol{b}_j' \; \Rightarrow \; x'{}^j={T^j}_k x^k.$$
One says the vector components transform contragrediently to the basis vectors. The basis vectors are said to transform covariantly and the vector components contravariantly.

Now let's turn to the pseudo-scalar product. Its components are defined by
$$g_{jk}=g(\boldsymbol{b}_j,\boldsymbol{b}_k).$$
Now let's see how these transform. We have
$$g_{jk}=g(\boldsymbol{b}_j,\boldsymbol{b}_k)=g({T^l}_j \boldsymbol{b}_l',{T^m}_k \boldsymbol{b}_m')={T^l}_j {T^m}_k g_{lm}',$$
i.e. the components transform covariantly.

Since ##g## is non-degenerate, the matrix ##(g_{jk})## is invertible, and we call its inverse ##(g^{jk})##. Then we can define new basis vectors
$$\boldsymbol{b}^j=g^{jk} \boldsymbol{b}_k.$$
Now you can express any vector also with respect to this new basis
$$\boldsymbol{x}=x_j \boldsymbol{b}^j = x_j g^{jk} \boldsymbol{b}_k =x^k \boldsymbol{b}_k \; \Rightarrow \; x^k=g^{jk} x_j$$
and of course also the other way around
$$x_j=g_{jk} x^k.$$
You should now prove that the ##\boldsymbol{b}^j## transform contravariantly and the ##x_j## covariantly under a change of basis. All this is not so difficult if one keeps the bookkeeping of all these upper and lower indices consistent!
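As a concrete check of these relations, here is a sketch in a coordinate basis of polar coordinates on the Euclidean plane (the numerical values are arbitrary); it also verifies ##\vec{a_i}\centerdot \vec{a^j}=\delta_i^j## from the original question.
```python
import numpy as np

# Coordinate basis of polar coordinates (r, phi) on the Euclidean plane,
# written in Cartesian components: b_r = dx/dr, b_phi = dx/dphi.
r, phi = 2.0, 0.7
b_r   = np.array([np.cos(phi), np.sin(phi)])
b_phi = np.array([-r * np.sin(phi), r * np.cos(phi)])
basis = np.array([b_r, b_phi])                  # rows are the b_j

g = basis @ basis.T                             # g_{jk} = g(b_j, b_k) = diag(1, r^2) here
g_inv = np.linalg.inv(g)                        # g^{jk}

dual = g_inv @ basis                            # b^j = g^{jk} b_k
print(np.allclose(basis @ dual.T, np.eye(2)))   # b_j . b^k = delta_j^k  -> True

# Raising and lowering the components of an arbitrary vector x:
x_up = np.array([1.5, -0.4])                    # contravariant components x^k
x_down = g @ x_up                               # x_j = g_{jk} x^k

print(np.allclose(x_up @ basis, x_down @ dual)) # x = x^k b_k = x_j b^j, the same vector
print(np.allclose(g_inv @ x_down, x_up))        # x^k = g^{jk} x_j recovers the components
```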
 
  • #15
smodak
First of all, it is important to note that tensors themselves are invariant. [...]
Wow, fantastic explanation, many thanks! I am reading a book by Pavel Grinfeld, and he explains this similarly, albeit in a different language.
 
  • #16
smodak
Of course.

(non-degeneracy is not even needed, but that's a technicality you only encounter in e.g. Newton-Cartan theory or theories with Carroll-symmetries, i.e. limits of theories which do have non-degenerate metrics. I guess that depends on nomenclature)
Sorry I have no idea what non-degeneracy means or why it is required in this context; help? I also do not know what Newton-Cartan theory is or what Carroll-symmetries are. But I would like to learn. Could you point me to some resources?
 
  • #17
haushofer
Sorry I have no idea what non-degeneracy means or why it is required in this context; help? I also do not know what Newton-Cartan theory is or what Carroll-symmetries are. But I would like to learn. Could you point me to some resources?
I'm not sure if it's a good idea to dive into Newton-Cartan theory if you're just learning about differential geometry and GR, so I apologize for bringing it up. But if you do want to learn about it, you can check my Insights article and the references therein:

https://www.physicsforums.com/insights/revival-newton-cartan-theory/

"Degenerate" means "having one or more zero eigenvalues".
 