Question on generalized inner product in tensor analysis

  • Thread starter: mnb96
  • #1
Hello,

Some time ago I read that if we know the metric tensor [itex]g_{ij}[/itex] associated with a change of coordinates [itex]\phi[/itex], it is possible to calculate the (Euclidean?) inner product in a way that is invariant under the parametrization. Essentially, the inner product was defined in terms of the metric tensor as [tex]g_{ij}a^i b^j[/tex] using Einstein notation (see here).

I understand the Einstein summation convention, but this formula looks completely ambiguous to me. In fact, the article speaks about curvilinear coordinates, so the metric tensor [itex]g_{ij}[/itex] is inevitably position-dependent: when I see [itex]g_{ij}[/itex] I interpret it as [itex]g_{ij}(u_1,\ldots,u_n)=g_{ij}(\mathbf{u})[/itex]. The formula above does not say *what* coordinates we have to plug into [itex]\mathbf{u}[/itex].

I would really like to see how someone uses the formula above to calculate the inner product between two vectors in [itex]\mathbb{R}^2[/itex] expressed in polar coordinates: [itex](r_1,\theta_1)=(1,0)[/itex] and [itex](r_2, \theta_2)=(1,\frac{\pi}{2})[/itex]. The result should be 0.
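To make the ambiguity concrete, here is a quick numerical sketch of my attempt (the helpers polar_metric and contract are just illustrative names I made up), applying the formula naively with the standard polar metric [itex]\mathrm{diag}(1, r^2)[/itex]. Feeding the coordinate pairs straight in gives 1, never the expected 0, no matter at which point the metric is evaluated:

```python
import math

def polar_metric(r):
    # metric tensor of the Euclidean plane in polar coordinates: diag(1, r^2)
    return [[1.0, 0.0], [0.0, r**2]]

def contract(g, a, b):
    # naive reading of g_ij a^i b^j
    return sum(g[i][j] * a[i] * b[j] for i in range(2) for j in range(2))

a = [1.0, 0.0]           # coordinate pair (r, theta) = (1, 0) used as components
b = [1.0, math.pi / 2]   # coordinate pair (r, theta) = (1, pi/2) used as components

# at which point should g be evaluated? try several:
for r in (0.5, 1.0, 2.0):
    print(contract(polar_metric(r), a, b))   # 1.0 every time, never the expected 0
```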
 

Answers and Replies

  • #2
WannabeNewton
Science Advisor
The metric tensor assigns an inner product ##g_p:T_pM\times T_pM\rightarrow \mathbb{R}## for each ##p\in M## and varies smoothly from point to point, so in this sense, yes, it is position-dependent. It is, however, a coordinate-independent object and you should think of it as such, i.e. ##g_{p}(u,v)##, ##u,v\in T_pM##, does not depend on what chart you choose containing ##p##.

What you should realize is that when you write ##g_{ij}##, it is the coordinate representation of the metric tensor ##g## with respect to some smooth chart, so the coordinates you are to use in the computation are already determined. To be more precise, if ##p\in M## and ##(U,\varphi)## is a chart containing ##p##, then the coordinate representation of ##g_{p}## is given by ##g_{ij} = g_p(\frac{\partial }{\partial x^{i}}|_p, \frac{\partial }{\partial x^{j}}|_p)##. If ##(V,\phi)## is another chart containing ##p##, then we get another coordinate representation ##g'_{ij} = g_p(\frac{\partial }{\partial x'^{i}}|_p,\frac{\partial }{\partial x'^{j}}|_p)##, and we know that under the transition map these two representations are related by ##g'_{ij} = \frac{\partial x^{k}}{\partial x'^{i}}\frac{\partial x^{l}}{\partial x'^{j}}g_{kl}##.
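As a concrete check (a sketch of my own, not part of the derivation above), the transformation law can be verified numerically for the Cartesian-to-polar transition in the plane, where ##g_{kl} = \delta_{kl}##, so that ##g'_{ij} = (J^T J)_{ij}## with ##J## the Jacobian of ##x = r\cos\theta##, ##y = r\sin\theta##:

```python
import math

def jacobian(r, t):
    # J^k_i = dx^k/dx'^i for x = r cos(t), y = r sin(t)
    return [[math.cos(t), -r * math.sin(t)],
            [math.sin(t),  r * math.cos(t)]]

def pullback_metric(r, t):
    # g'_ij = (dx^k/dx'^i)(dx^l/dx'^j) delta_kl = (J^T J)_ij
    J = jacobian(r, t)
    return [[sum(J[k][i] * J[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

g = pullback_metric(2.0, 0.7)
print(g)   # recovers the polar metric diag(1, r^2) at r = 2: approximately [[1, 0], [0, 4]]
```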

As for the computation you were interested in, this is a mistake my special relativity professor loved to bring up. In polar coordinates, you must specify where the vectors are located, not just their components in the polar representation! You cannot use the metric tensor without specifying the positions of the vectors. I had originally written up the issue but there is a whole thread regarding this commonly made mistake: https://www.physicsforums.com/showthread.php?t=621707
 
  • #3
mnb96
Hi WannabeNewton!

Thanks, your explanation was very clear. Basically the metric smoothly defines a field of local inner products, one at each location. Thus only vectors belonging to the same tangent space (i.e. originating from the same location) can be fed to the local inner product. This is clear.

I now understand what went wrong in my example, but there is still something that bothers me:

1) In the old thread that you mentioned, in one of the last posts, the author gordon831 seems to have addressed the problem and claims to have obtained the correct result with an example analogous to mine.

It seems to me that gordon831 essentially took one of the two vectors, parallel-transported it to the location of the other vector, re-expressed the transported vector as a linear combination of the local frame of the other vector, and got the correct result.
Does this trick always (and only!) work when the manifold is flat?

2) These sorts of misunderstandings mostly come from the fact that, when working with ordinary vector spaces (and Euclidean vectors), students are always taught to visualize vectors as entities "originating" from 0 and pointing to a specific location. However, when teachers and authors of books start speaking about manifolds, they don't clearly say what happens to the old concept of position vector. They seem to silently forget about it and start talking about vector fields! This is painfully confusing for a beginner.

Should I perhaps start performing the mental exercise of reinterpreting Euclidean space as a manifold whose tangent spaces at each location are all the same and whose metric is the identity matrix? If the answer is yes, then how do you interpret the old position vectors? Do you interpret them as vectors in this manifold placed at arbitrary locations, or strictly at the origin?
 
  • #4
WannabeNewton
Science Advisor
All gordon831 did was express the vectors in terms of the frame at an arbitrary but common point for both vectors; it works because we are just converting between Cartesian and polar coordinates in flat space, so yes.
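For concreteness, here is a minimal numerical sketch of that procedure (the helper names polar_frame, components_in_frame, and inner_at are mine): express both Cartesian vectors in the polar frame at one common point, then contract their components with the polar metric ##\mathrm{diag}(1, r^2)## at that point. In flat space the choice of base point drops out:

```python
import math

def polar_frame(r, t):
    # columns: e_r = (cos t, sin t), e_theta = (-r sin t, r cos t), in Cartesian components
    return [[math.cos(t), -r * math.sin(t)],
            [math.sin(t),  r * math.cos(t)]]

def components_in_frame(E, v):
    # solve E @ c = v for the 2x2 frame matrix E (Cramer's rule)
    det = E[0][0] * E[1][1] - E[0][1] * E[1][0]
    return [(v[0] * E[1][1] - v[1] * E[0][1]) / det,
            (E[0][0] * v[1] - E[1][0] * v[0]) / det]

def inner_at(p, a_cart, b_cart):
    # express both Cartesian vectors in the polar frame at the common point p,
    # then contract with the polar metric g = diag(1, r^2) at p
    r, t = p
    E = polar_frame(r, t)
    a = components_in_frame(E, a_cart)
    b = components_in_frame(E, b_cart)
    return a[0] * b[0] + r**2 * a[1] * b[1]

a = [1.0, 0.0]   # Cartesian vector pointing toward (r, theta) = (1, 0)
b = [0.0, 1.0]   # Cartesian vector pointing toward (r, theta) = (1, pi/2)
print(inner_at((1.0, 0.0), a, b))   # 0.0
print(inner_at((2.0, 0.7), a, b))   # still approximately 0: the base point drops out
```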

As for your second question, I really don't know if that mental exercise will be helpful or not. My first exposure to thinking about these things came from learning general relativity so I can't say from experience if that mental exercise will actually be useful because I never tried it.
 
  • #5
Bacle2
Science Advisor
Hi, mnb96:

Besides WBN's ideas, maybe you can do some "down-and-dirty" calculations in the case of an m-manifold M embedded in R^n, by pulling back the standard inner product of R^m (seen/considered as a 2-tensor) through the individual chart maps. Maybe start with low-dimensional cases, like pulling back the inner product of R^1 to a curve (a single chart); don't take anything too crazy for a start. It may also help to see the argument for how any smooth manifold can be given a Riemannian metric by pulling back the inner product; it uses a lot of the issues that I think you're interested in.
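A small worked instance of this suggestion (my own example, using the embedding of a circle in the plane rather than a chart map): pulling back the Euclidean inner product of [itex]\mathbb{R}^2[/itex] along the embedding gives the induced metric on the curve. For [itex]\gamma(t) = (R\cos t,\, R\sin t)[/itex],

[tex]g_{tt} \;=\; \delta_{kl}\,\frac{dx^k}{dt}\frac{dx^l}{dt} \;=\; R^2\sin^2 t + R^2\cos^2 t \;=\; R^2,[/tex]

so the single coordinate [itex]t[/itex] carries the constant metric [itex]R^2[/itex], and lengths computed with it agree with lengths measured in the ambient plane.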
 
