Show that the metric tensor is independent of coordinate choice

The discussion focuses on proving that the metric tensor, denoted as g_{hk}, is independent of the choice of coordinate system. The user suggests using properties from linear algebra, particularly orthogonality, to establish this independence. They explore the transformation of coordinates and the application of the chain rule to relate derivatives across different systems. The conversation progresses with suggestions on how to express these relationships mathematically, leading to a successful demonstration of the desired result. Ultimately, the proof confirms that the metric tensor remains invariant under coordinate transformations.
PrecPoint
Homework Statement
Let [itex]\overline{x^j}, \overline{\overline{x^j}} [/itex] denote the coordinates of an arbitrary point P of [itex]E_n[/itex] referred to two distinct rectangular coordinate systems. An arbitrary curvilinear system in [itex]E_n[/itex] is related to the two rectangular systems according to:
Relevant Equations
[itex]\overline{x^j}=\overline{x^j}(x^h),\qquad \overline{\overline{x^j}}=\overline{\overline{x^j}}(x^h)[/itex]

Show that:

[tex]\frac{\partial{\overline{x^j}}}{\partial{x^h}} \frac{\partial{\overline{x^j}}}{\partial{x^k}}=\frac{\partial{\overline{\overline{x^j}}}}{\partial{x^h}} \frac{\partial{\overline{\overline{x^j}}}}{\partial{x^k}}[/tex]
I need to use some property of the relation between the coordinate systems to prove that ##g_{hk}## is independent of the choice of the underlying rectangular coordinate system.

I will try to borrow an idea from basic linear algebra. I expect any transformation between the rectangular systems to be orthogonal and hence should be able to use orthogonality. To illustrate my idea, I expect something along these lines (in linear algebra pseudocode):

[tex]\overline{\overline{x}}=A\overline{x},\quad A^TA=I,\quad (AJ)^T(AJ)=J^TA^TAJ=J^TJ[/tex]
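As a quick sanity check of this linear-algebra identity (not part of the proof itself), here is a small Python sketch: ##A## is a 2×2 rotation standing in for the orthogonal transformation, and ##J## is an arbitrary matrix standing in for the Jacobian; both are illustrative values I made up.

```python
import math

def matmul(A, B):
    # product of two 2x2 matrices given as nested lists
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

# an orthogonal A (rotation by 0.7 rad) and an arbitrary "Jacobian" J
t = 0.7
A = [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]
J = [[1.3, 0.2], [-0.5, 2.1]]

AJ = matmul(A, J)
lhs = matmul(transpose(AJ), AJ)   # (AJ)^T (AJ)
rhs = matmul(transpose(J), J)     # J^T J

# orthogonality of A makes the two products agree
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```

The point is only that the orthogonal factor drops out of ##J^TJ##, which is exactly the structure of the claimed identity for the metric.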

Let's try:

[tex]\overline{x^j}=\frac{\partial\overline{x^j}}{\partial\overline{\overline{x^m}}}\,\overline{\overline{x^m}}[/tex]

Now comes the part where I get stuck:

[tex]\frac{\partial\overline{x^j}}{\partial x^h}=\frac{\partial^2\overline{x^j}}{\partial x^h\,\partial\overline{\overline{x^m}}}\,\overline{\overline{x^m}}+\frac{\partial\overline{x^j}}{\partial\overline{\overline{x^m}}}\frac{\partial\overline{\overline{x^m}}}{\partial x^h}[/tex]

And likewise for the other factor. But (a) I am very unsure of the derivatives (this is my first tensor problem) and (b) there is no easy identity or delta quantity to be found. I suspect I am on the wrong track :(

Edit: btw, this is not a homework problem, but posted here anyway since there were no other suitable place to be found. The problem is from the book on Tensors by Lovelock and Rund
 
Welcome to PF!

Suggestions:

Use ##\overline{\overline{x}}=A\overline{x}## to show that ##\large \frac{\partial{\overline{\overline{x^j}}}}{\partial{\overline{x^r}}}## is a particular matrix element of ##A##.

Use the chain rule to express ##\large \frac{\partial{\overline{\overline{x^j}}}}{\partial{x^h}} ## in terms of the partial derivatives of ##\overline{\overline{x^j}}## with respect to the ##\overline{x^r}##'s and the partial derivatives of the ##\overline{x^r}##'s with respect to ##x^h##.
 
Thank you very much TSny!

Using your suggestions I first thought about "matrix A" which should be:

[tex]A_{jr}=\frac{\partial\overline{\overline{x^j}}}{\partial\overline{x^r}}[/tex]

Now, using my analogy ##A^TA=I## (the transformation is orthogonal), we note that:

[tex]\frac{\partial\overline{\overline{x^j}}}{\partial\overline{x^r}}\frac{\partial\overline{\overline{x^j}}}{\partial\overline{x^t}}=\delta_{rt}[/tex]

[tex]\frac{\partial\overline{\overline{x^j}}}{\partial x^h}=\frac{\partial\overline{\overline{x^j}}}{\partial\overline{x^r}}\frac{\partial\overline{x^r}}{\partial x^h},\qquad \frac{\partial\overline{\overline{x^j}}}{\partial x^k}=\frac{\partial\overline{\overline{x^j}}}{\partial\overline{x^t}}\frac{\partial\overline{x^t}}{\partial x^k}[/tex]

Applying this to the RHS of the original statement:

[tex]\frac{\partial\overline{\overline{x^j}}}{\partial x^h}\frac{\partial\overline{\overline{x^j}}}{\partial x^k}=\frac{\partial\overline{\overline{x^j}}}{\partial\overline{x^r}}\frac{\partial\overline{x^r}}{\partial x^h}\frac{\partial\overline{\overline{x^j}}}{\partial\overline{x^t}}\frac{\partial\overline{x^t}}{\partial x^k}=\delta_{rt}\frac{\partial\overline{x^r}}{\partial x^h}\frac{\partial\overline{x^t}}{\partial x^k}=\frac{\partial\overline{x^t}}{\partial x^h}\frac{\partial\overline{x^t}}{\partial x^k}[/tex]

Which is what we wanted.
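For anyone who likes to see the result numerically: here is a small Python sketch (my own illustration, not from Lovelock and Rund) that builds the metric from finite-difference Jacobians. The curvilinear system is polar coordinates, the first rectangular frame is plain Cartesian, and the second is the first rotated by an angle ##\alpha##; the metric ##g_{hk}## comes out the same either way.

```python
import math

def jacobian(f, u, h=1e-6):
    # central-difference Jacobian of f: R^2 -> R^2 at the point u;
    # J[j][k] approximates d f_j / d u_k
    n = len(u)
    J = [[0.0] * n for _ in range(n)]
    for k in range(n):
        up, um = list(u), list(u)
        up[k] += h
        um[k] -= h
        fp, fm = f(up), f(um)
        for j in range(n):
            J[j][k] = (fp[j] - fm[j]) / (2 * h)
    return J

def metric(J):
    # g_hk = sum_j (d xbar^j / d x^h)(d xbar^j / d x^k)
    n = len(J)
    return [[sum(J[j][h] * J[j][k] for j in range(n)) for k in range(n)]
            for h in range(n)]

# first rectangular frame: Cartesian coordinates of the point (r, theta)
def xbar(u):
    r, th = u
    return [r * math.cos(th), r * math.sin(th)]

# second rectangular frame: the first, rotated by alpha (an orthogonal map)
alpha = 0.9
def xbarbar(u):
    x, y = xbar(u)
    return [math.cos(alpha) * x - math.sin(alpha) * y,
            math.sin(alpha) * x + math.cos(alpha) * y]

u = [1.7, 0.4]  # an arbitrary point in polar coordinates (r, theta)
g1 = metric(jacobian(xbar, u))
g2 = metric(jacobian(xbarbar, u))

# the two rectangular frames give the same metric in the curvilinear system
assert all(abs(g1[h][k] - g2[h][k]) < 1e-6 for h in range(2) for k in range(2))
```

For polar coordinates both computations should reproduce the familiar ##g = \mathrm{diag}(1, r^2)##, so the agreement here is exactly the invariance proved above.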
 
Looks good!
 