What is the Metric Tensor and How is it Used in Tensors?

  • Thread starter: Mazulu
  • Tags: Tensors
Summary
The metric tensor is a fundamental concept in tensor analysis, defining how dot products are computed in a vector space. It is not simply the dot product of two basis vectors; rather, it provides a framework for performing these operations. The notation used by Wolfram may be misleading, as it implies a direct product of basis vectors instead of an inner product. Resources like Schutz's "A First Course in General Relativity" are recommended for a clearer understanding of tensors and their applications in relativity. Overall, the metric tensor serves as a bilinear form that relates pairs of vectors, crucial for understanding geometric and physical concepts in relativity.
  • #31
Mazulu said:
Hi Fredrik,
I was looking at the link you provided about manifolds. You said,
I recognise g as the metric tensor, but I thought g(u,v) only meant that g(u,v) is a function of u and v; such a statement is very general. So why do we worry about a strict rule that g(u,v) = g(v,u), which implies that sometimes this isn't true? What am I misunderstanding?
g denotes a function. g(u,v) and g(v,u) denote numbers in its range. When u≠v, the condition g(u,v)=g(v,u) says that g takes two different members of its domain to the same number. The statement you quoted is incomplete. It should say that we require that g(u,v)=g(v,u) for all u,v in TpM.
 
  • #32
The metric tensor must be symmetric because it defines the inner product. An inner product must be symmetric, i.e. a dot b must be the same as b dot a, or else this is no longer an inner product.

(At least it defines a semi-inner product since positive definiteness is not always satisfied)
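The symmetry requirement above is easy to see numerically. A minimal sketch in plain Python, assuming an arbitrary symmetric 2×2 matrix of components standing in for the metric (the numbers are purely illustrative):

```python
# A "metric" given by a symmetric 2x2 component matrix g, acting on
# vectors u, v as g(u, v) = sum_ij g[i][j] * u[i] * v[j].
def metric(g, u, v):
    return sum(g[i][j] * u[i] * v[j]
               for i in range(len(u)) for j in range(len(v)))

g = [[2.0, 1.0],
     [1.0, 3.0]]  # symmetric: g[i][j] == g[j][i]

u, v = [1.0, 2.0], [4.0, -1.0]

# Symmetry of the component matrix makes the pairing symmetric:
print(metric(g, u, v))                      # 9.0
print(metric(g, u, v) == metric(g, v, u))   # True
```

If the matrix were not symmetric, the last comparison would generally fail, which is exactly why an inner product forces g(u,v) = g(v,u).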
 
  • #33
Fredrik said:
g denotes a function. g(u,v) and g(v,u) denote numbers in its range. When u≠v, the condition g(u,v)=g(v,u) says that g takes two different members of its domain to the same number. The statement you quoted is incomplete. It should say that we require that g(u,v)=g(v,u) for all u,v in TpM.

I had to look at wiki which says,
In the mathematical field of differential geometry, a metric tensor is a type of function defined on a manifold (such as a surface in space) which takes as input a pair of tangent vectors v and w and produces a real number (scalar) g(v,w) in a way that generalizes many of the familiar properties of the dot product of vectors in Euclidean space. In the same way as a dot product, metric tensors are used to define the length of, and angle between, tangent vectors.
So a metric tensor is:
1. a function,
2. defined on a manifold,
3. which takes as inputs a pair of tangent vectors,
4. spits out a scalar,
5. it's a dot product,
6. dot product is an inner product,
7. must be symmetric g(u,v)=g(v,u),
 
  • #34
Metrics, however, do not need to be positive definite like an inner product technically does.
 
  • #35
Mazulu said:
I had to look at wiki which says,

So a metric tensor is:
1. a function,
2. defined on a manifold,
3. which takes as inputs a pair of tangent vectors,
4. spits out a scalar,
5. it's a dot product,
6. dot product is an inner product,
7. must be symmetric g(u,v)=g(v,u),
OK, let's try to be really accurate here. A metric on a smooth manifold M isn't a tensor, it's a global tensor field of type (0,2). That means that it's a function that takes each point in the manifold to a tensor of type (0,2) at that point. I will denote the tensor that g associates with the point p by gp, and I will call it "the metric at p".

For each p in M, gp is a (0,2) tensor at p. Each one of these tensors (one for each point p in the manifold) is a bilinear, symmetric, non-degenerate function from TpM×TpM into ℝ.

Bilinear means that for each u\in T_pM, the maps v\mapsto g_p(u,v) and v\mapsto g_p(v,u) are both linear.

Symmetric means that for all u,v\in T_pM, we have g(u,v)=g(v,u).

Non-degenerate means that for all u\in T_pM, the map u\mapsto g(u,\cdot) is a bijection. (Here g(u,\cdot) denotes the map that takes v to g(u,v)).

Compare this with the definition of an inner product on TpM. An inner product on TpM is a bilinear, symmetric, positive definite function s:T_pM\times T_pM\to\mathbb R. Positive definite means two things: 1. For all u\in T_pM, we have s(u,u)\geq 0. 2. For all u\in T_pM, we have s(u,u)=0 only if u=0.

As you can see, an inner product on TpM has properties very similar to the metric at p, but the requirements are not quite the same. The requirements on inner products do however imply that inner products are non-degenerate. This means that a global (0,2) tensor field that assigns an inner product gp to each p in M would be a metric. Such a metric is called a Riemannian metric. A smooth manifold with a Riemannian metric is called a Riemannian manifold. Spacetime in GR and SR is not a Riemannian manifold, because there are (for each p) lots of non-zero vectors such that g_p(u,u)=0, and even lots of vectors such that g_p(u,u)<0.

(In case you're not sure, "map" and "function" mean exactly the same thing).
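The last point, that spacetime is not Riemannian, can be checked directly. A quick sketch in plain Python, assuming the Minkowski components with signature (-,+,+,+): there are nonzero vectors u with g(u,u) = 0 (lightlike) and even g(u,u) < 0 (timelike), so g_p is not positive definite.

```python
# Minkowski metric components eta in an orthonormal basis, signature (-,+,+,+).
eta = [[-1.0, 0.0, 0.0, 0.0],
       [ 0.0, 1.0, 0.0, 0.0],
       [ 0.0, 0.0, 1.0, 0.0],
       [ 0.0, 0.0, 0.0, 1.0]]

def g(u, v):
    # g(u, v) = eta_{mu nu} u^mu v^nu, with the sums written out explicitly.
    return sum(eta[m][n] * u[m] * v[n] for m in range(4) for n in range(4))

lightlike = [1.0, 1.0, 0.0, 0.0]  # nonzero, yet g(u,u) = 0
timelike  = [1.0, 0.0, 0.0, 0.0]  # g(u,u) = -1 < 0

print(g(lightlike, lightlike))  # 0.0
print(g(timelike, timelike))    # -1.0
```

An inner product would forbid both outputs for nonzero input, which is why the metric of SR/GR is only non-degenerate, not positive definite.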
 
Last edited:
  • #36
Fredrik said:
OK, let's try to be really accurate here. A metric on a smooth manifold M isn't a tensor, it's a global tensor field of type (0,2). That means that it's a function that takes each point in the manifold to a tensor of type (0,2) at that point. I will denote the tensor that g associates with the point p by gp, and I will call it "the metric at p".

For each p in M, gp is a (0,2) tensor at p. Each one of these tensors (one for each point p in the manifold) is a bilinear, symmetric, non-degenerate function from TpM×TpM into ℝ.

Bilinear means that for each u\in T_pM, the maps v\mapsto g_p(u,v) and v\mapsto g_p(v,u) are both linear.

Symmetric means that for all u,v\in T_pM, we have g(u,v)=g(v,u).

Non-degenerate means that for all u\in T_pM, the map u\mapsto g(u,\cdot) is a bijection. (Here g(u,\cdot) denotes the map that takes v to g(u,v)).

Compare this with the definition of an inner product on TpM. An inner product on TpM is a bilinear, symmetric, positive definite function s:T_pM\times T_pM\to\mathbb R. Positive definite means two things: 1. For all u\in T_pM, we have s(u,u)\geq 0. 2. For all u\in T_pM, we have s(u,u)=0 only if u=0.

As you can see, an inner product on TpM has properties very similar to the metric at p, but the requirements are not quite the same. The requirements on inner products do however imply that inner products are non-degenerate. This means that a global (0,2) tensor field that assigns an inner product gp to each p in M would be a metric. Such a metric is called a Riemannian metric. A smooth manifold with a Riemannian metric is called a Riemannian manifold. Spacetime in GR and SR is not a Riemannian manifold, because there are (for each p) lots of non-zero vectors such that g_p(u,u)=0, and even lots of vectors such that g_p(u,u)<0.

(In case you're not sure, "map" and "function" mean exactly the same thing).

Now I understand why they called Einstein a genius. There is a lot to digest here. My break is over and I don't have a good question; sorry about that. I'll take a look at this when I get home.
 
  • #37
Actually, much of tensor analysis and differential geometry was done by Riemann (although I did hear a story that Gauss did a lot in differential geometry but did not publish it). Einstein was taught a lot of differential geometry by Levi-Civita I think.
 
  • #38
Mazulu said:
There is a lot to digest here.
Yes, this stuff isn't easy. When you understand metrics and tensor fields in general, you're off to a good start, but to really understand Einstein's equation, you also need to understand connections, covariant derivatives, parallel transport, geodesics and curvature. This is much harder.

Fredrik said:
Non-degenerate means that for all u\in T_pM, the map u\mapsto g(u,\cdot) is a bijection. (Here g(u,\cdot) denotes the map that takes v to g(u,v)).
Instead of "bijection", I should have said "bijection onto T_pM^*" just to be more clear.
 
Last edited:
  • #39
In GR, a metric tensor is a matrix that looks like,
http://en.wikipedia.org/wiki/File:Metrictensor.svg
I can easily imagine this metric tensor operating on a vector to yield a new vector.

But in mathematics, the metric tensor has these properties,
So a metric tensor is:
1. a function,
2. defined on a manifold,
3. which takes as inputs a pair of tangent vectors,
4. spits out a scalar,
5. it's a dot product,
6. dot product is an inner product,
7. must be symmetric g(u,v)=g(v,u),

But then you said that a metric (not a metric tensor, just a metric) is really a global tensor field. My break is over, sorry. I just want to understand what g^{\alpha \beta} and g_{\alpha \beta} really look like when the tensor math is stripped away. I'm expecting either a matrix, or an inner product, or both. When I think of the metric tensor, I think of a matrix doing something to a vector. But I can't picture a metric operating on a vector in such a way that you get an inner product or g(u,v).
 
  • #40
A metric is what I said, a tensor field. It assigns a tensor at p to each p in M. I called it "the metric at p", but perhaps I should have called it "the value of the metric at p".

It's possible to define these terms differently. You could, e.g. define a metric to be a bilinear, symmetric, non-degenerate real-valued function, and use a term like "metric tensor field" for what I called a metric.

The components of a tensor at p, in a given basis for T_pM, are the numbers you get when you have the tensor act on the basis vectors. In this case, we have g_{\mu\nu}(p)=g_p(e_\mu,e_\nu). I wouldn't say that the matrix of components is g_p, but since g_p is completely determined by those numbers and vice versa, it's only a very minor abuse of the terminology to do so.

Note that since g_p is bilinear, we have g_p(u,v)=g_p(u^\mu e_\mu,v^\nu e_\nu)=u^\mu v^\nu g_p(e_\mu,e_\nu)=u^\mu v^\nu g_{\mu\nu}(p). If you compare this to the definition of matrix multiplication: (AB)^i_j=A^i_k B^k_j (row index upstairs, column index downstairs) you will see that u^\mu v^\nu g_{\mu\nu}(p) is equal to the only component of the 1×1 matrix
\begin{pmatrix}u^0 & u^1 & u^2 & u^3\end{pmatrix}
\begin{pmatrix}
g_{00}(p) & g_{01}(p) & g_{02}(p) & g_{03}(p)\\
g_{10}(p) & g_{11}(p) & g_{12}(p) & g_{13}(p)\\
g_{20}(p) & g_{21}(p) & g_{22}(p) & g_{23}(p)\\
g_{30}(p) & g_{31}(p) & g_{32}(p) & g_{33}(p)
\end{pmatrix}
\begin{pmatrix}v^0\\ v^1\\ v^2\\ v^3\end{pmatrix}
So if we allow ourselves to use the same notation for a vector/tensor and its matrix of components, and put an equality sign between a real number r and a 1×1 matrix (r), we have g_p(u,v)=u^Tg_p v.
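That 1×1 matrix product can be checked numerically. A sketch in plain Python (the component values below are illustrative, not taken from any particular spacetime): the double sum u^μ v^ν g_{μν} and the row-matrix-column product u^T G v give the same number.

```python
# Illustrative metric components (here the Minkowski values, for concreteness).
G = [[-1.0, 0.0, 0.0, 0.0],
     [ 0.0, 1.0, 0.0, 0.0],
     [ 0.0, 0.0, 1.0, 0.0],
     [ 0.0, 0.0, 0.0, 1.0]]

u = [2.0, 1.0, 0.0, 0.0]
v = [1.0, 3.0, 0.0, 0.0]

# The double sum u^mu v^nu g_{mu nu}:
double_sum = sum(G[m][n] * u[m] * v[n] for m in range(4) for n in range(4))

# The same thing as a matrix product u^T (G v):
Gv = [sum(G[m][n] * v[n] for n in range(4)) for m in range(4)]
uT_Gv = sum(u[m] * Gv[m] for m in range(4))

print(double_sum)          # 1.0
print(double_sum == uT_Gv) # True
```

This is why treating the components g_{μν}(p) as a matrix is such a convenient (if slightly abusive) habit: contracting with two vectors is just ordinary matrix multiplication.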
 
Last edited:
  • #41
Fredrik said:
Note that since g_p is linear, we have g_p(u,v)=g_p(u^\mu e_\mu,v^\nu e_\nu)=u^\mu v^\nu g_p(e_\mu,e_\nu)=u^\mu v^\nu g_{\mu\nu}(p).

This equation is spectacular. It's like you took two vectors u and v, and you pulled their coefficients right through the metric tensor gp. In doing so, the metric tensor now only acts on the two basis vectors eμ and eν. But it looks like gp became a different metric tensor, gμν.
 
  • #42
Mazulu said:
This equation is spectacular. It's like you took two vectors u and v, and you pulled their coefficients right through the metric tensor gp.
Yes, that's what bilinearity means. (Note that I should have said "bilinear" or "linear in each variable" where I said "linear". I corrected that in my post after you quoted it).

Mazulu said:
But it looks like gp became a different metric tensor, gμν.
I don't quite understand this comment. I'm just using the definition of "components" of g_p in the basis \{e_\mu\}. Perhaps I should write them as (g_p)_{\mu\nu}, but I prefer to write them as g_{\mu\nu}(p).
 
  • #43
Note also that in that formula there is an implied summation over all mu and nu.
 
  • #44
Fredrik said:
Yes, that's what bilinearity means. (Note that I should have said "bilinear" or "linear in each variable" where I said "linear". I corrected that in my post after you quoted it).
It's bilinear because there are two input variables, u and v? Thus, g(u,v)?

I don't quite understand this comment. I'm just using the definition of "components" of g_p in the basis \{e_\mu\}. Perhaps I should write them as (g_p)_{\mu\nu}, but I prefer to write them as g_{\mu\nu}(p).
I vaguely remember that the metric tensor g is supposed to operate on one basis to get another basis. If that is true, then g won't change if I pull the vector components out.

I wouldn't have noticed the distinction between (g_p)_{\mu\nu} and g_{\mu\nu}(p). They are both metric tensor components, and they both have indices μ and ν. (g_p)_{\mu\nu} refers to the point p with a subscript, while g_{\mu\nu}(p) writes it as a function of p.
 
  • #45
Bilinear means that the function g is linear in both arguments. I.e. g(u+v,w)=g(u,w)+g(v,w)
and g(u,v+w)=g(u,v)+g(u,w), and g(a*u,v)=g(u,a*v)=a*g(u,v), where a is a number.
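Those three bilinearity rules can be verified numerically. A sketch in plain Python, assuming a fixed symmetric component matrix standing in for g (all values illustrative):

```python
# A metric-like bilinear form given by a symmetric component matrix.
g_mat = [[1.0, 2.0],
         [2.0, 5.0]]

def g(u, v):
    return sum(g_mat[i][j] * u[i] * v[j] for i in range(2) for j in range(2))

def add(x, y):
    return [xi + yi for xi, yi in zip(x, y)]

def scale(c, x):
    return [c * xi for xi in x]

u, v, w, a = [1.0, 2.0], [3.0, -1.0], [0.5, 4.0], 7.0

# Linearity in the first slot, the second slot, and under scaling:
print(g(add(u, v), w) == g(u, w) + g(v, w))                   # True
print(g(u, add(v, w)) == g(u, v) + g(u, w))                   # True
print(g(scale(a, u), v) == a * g(u, v) == g(u, scale(a, v)))  # True
```

(The values here are exactly representable in floating point, so the equalities hold exactly; with arbitrary reals one would compare up to a small tolerance.)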
 
  • #46
Matterwave said:
Bilinear means that the function g is linear in both terms. I.e. g(u+v,w)=g(u,w)+g(v,w)
and g(u,v+w)=g(u,v)+g(u,w), and g(a*u,v)=g(u,a*v)=a*g(u,v) where a is a number.
That makes sense; I'm tracking. It's amazing that bilinearity works, not just for scalars like "a", but for vectors like uμ as well. I'm just referring to the recent quote,
gp(u,v)=gp(uμeμ,vνeν)=uμvνgp(eμ,eν)=uμvνgμν(p).
 
  • #47
Fredrik said:
\begin{pmatrix}u^0 & u^1 & u^2 & u^3\end{pmatrix}
\begin{pmatrix}
g_{00}(p) & g_{01}(p) & g_{02}(p) & g_{03}(p)\\
g_{10}(p) & g_{11}(p) & g_{12}(p) & g_{13}(p)\\
g_{20}(p) & g_{21}(p) & g_{22}(p) & g_{23}(p)\\
g_{30}(p) & g_{31}(p) & g_{32}(p) & g_{33}(p)
\end{pmatrix}
\begin{pmatrix}v^0\\ v^1\\ v^2\\ v^3\end{pmatrix}
A metric tensor is supposed to function as an inner product of two vectors. Is this what that looks like?
 
  • #48
u^mu as used by Fredrik is not a vector, it's a vector component (the mu'th component of the vector u). You cannot simply pull vectors out, that wouldn't make sense.

The problem is with notation. A lot of authors use u^mu as the notation for a vector. In that case, it's hard to distinguish between vectors and vector components, but it's generally more convenient than always sticking with correct vector notation.
 
  • #49
Matterwave said:
u^mu as used by Fredrik is not a vector, it's a vector component (the mu'th component of the vector u). You cannot simply pull vectors out, that wouldn't make sense.

The problem is with notation. A lot of authors use u^mu as the notation for a vector. In that case, it's hard to distinguish between vectors and vector components, but it's generally more convenient than always sticking with correct vector notation.

I totally agree. I was going to tell you just what you told me, but I had to get back to work.

Hey, guess what! I just got my book, written by Bernard Schutz, called A First Course in General Relativity. In just a few months, I'll be able to build my own black hole! Hurray! :biggrin:
 
  • #50
Mazulu said:
It's bilinear because there are two input variables, u and v? Thus, g(u,v)?
Matterwave answered this.

Mazulu said:
I vaguely remember that the metric tensor g is supposed to operate on one basis to get another basis.
This is wrong. Maybe you're thinking of the relationship between a basis for T_pM and its dual basis, which is a basis for T_pM^*. This was covered in detail earlier in this thread. In post #17, I showed that g(g^{\alpha\gamma}e_\gamma,\cdot)=e^\alpha. The metric is normally not involved in a change of basis. There's a basis associated with each coordinate system, so if you just choose to use another coordinate system, that changes the basis.

If \{e'_\mu\} is another basis for T_pM, then there's a matrix M such that e'_\mu=M^\nu_\mu e_\nu. So the components of g_p in that basis are g'_{\mu\nu}(p)=g_p(e'_\mu,e'_\nu)=g_p(M_\mu^\rho e_\rho,M_\nu^\sigma e_\sigma)=M_\mu^\rho M_\nu^\sigma g_p(e_\rho,e_\sigma)=M_\mu^\rho M_\nu^\sigma g_{\rho\sigma}(p). If we denote the matrices of components of g_p in these two bases by g_p and g'_p, then this result can also be written as g'_p=M^T g_p M.
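The transformation rule for the components can be verified both ways, via the index formula and via the matrix product. A 2-dimensional sketch in plain Python (the matrices below are illustrative, with columns of M holding the new basis vectors in the old basis):

```python
n = 2
g_old = [[-1.0, 0.0],
         [ 0.0, 1.0]]
M = [[2.0, 1.0],
     [0.0, 3.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

MT = [[M[j][i] for j in range(n)] for i in range(n)]  # transpose of M

# Matrix version: g' = M^T g M
g_new = matmul(matmul(MT, g_old), M)

# Index version: g'_{mu nu} = M^rho_mu M^sigma_nu g_{rho sigma}
g_new_idx = [[sum(M[r][mu] * M[s][nu] * g_old[r][s]
                  for r in range(n) for s in range(n))
              for nu in range(n)] for mu in range(n)]

print(g_new == g_new_idx)  # True
```

The index formula and the matrix formula are the same computation written in two notations, which is the point of the identity above.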
Mazulu said:
A metric tensor is supposed to function as an inner product of two vectors. Is this what that looks like?
I'm not sure what you mean by "looks like", but for any inner product on a finite-dimensional vector space over ℝ, there's a symmetric matrix S such that \langle x,y\rangle=x^TSy. However, not every symmetric matrix S defines an inner product this way: S must also be positive definite. Equivalently, S=T^2 for some invertible symmetric matrix T.
 
  • #51
I've been studying vectors, vector algebra and basis vectors from the Bernard Schutz book I bought. Last night I had homemade pizza for dinner, and leftovers for breakfast; so you ask, what does this have to do with tensor calculus? Well, this morning I was looking at an equation in the book,
\Lambda^{\bar\beta}{}_\alpha=\Lambda^{\bar\beta}{}_\alpha(v).

But there was an extra prime in the equation that didn't make sense. That is, until I noticed that it wasn't a prime at all. It was a tiny piece of charred crust that had broken off from the pizza and landed right on the equation.
 
Last edited:
  • #52
Yeah, pizzas are making it much more difficult to learn tensors. :smile:
 
  • #53
I happen to write with a pen which is contrary to what my kindergarten teacher told me about 37 years ago. She said that math is done with a pencil so you can erase if you make a mistake. I don't remember her name, but I do know that she didn't know tensors. Erasers are a significant source of primes, hats, bars and other tiny little marks that can mess with our tensors.
 
