What are Good Books on Tensors for Understanding Einstein's Field Equation?

Summary
The discussion centers around recommendations for books on tensors, particularly for understanding Einstein's Field Equation. Participants suggest various resources, including Pavel Grinfeld's "Introduction to Tensor Analysis and the Calculus of Moving Surfaces" and John Lee's "Riemannian Geometry." They emphasize the importance of context, noting that a physics perspective may require different texts than those focused on mathematics or engineering. Additionally, the conversation touches on abstract index notation and the complexity of tensor notation, highlighting the need for foundational knowledge in vector calculus. Overall, the thread provides a mix of book suggestions and insights into the challenges of learning tensor calculus.
Alaindevos
I'm looking for good books on tensors.
I have "Introduction to Tensor Analysis and the Calculus of Moving Surfaces" by Pavel Grinfeld.
But I am looking for others.

[Mentor Note: Thread moved from the Relativity forum]
 
Last edited by a moderator:
I’d say something, but I am biased. 🥸
 
  • Like
Likes Demystifier and berkeman
Then I'll say it... :wink:

https://www.amazon.com/Mathematical...attias-ebook/dp/B086H3LMZF/?tag=pfamazon01-20
 
  • Like
  • Love
Likes PhDeezNutz and DeBangis21
Alaindevos said:
I'm looking for good books on tensors.
I have "Introduction to Tensor Analysis and the Calculus of Moving Surfaces" by Pavel Grinfeld.
But I am looking for others.
Maybe you can tell us the perspective you're interested in: Math, Engineering, Physics/Relativity?
 
WWGD said:
Maybe you can tell us the perspective you're interested in: Math, Engineering, Physics/Relativity?
If I was not hallucinating, this was originally posted in the relativity forum. That would indicate a physics perspective.
 
Orodruin said:
If I was not hallucinating, this was originally posted in the relativity forum. That would indicate a physics perspective.
It only seems to be under "Science and Math textbooks".
 
WWGD said:
It only seems to be under "Science and Math textbooks".
Yes, it was moved.
 
Orodruin said:
Yes, it was moved.
And you liked to move it, move it.
 
  • #10
WWGD said:
And you liked to move it, move it.
Oh, I don’t have those kinds of powers any more …

But back to topic.
 
  • #11
Orodruin said:
If I was not hallucinating, this was originally posted in the relativity forum. That would indicate a physics perspective.
Good point; I did the move, and probably should have added a note to provide more context to the question. I'll add that into the OP now. :smile:
 
  • #12
WWGD said:
And you liked to move it, move it.
I've tried several ways to parse this, but no joy so far. Not reverse-Polish, not Yoda-speak, not a single character typo that I can find... I'm getting dizzy. o0)
 
  • #13
berkeman said:
I've tried several ways to parse this, but no joy so far. Not reverse-Polish, not Yoda-speak, not a single character typo that I can find... I'm getting dizzy. o0)
You and my therapist, who's jumped out of his window a few times, coincidentally in the middle of our sessions.
 
  • Haha
Likes pinball1970
  • #14
WWGD said:
You and my therapist, who's jumped out of his window a few times, coincidentally in the middle of our sessions.
But you know it's a first floor office, so you've never been impressed by this therapy tactic... :smile:
 
  • #16
Some context: I just want to understand Einstein's Field Equation.
But as it is formulated in terms of tensors, I need to understand tensors.
Or abstract index notation?
[PS: I had vector calculus at university, but that's a different beast, as the space was Euclidean.]
 
  • #17
Alaindevos said:
Some context: I just want to understand Einstein's Field Equation.
But as it is formulated in terms of tensors, I need to understand tensors.
Or abstract index notation?
[PS: I had vector calculus at university, but that's a different beast, as the space was Euclidean.]
John Lee's book on Riemannian Geometry may also be helpful: https://link.springer.com/book/10.1007/978-3-319-91755-9
 
  • Like
Likes jbergman and Alaindevos
  • #18
Alaindevos said:
Some context: I just want to understand Einstein's Field Equation.
But as it is formulated in terms of tensors, I need to understand tensors.
Or abstract index notation?
[PS: I had vector calculus at university, but that's a different beast, as the space was Euclidean.]
In that case, almost any introductory GR book will provide enough of a grounding for you. I learned from Sean Carroll's online lecture notes originally, but you probably want to look at multiple sources. That one has the advantage of being free.
 
  • #19
Alaindevos said:
Some context: I just want to understand Einstein's Field Equation.
But as it is formulated in terms of tensors, I need to understand tensors.
Or abstract index notation?
Abstract index notation is just a notational convention within tensor calculus/differential geometry. It is not a separate subject in itself.
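For example, in that notation the field equations you are after are written
$$
G_{ab} + \Lambda g_{ab} = \frac{8\pi G}{c^4} T_{ab},
$$
where the indices ##a, b## just label the slots of each tensor rather than its components in some particular coordinate system.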

Alaindevos said:
[PS: I had vector calculus at university, but that's a different beast, as the space was Euclidean.]
To quote Yoda:
"No! No different! Only different in your mind."

The way I like to introduce things like the covariant derivative, metric tensor, Christoffel symbols, etc., is to do it in Euclidean space using curvilinear coordinates. This makes the geometrical interpretation clearer and more grounded in what people are already familiar with. Having that kind of intuition built up makes things a bit easier once you move on to general curved spaces; you can then focus on the particulars such as more general connections, curvature, etc.
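As a concrete example of that approach: the Euclidean plane in polar coordinates has the metric
$$
ds^2 = dr^2 + r^2\, d\theta^2,
$$
and its only nonvanishing Christoffel symbols are
$$
\Gamma^r{}_{\theta\theta} = -r, \qquad \Gamma^\theta{}_{r\theta} = \Gamma^\theta{}_{\theta r} = \frac{1}{r},
$$
even though the space is completely flat. If you want to check this yourself, here is a minimal sketch (not from any of the books mentioned, just the standard coordinate formula for the Christoffel symbols) using SymPy:

```python
import sympy as sp

# Flat Euclidean metric in polar coordinates: ds^2 = dr^2 + r^2 dtheta^2
r, th = sp.symbols("r theta", positive=True)
x = [r, th]
g = sp.Matrix([[1, 0], [0, r**2]])
ginv = g.inv()
n = 2

# Gamma^a_{bc} = (1/2) g^{ad} (d_b g_{dc} + d_c g_{db} - d_d g_{bc})
for a in range(n):
    for b in range(n):
        for c in range(n):
            gamma = sp.simplify(sum(
                sp.Rational(1, 2) * ginv[a, d]
                * (sp.diff(g[d, c], x[b]) + sp.diff(g[d, b], x[c]) - sp.diff(g[b, c], x[d]))
                for d in range(n)))
            if gamma != 0:
                print(f"Gamma^{x[a]}_({x[b]},{x[c]}) = {gamma}")
# Prints: Gamma^r_(theta,theta) = -r, Gamma^theta_(r,theta) = 1/r, Gamma^theta_(theta,r) = 1/r
```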
 
  • Like
Likes PhDeezNutz and Alaindevos
  • #20
I currently own the book "Introduction to Riemannian Manifolds" (John Lee).
But it's above my head. I cannot read it or understand all the symbols.
For example: ##\nabla_X Y - \nabla_Y X - [X,Y]##
The last term, I think, is a commutator.
I need an introduction to this book ...
 
  • #21
Alaindevos said:
Understand all the symbols.
What do you mean by this? If you understand what all the symbols mean, getting the meaning should not be very difficult.

Note that you can also use the LaTeX features of the forum to post more readable equations, e.g.,

$$
T(X,Y) = \nabla_X Y - \nabla_Y X - [X,Y]
$$
 
  • Like
Likes dextercioby
  • #25
I am a mathematician, and to me, if you already know what the tangent bundle is, and its dual the cotangent bundle, and hence know what tangent and cotangent vector fields are, then the 20-page explanation in chapter 4 of Spivak's Comprehensive Introduction to Differential Geometry is as clear as it gets. Still, it is demanding and difficult, because the subject is complicated.

Basically, for each vector space V, one can construct new spaces of functions from it. One has linear functions V-->k, where k is the real numbers, and one has "multilinear" functions Vx...xV-->k, which are linear in each variable separately. These are called (pointwise) "k-tensors".
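For example, the ordinary dot product on ##\mathbb{R}^3## is a 2-tensor in this sense: it takes two vectors to a number and is linear in each of the two slots separately.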

Then one takes families of these spaces. Namely, at each point p of space, one considers the vector space of tangent vectors based at p. This family of tangent spaces, one for each point p, gives the "tangent bundle" of the space. Correspondingly one has the k-fold tensor bundle, a family assigning the space of k-tensors at p to each point p.

A k-tensor "field" is a choice of a k-tensor at each point p of space, This defines a function on k-vector fields. I.e. if at each point p we have chosen k vectors, then at each point p, we let our chosen k-tensor operate on the chosen k-tuple of tangent vectors, obtaining a number.

This gives us a function from k-tuples of tangent vector fields to numbers, which is characterized by the fact that it is linear in each variable separately, over the smooth functions.

I.e. a function times a function is a function, and a function times a vector field is a vector field; and if you let your tensor field act on a k-tuple of vector fields, getting a function, then multiply by a given function, you get the same result as if you first multiplied any one of the vector fields by your function, and then let the tensor field act on the resulting k-tuple.

I.e. given a tensor field, a k-tuple of vector fields, and a function, you can combine them to get a function in two ways: first multiply the function by any one of the vector fields, and then act on the resulting k-tuple of vector fields by the tensor field; or let the tensor field act first on the original k-tuple of vector fields, and then multiply by the function. The result is the same function.

The main result about tensor fields is that any such operation on k-tuples of vector fields that is (multi) linear in that way over the functions, comes from a k-tensor field.
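In symbols, linearity over the smooth functions means that for any smooth function ##f##,
$$
T(fX_1, X_2, \ldots, X_k) = f\, T(X_1, X_2, \ldots, X_k),
$$
and likewise in each of the other slots. This is exactly what distinguishes a tensor field from, say, a differential operator acting on vector fields.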

You see why this is complicated. But I recommend Spivak.

The really complicated part is the notation, because you need to assign multiple indexes to bases of k-tuples of linear functions.

But the idea is just that, given k tangent vectors at each point, a tensor assigns a number at each point. I.e., a k-tensor field takes k-vector fields to functions, just as when k=1, a (dual, or co-) vector field takes a vector field to a function.

One can also make this sound more complicated by considering a k-tensor field and an m-vector field, together as a "mixed" (m,k) [or (k,m)?] tensor field. Then letting some components of the tensor field act on some components of the vector field, is called "contracting", since some pairs of tensor and vector components are replaced by numbers. Hence contracting one pair of components of an (m,k) field, gives a smaller, (m-1,k-1) field.

In these mixed fields, the tensor components are called covariant, and the vector components are called contravariant, because their local coordinate expressions transform differently under changes of coordinates, (presumably either by the Jacobian matrix of partials or its transpose?). (A quick look at Orodruin's insight article suggests these different types of components are distinguished by upper or lower indices, in local coordinates. Then I suspect "contraction" just means multiplying a coefficient with a given upper index times one with corresponding lower index.)
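In index notation, a contraction pairs one upper index with one lower index and sums over it: from components ##T^i{}_{jk}## one gets ##S_k = \sum_i T^i{}_{ik}##, and for a (1,1) tensor the contraction ##\sum_i T^i{}_i## is just the trace of the corresponding linear map.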

Apologies if this helps no one. The physicists here are certainly better sources, but I hoped to understand it better myself by writing this paragraph.

As to the physics, one presumably has some physical phenomena which see k-tuples of tangent vectors, gobble them up, and spit out numbers.
 
Last edited:
  • #26
Well a quick look at Wikipedia shows that the output of a tensor can be more general than a function, e.g. the Riemann curvature tensor (a slight elaboration on the formula in post #21) takes a triple of vector fields and outputs a vector field.

Namely the commutator of two vector fields is a vector field, and the covariant derivative of one vector field by another is a vector field. So given three vector fields, we can first differentiate the third by each of the first two, and then commute the results, or we can commute the first two and then differentiate the third with respect to this commutator field. The results are the same iff the space is flat, i.e. not curved. I.e. space is locally Euclidean iff the Riemann curvature tensor, the difference of these two procedures, is zero.
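Written out, the difference of those two procedures is the curvature operator
$$
R(X,Y)Z = \nabla_X \nabla_Y Z - \nabla_Y \nabla_X Z - \nabla_{[X,Y]} Z,
$$
which vanishes identically exactly when the two routes always agree.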

Presumably (?) one interesting aspect of this construction is that some of these individual operations are not linear over the functions, i.e. some of these derivative operations are not themselves "tensors", but the resulting combination of them is, i.e. is a tensor.

Well, I seem to have stated that wrong. The Riemannian curvature seems to assign to two vector fields the tensor that maps a third vector field to a vector field. I had better let the experts, e.g. Orodruin, weigh in.
 
  • #27
Well, I was enjoying the excerpt of Blennow's book on Amazon https://www.amazon.com/Mathematical...attias-ebook/dp/B086H3LMZF/?tag=pfamazon01-20

until it ran out just as it was starting tensors, but I discovered that physicists like tensors so much that they use them to represent objects that mathematicians think of more intrinsically. E.g. the first example given there of a tensor, p. 70, is a mixed (1,1) tensor that is actually just a linear transformation. But since a linear transformation is represented in coordinates by a matrix, with two indices, it can also be thought of as a tensor of type (1,1). This uses the isomorphism of the tensor space V(tens)V* with the space L(V,V) of linear transformations from V to V.

I.e. one can construct a linear transformation from V to V out of a linear function f from V to k (the reals), plus a given vector w in V: one can send v to f(v)w. This would send all vectors to multiples of w, but by adding up several of these, one can get any linear transformation. I.e. every linear map from V to V has the form f1w1 + f2w2 + ... + fnwn. This is the linear map associated to the tensor f1(tens)w1 + ... + fn(tens)wn, under the map V(tens)V*-->L(V,V).
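Concretely, in a basis ##e_1, \ldots, e_n## of V with dual basis ##e^1, \ldots, e^n## of V*, such a tensor is ##T = \sum_{i,j} T^i{}_j\, e_i \otimes e^j##, and the associated linear map sends ##v = \sum_j v^j e_j## to the vector with components ##\sum_j T^i{}_j v^j##, i.e. it acts by ordinary matrix multiplication by the matrix ##(T^i{}_j)##.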

One reason this is somewhat unnatural to a mathematician is the fact that it works only in finite dimensions, i.e. not all linear transformations of an infinite dimensional V to itself have this special form. I.e. this means that to me, although using tensors to represent linear maps is notationally useful, it may not help as much to understand them. Of course physicists come equipped with a lot more understanding of what is happening than do mathematicians.

The second "example" is just a tensor product v(tens)w of two vectors, living in a tensor product space which is not described at all, except the formula 2.7 at the bottom of page 71, tells you that in such a product, scalars pull out of each entry separately, i.e. tensor product of vectors is "bilinear".

This property is actually the defining characteristic of a tensor product. I.e. it is easy to come away with the impression that a tensor product is something that looks like v(tens)w, or a sum of such, but the real meaning is that it is something satisfying equation (2.7). (Oh yes, it is also "biadditive", i.e. (u+v)(tens)w = u(tens)w + v(tens)w. This is probably on p. 72.)
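Spelled out, bilinearity and biadditivity together say that for scalars ##a## and vectors ##u, v, w##,
$$
(au)\otimes w = u\otimes (aw) = a\,(u\otimes w), \qquad (u+v)\otimes w = u\otimes w + v\otimes w, \qquad u\otimes (v+w) = u\otimes v + u\otimes w.
$$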

This book looks indeed interesting and helpful, but I have exhausted my free page limit.
 
Last edited:
  • #28
mathwonk said:
This book looks indeed interesting and helpful, but I have exhausted my free page limit.
I think it should be pointed out that I wrote it with a physics student audience in mind. A mathematician might find it horrendously non-rigorous, and the focus is more on trying to convey an understanding of how to use the concepts to model physics rather than on being rigorous in the mathematical sense.

mathwonk said:
The Riemannian curvature seems to assign to two vector fields the tensor that maps a third vector field to a vector field. I had better let the experts, e.g. Orodruin, weigh in.
Generally, yes, that would be the definition, which typically takes the form ##R(X,Y) Z = \ldots##. The exact usage depends on the application. In some situations you might think of it as a map from 2 vectors to a (1,1) tensor. In others, better as a map from 3 vectors to a vector, etc.
 
  • #29
You are reminding me of how stunned I was when my professor asked why I was so sure, when given a function f(x), that x was the variable and f the function. Why not define x(f) to be f(x), and then f was the variable?

I.e. if F is the space of all k valued functions on the set X, then we have a pairing FxX-->k, taking the pair <f,x> to f(x). So we could consider this pairing as a mapping from X to functions from F to k, taking x to the function x:F-->k whose value at f is f(x).

So anytime we have a pairing XxYxZ-->k, we can consider it as a function from XxYxZ to k, or a function from X to functions from YxZ to k, or a function from XxY to functions from Z to k, or........

In particular, if V is a vector space and V* is its dual space of linear functions from V to k, we have a pairing VxV*-->k, so we can consider V as a space of linear functions from V* to k, i.e. there is a natural map V-->(V*)*. Here I contradict my earlier remark about naturality, since this natural map, which I usually think of as a natural isomorphism, is only isomorphic in finite dimensions.
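In symbols, that natural map is the evaluation map
$$
\iota : V \to V^{**}, \qquad \iota(v)(f) = f(v),
$$
which is always injective but is an isomorphism only when ##V## is finite dimensional.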

So I guess I should think of the map V(tens)V*-->L(V,V) as equally natural, even though not always surjective.

Thank you!
 
  • #30
Ah yes, proceeding from Orodruin's guidance, we can see what kind of tensor the Riemann curvature should be:

We need the basic insight that the map ##V\times V \to V\otimes V## sending ##(v,w)## to ##v\otimes w## is bilinear, and that thus composing with any linear map out of ##V\otimes V## yields another bilinear map out of ##V\times V##. Indeed all bilinear maps out of ##V\times V## occur this way.
Hence ##\mathrm{Bil}(V\times V,W) \cong L(V, L(V,W)) \cong L(V\otimes V,W)##.
Moreover, we have seen that ##L(X,Y) \cong X^*\otimes Y##, for all ##X, Y##.

In particular, when ##W = k##, we have ##[V\otimes V]^* \cong L(V\otimes V,k) \cong \mathrm{Bil}(V\times V,k) \cong L(V,L(V,k)) \cong L(V,V^*) \cong V^*\otimes V^*##.

Hence a tensor that sends a pair of vectors bilinearly to a linear map from vectors to vectors belongs to
$$
L(V\otimes V, L(V,V)) \cong [V\otimes V]^*\otimes L(V,V) \cong V^*\otimes V^*\otimes V^*\otimes V.
$$

Thus in local coordinates it is a linear combination of
##dx^j \otimes dx^k \otimes dx^l \otimes \partial/\partial x^i##, i.e. it is a tensor of type (3,1).

Since also ##V^*\otimes V^*\otimes V^*\otimes V \cong (V\otimes V\otimes V)^*\otimes V \cong L(V\otimes V\otimes V, V) \cong \mathrm{Trilin}(V\times V\times V, V)##, we can regard, as Orodruin said, our Riemann tensor, i.e. our element of ##V^*\otimes V^*\otimes V^*\otimes V##, as taking three vectors (trilinearly) to a vector.

So for computations, the basic results seem to be that ##L(X,Y) \cong X^*\otimes Y## and ##(X\otimes Y)^* \cong X^*\otimes Y^*##. Then use that the ##dx^j## are a basis for ##V^*## and the ##\partial/\partial x^i## are a basis for ##V##, when ##V = \mathbb{R}^n## has coordinates ##x^j##. And apparently tensors of type (m,n) have m elements from ##V^*## and n from ##V##.

A simpler example: the dot product, i.e. a Riemannian metric, is a bilinear product from two vectors to a number, hence belongs to ##\mathrm{Bil}(V\times V,k) \cong (V\otimes V)^* \cong V^*\otimes V^*##, so it is a tensor of type (2,0) and a linear combination of ##dx^j\otimes dx^i## in local coordinates.

[oops, I used "k" both for the real numbers and as an index.]

does this scan?
 
Last edited:
