# Understanding of Tensors

1. Jan 26, 2009

### DougD720

Hello Everyone,

I didn't know whether to post this here or in the Physics area. Basically, I'm trying to get a good understanding of tensors so that I can apply them to general relativity. I'm a freshman in college and have kind of been teaching myself advanced physics since I was 14, and now that I have multivariable calculus and some ODE knowledge under my belt, I'm trying to expand my understanding from a conceptual one to a mathematical one.

I really just don't get how tensors work, what they are, or how to convert from Cartesian coordinates to a tensor (if that's even what you do). I have a textbook from the school's library on relativity, but it just assumes one already knows tensors. I vaguely get what they are (via Wikipedia), but does anyone know of a book or source that introduces the concept from a physics standpoint? (I believe what physicists call a tensor is actually a tensor field in mathematics.)

Any help would be appreciated!

Thanks!

PS - If this should be in the physics forum i apologize!

2. Jan 26, 2009

### Reedeegi

Re: Tensors...

I had the same problem, and then I figured out that I needed to learn some linear and multilinear algebra in order to understand them. But basically, it's a generalization of scalars, vectors, matrices, etc.

Most of the important tensors I've come across in GR are rank-2 tensors, which can be represented as matrices. Tensor fields in physics are somewhat like parametrized vector fields: an assignment of a tensor to each point of the space.

I would suggest becoming acquainted with linear algebra, differential geometry, and topology before continuing, though it is possible to learn all of that concurrently with GR.

3. Jan 26, 2009

### quasar987

Re: Tensors...

I think Carroll's online GR book does not presuppose a knowledge of tensors. But linear algebra and calculus are probably assumed, as they undoubtedly will be in any GR textbook!

4. Jan 27, 2009

### Chalnoth

Re: Tensors...

Of course, unfortunately there are two commonly used objects of higher rank that are significantly harder to work with: the connection coefficients (rank 3, though strictly speaking they are not a tensor, since they don't transform as one) and the curvature tensor (rank 4).

Anyway, if you've ever done any computer programming, then it might help to think of tensors as multi-dimensional arrays. Here are some examples:

Rank 0 tensor:
This is just a number.

Rank 1 tensor (ex. $$x^\mu$$):
This can be thought of as a vector, or as a one-dimensional array. The number of elements is the number of dimensions (in GR this would be 4: 1 time, 3 space).

Rank 2 tensor (ex. $$g_{\mu\nu}$$):
This can be thought of as a matrix, or as a two-dimensional array. Each index runs over the number of dimensions, so in GR this is a 4x4 grid of numbers.

Rank 3 tensor (ex. $$\Gamma^{\mu\nu}_\sigma$$):
This is where it starts to get a bit more difficult. A rank 3 tensor has no analog in your typical linear algebra, but in computer representation would be a three-dimensional array. It might be visualized as a cube with evenly-spaced cells, each with a number in it. Each side of the cube, in GR, would have 4 steps.
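As a sketch of this "multi-dimensional array" picture in Python with NumPy (the array names here are mine, assuming the 4 spacetime dimensions of GR):

```python
import numpy as np

dim = 4  # spacetime dimensions in GR: 1 time + 3 space

scalar = 3.14                      # rank 0: just a number
vector = np.zeros(dim)             # rank 1: e.g. x^mu, 4 elements
matrix = np.zeros((dim, dim))      # rank 2: e.g. g_{mu nu}, a 4x4 grid
cube = np.zeros((dim, dim, dim))   # rank 3: e.g. the connection, a 4x4x4 "cube"

print(vector.shape, matrix.shape, cube.shape)  # (4,) (4, 4) (4, 4, 4)
```

Each extra rank just adds one more index to loop over.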

And we could go on further to rank 4 and higher tensors, though the visualization becomes understandably more difficult. So, once you've got an idea of how to represent a tensor, the next step is how to do operations with tensors. Fortunately this isn't all that difficult, particularly if you understand a bit of computer programming. There are only two operations that are ever used. The first is simple addition. For example:

$$\Gamma^{\mu\nu}_\sigma + \Gamma^{\nu\mu}_\sigma$$

All that you do here is add up the two tensors element by element. For example, one component of the above sum would be:

$$\Gamma^{01}_2 + \Gamma^{10}_2$$

...where, with numbers substituted in for the indices, the above expression is just the sum of two numbers, two different elements of the same tensor. Note that for the sum to make any sense, every index in one term of the sum must also appear in the other. If I had instead written:

$$\Gamma^{\mu\nu}_\sigma + \Gamma^{\nu\mu}_\tau$$

...then it would just be nonsense.
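A sketch of this element-by-element addition in Python with NumPy (the array `Gamma` here is just a placeholder filled with random numbers, not real connection coefficients; I store the indices in the order mu, nu, sigma):

```python
import numpy as np

dim = 4
rng = np.random.default_rng(0)
Gamma = rng.standard_normal((dim, dim, dim))  # placeholder rank-3 array

# Gamma^{mu nu}_sigma + Gamma^{nu mu}_sigma:
# swap the first two indices, then add element by element
S = Gamma + Gamma.transpose(1, 0, 2)

# one component of the sum, e.g. mu=0, nu=1, sigma=2,
# is just the sum of two elements of the same array:
assert S[0, 1, 2] == Gamma[0, 1, 2] + Gamma[1, 0, 2]
```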

The second operation is contraction. With contraction, one upper index is set equal to one lower index, and that repeated index is then summed over. One can contract two indices of the same tensor, such as:

$$R^{\mu\nu} = R^{\mu\nu\sigma}_{\sigma}$$

...or different indices in different tensors:

$$x^\mu = g^{\mu\nu}x_\nu$$

This shows you how to write these operations down, but what do they actually mean? Well, they're just sums. Whenever the same index is repeated, whether in the same tensor or in a product of tensors, it is summed over. For example, in the simpler of the above two examples, I can rewrite it with the sum explicitly written out:

$$x^\mu = \sum_{\nu=0}^3 g^{\mu\nu}x_\nu$$

In pseudo code, this becomes:

Code (Text):
for mu = 0 to 3                             // loop over the free index mu
    xmu[mu] = 0                             // initialize this element of xmu to zero
    for nu = 0 to 3                         // loop over the repeated index nu: this is the sum
        xmu[mu] = xmu[mu] + g[mu][nu]*xnu[nu]   // accumulate the summation
Note that the operation just performed is exactly matrix multiplication. If you understand a little bit of linear algebra, it should look familiar: $$g^{\mu\nu}$$ acts like a matrix, while $$x_\nu$$ acts like a vector. Not all such operations can be thought of in this way, but many can.
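For instance, in Python with NumPy (using illustrative random arrays, not a real metric), the explicit double loop and a matrix-vector product give the same answer:

```python
import numpy as np

dim = 4
rng = np.random.default_rng(1)
g = rng.standard_normal((dim, dim))   # stand-in for g^{mu nu}
x_lower = rng.standard_normal(dim)    # stand-in for x_nu

# explicit double loop, as in the pseudocode
x_upper = np.zeros(dim)
for mu in range(dim):
    for nu in range(dim):
        x_upper[mu] += g[mu, nu] * x_lower[nu]

# the same contraction written as a matrix-vector product
assert np.allclose(x_upper, g @ x_lower)
```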

Any decent General Relativity text will take you through a number of simple examples of performing these operations so you can get the hang of them. But hopefully this post will give you a rough, rough idea of the basics.