# A little help understanding tensors.

Not really a homework problem. Need some help understanding tensors.

Ok, so the chapter in the book I am using, Vector Calculus by Paul C. Matthews, first introduces the coordinate transformation and then says that a vector is anything which transforms according to the rule $$v_i'=L_{ij}v_j$$. That's fair and I see why this is so. Later on, it introduces a tensor and says: A quantity is a tensor if each of its free suffices transforms according to the rule $$x_i'=L_{ij}x_j$$. "For example, consider a quantity $$T_{ij}$$ that has two free suffices. This quantity is a tensor if its components in the dashed frame are related to those in the undashed frame by the equation $$T_{ij}'=L_{ik}L_{jm}T_{km}.$$" I don't really understand what's in the quotation marks.
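To make the vector rule concrete, here is a minimal numerical sketch (my own example, not from the book): take L to be a 2D rotation matrix and compute $$v_i'=L_{ij}v_j$$ with the sum over the repeated index written out explicitly.

```python
import math

# Hedged sketch: check the vector transformation rule v'_i = L_ij v_j
# for an orthogonal (rotation) matrix L in two dimensions.
theta = math.pi / 2                      # rotate the frame by 90 degrees
L = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]
v = [1.0, 0.0]

# v'_i = sum over the repeated index j (Einstein summation written out)
v_prime = [sum(L[i][j] * v[j] for j in range(2)) for i in range(2)]

print(v_prime)  # (1, 0) rotated by 90 degrees is (0, 1), up to rounding
```

The point is that the free index i labels the components of the new object, while the repeated index j is summed away; the tensor rule below just applies one copy of L per free index.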

It would be nice if somebody could explain.

Thanks.


Dick
Homework Helper
It's a definition. The best way to understand a definition is to find examples of the things being defined. A vector is just a special case of a tensor. If v is a vector, then $$T_{ij}=v_{i} v_{j}$$ is a rank two tensor. Can you prove this?

I'm a bit confused with the suffixes, but here goes:

$$T_{ij}'=v_i'v_j'=L_{ij}v_iL_{jk}v_j=L_{ij}L_{jk}v_iv_j=L_{ij}L_{jk}T_{ij}$$

Is that correct?

Dick
Homework Helper
The indices are messed up.
$$x_i ' =L_{ij}x_j$$
is right.

$$x_i ' =L_{ij}x_i$$
is wrong. Do you see the difference? The repeated index is summed over. It's the Einstein summation convention. It's a dummy. It can't appear alone on one side of the equation and repeated on the other side.
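A small sketch of the dummy-index point (my own example, not Dick's): because the repeated index is summed over, renaming it changes nothing, which is exactly why it cannot also appear as a free index on the other side.

```python
import math

# The repeated (dummy) index in x'_i = L_ij x_j is summed over,
# so renaming it (j -> k) leaves the result unchanged. By contrast,
# x'_i = L_ij x_i is not well-formed: i would be free on the left
# but summed on the right.
theta = math.pi / 6
L = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]
x = [2.0, 3.0]

sum_over_j = [sum(L[i][j] * x[j] for j in range(2)) for i in range(2)]
sum_over_k = [sum(L[i][k] * x[k] for k in range(2)) for i in range(2)]

print(sum_over_j == sum_over_k)  # True: the dummy index is just a name
```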

Hmm alright, let me do this again,

$$(v_iv_j)'=L_{ik}L_{jm}v_kv_m= L_{ik}L_{jm}T_{km}$$

Is that correct then?

Thanks a lot.

Dick
Homework Helper
That's better.
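The corrected proof can be checked numerically too. Here is a hedged sketch (my own, not from the thread): form $$T_{km}=v_k v_m$$, transform it with two copies of L as in $$T_{ij}'=L_{ik}L_{jm}T_{km}$$, and confirm this agrees with the outer product of the transformed vector, $$v_i'v_j'$$.

```python
import math

# Check that T_ij = v_i v_j transforms as a rank-two tensor:
# applying one copy of L per free index must agree with forming
# the outer product of the already-transformed vector.
theta = 0.7
L = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]
v = [1.5, -2.0]

# v'_i = L_ij v_j
v_prime = [sum(L[i][j] * v[j] for j in range(2)) for i in range(2)]

# Left side: T'_ij = v'_i v'_j
T_prime = [[v_prime[i] * v_prime[j] for j in range(2)] for i in range(2)]

# Right side: L_ik L_jm T_km, with T_km = v_k v_m
T = [[v[k] * v[m] for m in range(2)] for k in range(2)]
T_law = [[sum(L[i][k] * L[j][m] * T[k][m]
              for k in range(2) for m in range(2))
          for j in range(2)] for i in range(2)]

match = all(math.isclose(T_prime[i][j], T_law[i][j])
            for i in range(2) for j in range(2))
print(match)  # True: both sides agree component by component
```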

Fredrik
Staff Emeritus
Gold Member
...anything which transforms according to the rule...
There's nothing in physics that I hate more than that definition, except maybe Goldstein's book on classical mechanics. Of course one of the reasons I hate his book is that he uses this definition. I hate the fact that neither the books nor the teachers ever make it clear that this definition of a tensor with k indices only makes sense after we have

a) specified a vector space V
b) explained how to associate a basis of that vector space with each coordinate system
c) associated n^k functions with each coordinate system (and there are of course infinitely many coordinate systems)

They also never make it clear what the word "anything" in the definition can refer to (it's not even immediately clear if a cow can be a tensor), or even what it means for something to "transform". Sometimes they don't even make it clear under what condition this transformation is supposed to happen. It's like they're deliberately trying to hide the fact that the "tensor" defined this way is a set of infinitely many functions (n^k for each of the infinitely many coordinate systems), where n=dim V.

I would prefer to start with one basis, specify n^k functions, and then use the tensor transformation law to define n^k new functions for each other basis. This at least makes it clear that it's the whole infinite set of functions that we call a tensor.

But that definition is really dumb too compared to the modern one. See e.g. this post.

Don't take any of this as criticism against you. I have no doubt that your books and teachers are using that horrible definition. It's just that I find this definition so annoying that it sets me off on a rant every time I see it. It caused me a lot of frustration during my third year at the university (it didn't help that the teachers didn't really understand the definition), and it upsets me to see that it's still being used.

I learned the modern definition from Schutz and Wald.