# A suggested operational definition of tensors

1. Feb 28, 2015

### Will Flannery

The two tensor definitions I'm (newly) familiar with, the one by transformation rules and the one as a map from a tensor product space to the reals, don't tell me what a tensor does, and to the best of my knowledge they don't make it apparent. So I'm looking for an operational definition, and I suggest the following one:

Scalars are rank 0 tensors.

For tensors of higher rank we start with an n-dimensional vector space V with basis e1, e2, ..., en and its dual covector space V* with basis e1*, e2*, ..., en*. Vectors and covectors are rank 1 tensors.

A tensor of rank r > 1 is a means of defining a linear map from tensors of rank m to tensors of rank n. Generally m and n are less than r, but not necessarily; the definition doesn't require it.

A tensor of rank r > 1 is a linear combination of dyadic basis tensors of rank r, where a dyadic basis tensor has the form x1.x2. ... .xr and each xi is either a basis vector ej of V or a basis covector ej* of V*.

The product of two dyadic basis tensors x1.x2. ... .xr and y1.y2. ... .ys is computed by evaluating <xr, y1>; if that is not 0, evaluating <x(r-1), y2>; and so on, until one of the two dyadic basis tensors (normally y1.y2. ... .ys) is used up. If any pairing produced a 0, the product is 0; otherwise the product is whatever remains, either the scalar 1 or a shorter dyadic basis tensor. Note that when evaluating <x(r-(k-1)), yk>, one of the pair must be a vector and the other a covector, else it's an error.

In other words, the x dyadic basis tensor eats the y dyadic basis tensor. If any nibble is a 0, the result is 0; otherwise, once one of them is used up, the result is either 1 or the part of the x dyadic basis tensor that didn't get a bite (or of the y dyadic basis tensor that didn't get bitten).
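To make the mechanics concrete, here is a minimal sketch of the dyad product described above. The representation is my own choice, not anything from the thread: a dyad is a tuple of `('v', i)` for the basis vector ei and `('c', i)` for the basis covector ei*, with pairing <ei*, ej> = <ei, ej*> = 1 if i = j, else 0.

```python
def pair(x, y):
    """Evaluate <x, y>; exactly one of x, y must be a covector."""
    (xkind, xi), (ykind, yi) = x, y
    if xkind == ykind:
        raise ValueError("pairing needs one vector and one covector")
    return 1 if xi == yi else 0

def dyad_product(xs, ys):
    """Contract the tail of xs against the head of ys, as in the text.

    Returns 0 if any pairing vanishes; otherwise returns the leftover
    dyad as a tuple (the empty tuple () standing for the scalar 1).
    """
    k = min(len(xs), len(ys))
    for i in range(k):
        if pair(xs[len(xs) - 1 - i], ys[i]) == 0:
            return 0
    return xs[:len(xs) - k] + ys[k:]

# e1 . e2* applied to e2: <e2*, e2> = 1, and the leftover dyad is e1.
print(dyad_product((('v', 1), ('c', 2)), (('v', 2),)))  # → (('v', 1),)
```

This is the "eating" in miniature: a type-(1,1) dyad swallows a vector and leaves a vector, which is why such tensors act like matrices.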

A tensor A maps a tensor B by applying each of the dyadic basis tensors in A to each of the dyadic basis tensors in B, multiplying their coefficients whenever the dyad product is not 0, and summing the resulting terms to get the result of A applied to B.
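For a familiar special case, the bilinear extension just described reduces, for a type-(1,1) tensor acting on a vector, to the ordinary matrix-vector product; a sketch (the entries of A and v are made-up numbers):

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [0., 1., 3.],
              [4., 0., 1.]])   # coefficients of sum_ij A_ij (ei . ej*)
v = np.array([1., 0., 2.])     # coefficients of sum_j v_j ej

# Each dyad ei . ej* in A eats ej in v; since <ej*, ek> = delta_jk, the
# surviving terms give components sum_j A_ij v_j, i.e. a matrix product.
result = np.einsum('ij,j->i', A, v)
print(result)                  # → [1. 6. 6.], the same as A @ v
assert np.allclose(result, A @ v)
```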

I think this is how tensors work, and I think this definition spells it out explicitly and makes it clear. But I'd like to have it verified, or corrected if necessary.

2. Mar 3, 2015

### DEvens

Questions:
- You seem to have declared one unknown (scalar) in terms of another unknown (rank 0 tensor). What is the definition of at least one of those?
- Same question about rank 1 tensors and vectors. What is the definition of at least one of those?
- Your definition seems to require a particular basis. Do tensors exist independent of basis? Hint: Yes. How can your definition accommodate that?
- What makes a vector a basis vector? What makes a basis vector a vector?
- What is a "dyadic basis tensor"? When you say "where a dyadic basis tensor has the form x1.x2.x3.....xr" what does the dot between the basis vectors indicate?
- Can you show that your definition allows you to write the most general tensor in this form? For example, in 3-space, a rank 2 tensor has nine components. Does your basis allow for this?
- Can you recover the transformation properties of a tensor from your definition?

3. Mar 4, 2015

### Will Flannery

1. A scalar is a number, I assume everyone knows this.
2. Everyone knows what a vector is. The point of my definition is not mathematical rigor and completeness but understandability, and striking the right tradeoff between conciseness and completeness is important.
3. The definition does require a basis; the component transformation rules must then be derived to show that the result is basis-independent.
4. Again, everyone knows what a basis vector is.
5. The dot was not defined; I now see that it is the usual tensor product, so no problem there. A dyadic tensor is a product of rank 1 tensors. You can add that the dyadic basis tensors form a basis for the tensor product vector space.
6. Sure, the definition is constructive, and you can construct a tensor of any rank and type (up 1, down 3, up 2, etc.).
7. Yes, it's straightforward.
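Points 6 and 7 can be spot-checked numerically. A sketch of my own, using outer products of the standard basis as a concrete realization of the dyads, and the standard type-(1,1) transformation rule T' = P⁻¹TP (neither construction is from the thread itself):

```python
import numpy as np

# Point 6: in 3-space the nine dyads ei . ej, realized here as outer
# products, are linearly independent, so they span all 3x3 rank-2 arrays.
e = np.eye(3)
dyads = [np.outer(e[i], e[j]) for i in range(3) for j in range(3)]
M = np.stack([d.ravel() for d in dyads])
assert np.linalg.matrix_rank(M) == 9   # full rank: nine independent dyads

# Any rank-2 tensor T is then the combination sum_ij T_ij (ei . ej):
T = np.arange(9.).reshape(3, 3)
recombined = sum(T[i, j] * dyads[3 * i + j]
                 for i in range(3) for j in range(3))
assert np.allclose(recombined, T)

# Point 7: for a type-(1,1) tensor, under a change of basis with matrix P
# (new basis vectors as columns, written in the old basis), vector
# components transform as v' = P^-1 v and the tensor's components as
# T' = P^-1 T P, so the action of T on v is basis-independent.
rng = np.random.default_rng(0)
P = rng.normal(size=(3, 3))            # assumed invertible
v = rng.normal(size=3)
Pinv = np.linalg.inv(P)
T_new, v_new = Pinv @ T @ P, Pinv @ v
assert np.allclose(T_new @ v_new, Pinv @ (T @ v))
```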

Note: I wrestled with tensors for a long time, mostly just foolin, but when I got serious it still took me two weeks to figure them out, because the two popular definitions don't give you any idea of what tensors are (a way to define maps from tensors to tensors) or how they are evaluated (the <> operation is extended to dyadic basis tensors), even though they are perfectly fine definitions. So I think my definition has merit.

Incidentally, the big problem was understanding that vectors are contravariant and covectors are covariant, because when you're only dealing with orthonormal coordinates they both transform the same way, and that's never pointed out, and no examples are ever given. So I'd do some examples and they'd always come out the same, and that was driving me crazy.
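A quick numeric illustration of why the distinction is invisible in orthonormal coordinates (a sketch; the rotation angle and test vectors are made up): for an orthogonal change of basis, P⁻¹ = Pᵀ, so the contravariant and covariant transformation rules produce the same components.

```python
import numpy as np

# Orthogonal change of basis (a rotation): P^-1 equals P^T, so the
# contravariant rule (v' = P^-1 v) and the covariant rule (w' = P^T w)
# act identically and the vector/covector distinction disappears.
theta = 0.3
P = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1., 2.])
assert np.allclose(np.linalg.inv(P) @ v, P.T @ v)

# With a non-orthogonal change of basis the two rules give different
# components, which is where the distinction finally shows up:
Q = np.array([[2., 1.],
              [0., 1.]])
assert not np.allclose(np.linalg.inv(Q) @ v, Q.T @ v)
```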

Another note: the definition by transformation rule, and other tensor literature, appear to me (I'm not certain here) to assume that everyone knows how to multiply multi-dimensional arrays, and I realized at some point that I had no idea how to do that, and I still don't.
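For what it's worth, the multiplication the transformation-rule literature takes for granted is contraction: pick one index of each array, multiply entry-wise, and sum over that index. A sketch in NumPy (the array shapes are arbitrary choices):

```python
import numpy as np

# Contracting the last index of a 3-index array A with the first index of
# a 2-index array B produces C[i,j,l] = sum_k A[i,j,k] * B[k,l].
A = np.arange(24.).reshape(2, 3, 4)
B = np.arange(20.).reshape(4, 5)

C = np.tensordot(A, B, axes=([2], [0]))
assert C.shape == (2, 3, 5)                      # uncontracted indices remain
assert np.allclose(C, np.einsum('ijk,kl->ijl', A, B))
```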

Last edited: Mar 4, 2015